
Published August 21, 2023 | Version 0.6.0
Software | Open Access

deephyper/deephyper: Changelog - DeepHyper 0.6.0

  • 1. Argonne National Laboratory
  • 2. Northwestern University
  • 3. Pennsylvania State University
  • 4. Argonne National Lab
  • 5. University of Notre Dame
  • 6. HKUST (Guangzhou)
  • 7. @argonne-lcf
  • 8. Lawrence Berkeley National Laboratory; Northwestern University
  • 9. Telecom ParisTech

Description

@misc{deephyper_software,
  title = {DeepHyper: A Python Package for Scalable Neural Architecture and Hyperparameter Search},
  author = {{DeepHyper Development Team}},
  organization = {DeepHyper Team},
  year = {2018},
  url = {https://github.com/deephyper/deephyper}
}
deephyper.evaluator
  • The @profile(memory=True) decorator can now profile memory with tracemalloc (at the cost of some overhead); a sketch follows this list.
  • RayStorage is now available for the Ray parallel backend. It is based on remote actors and wraps the base MemoryStorage, making it possible to use deephyper.stopper in parallel with only the Ray backend's requirements.
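A minimal sketch of how the memory profiling might be enabled on a run function. Only the @profile(memory=True) decorator comes from this release; the run function body, the "x" hyperparameter, and the thread evaluator are illustrative assumptions.

from deephyper.evaluator import Evaluator, profile

# Hypothetical run function; the decorator records timing and, with
# memory=True, memory usage via tracemalloc for each evaluation.
@profile(memory=True)
def run(job):
    x = job.parameters["x"]      # "x" is an assumed hyperparameter
    buffer = [0.0] * 100_000     # some allocation for the profiler to observe
    return -(x - 2.0) ** 2

# Profiled values are attached to each evaluation's output/metadata.
evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 2})
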
deephyper.search
  • Multi-objective optimization (MOO) has been upgraded for better optimization performance. A new tutorial covering this feature is available: Multi-Objective Optimization - 101. The options below are shown together in the sketch after this list.
    • A minimum lower bound on performance can be specified with moo_lower_bounds=... to avoid exploring uninteresting trade-offs.
    • A new objective scaler, objective_scaler="quantile-uniform", normalizes objectives (e.g., accuracy and latency) more effectively.
    • The results.csv file (or DataFrame) now contains a new column, pareto_efficient, which indicates whether a solution is optimal for the multi-objective problem (i.e., belongs to the Pareto set/front).
  • Random-Forest (RF) surrogate model predictions are about 1.5x faster, speeding up the Bayesian optimization process.
  • Added a dynamic prior update for Bayesian optimization: update_prior=..., update_prior_quantile=.... This increases the sampling density in areas of interest and makes "random"-sampling-based optimization of the surrogate model more competitive with more expensive optimizers such as gradient-based or genetic algorithms.
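A minimal sketch of how these options might be combined in a centralized Bayesian optimization (CBO) search, assuming the 0.6 module layout (deephyper.problem.HpProblem, deephyper.search.hps.CBO). The two-objective run function and the concrete bound and quantile values are illustrative assumptions; the keyword names are the options listed above.

from deephyper.problem import HpProblem
from deephyper.search.hps import CBO
from deephyper.evaluator import Evaluator

problem = HpProblem()
problem.add_hyperparameter((0.0, 10.0), "x")

def run(job):
    x = job.parameters["x"]
    # Two objectives to maximize, e.g., an accuracy-like and a negative-latency-like term.
    return -(x - 2.0) ** 2, -x

evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 4})

search = CBO(
    problem,
    evaluator,
    moo_lower_bounds=[-25.0, -10.0],      # skip uninteresting trade-offs (values assumed)
    objective_scaler="quantile-uniform",  # new objective normalization
    update_prior=True,                    # dynamic prior update
    update_prior_quantile=0.25,           # quantile value assumed
)
results = search.search(max_evals=100)

# Rows flagged pareto_efficient form the Pareto set of the two objectives.
print(results[results["pareto_efficient"]])
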
deephyper.stopper
  • SuccessiveHalvingStopper is now compatible with failures. If a "failure" is observed during training (i.e., an observation starting with "F"), previous observations are replaced in shared memory to notify the other competitors of the failure. A sketch follows below.
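A hedged sketch of a run function reporting intermediate observations to a stopper and signaling a failure with an "F"-prefixed objective, as described above. The SuccessiveHalvingStopper constructor arguments, the "x" hyperparameter, and the toy learning curve are assumptions to check against the 0.6.0 documentation.

from deephyper.stopper import SuccessiveHalvingStopper

stopper = SuccessiveHalvingStopper(max_steps=50)  # constructor arguments assumed

def run(job):
    x = job.parameters["x"]      # "x" is an assumed hyperparameter
    objective = None
    try:
        for step in range(1, 51):
            # Hypothetical "learning curve": improves with the budget spent.
            objective = -(x - 2.0) ** 2 - 1.0 / step
            job.record(step, objective)   # share the observation with the stopper
            if job.stopped():             # the stopper halted this competitor
                break
    except RuntimeError:
        # An objective string starting with "F" marks the evaluation as a failure;
        # the stopper now propagates it so competing configurations are notified.
        return "F_training_crashed"
    return objective

# The stopper is then passed to the search, e.g., CBO(problem, evaluator, stopper=stopper).
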
deephyper.analysis
  • Creation of a new module providing utilities for the analysis of experiments; a sketch of analyzing search results follows below.
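Since the API of the new deephyper.analysis module is not detailed here, the sketch below only uses pandas on the results.csv produced by a search; the file path and the objective_0/objective_1 column names are assumptions, while the pareto_efficient column is the one introduced above.

import pandas as pd

# Load the results of a finished multi-objective search (path assumed).
df = pd.read_csv("results.csv")

# Keep only the Pareto-efficient evaluations and inspect the trade-off front.
pareto = df[df["pareto_efficient"]]
print(pareto[["objective_0", "objective_1"]].sort_values("objective_0"))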

Files

deephyper/deephyper-0.6.0.zip (529.3 kB)
md5:c7ded85c6e05beb6dc7ff349dc045ddc
