Published August 21, 2023 | Version 0.6.0 | Software | Open
deephyper/deephyper: Changelog - DeepHyper 0.6.0
Creators
- Romain Egele
- Misha Salim
- Prasanna Balaprakash [1]
- Joceran Gouneau
- Dipendra Jha [2]
- Kyle Gerard Felker [1]
- felixeperez
- Matthieu Dorier [1]
- Romit Maulik [3]
- Bethany Lusch [4]
- Tyler H Chang [1]
- Shengli Jiang
- Yixuan Sun [1]
- Albert Lam
- rmjcs2020
- Taylor Childers [1]
- Z223I
- Zachariah Carmichael [5]
- Akalanka
- Hongyuan Liu [6]
- Nesar Ramachandra [1]
- Sam Foreman [7]
- Sandeep Madireddy [1]
- Stefan M Wild [8]
- Sun Haozhe [9]
- aperezdieguez
- 1. Argonne National Laboratory
- 2. Northwestern University
- 3. Pennsylvania State University
- 4. Argonne National Lab
- 5. University of Notre Dame
- 6. HKUST (Guangzhou)
- 7. @argonne-lcf
- 8. Lawrence Berkeley National Laboratory; Northwestern University
- 9. Telecom ParisTech
Description
- The documentation site theme was updated.
- PyPI Release: https://pypi.org/project/deephyper/0.6.0/
- A new BibTeX citation for DeepHyper that credits our growing community:
```bibtex
@misc{deephyper_software,
    title = {DeepHyper: A Python Package for Scalable Neural Architecture and Hyperparameter Search},
    author = {{DeepHyper Development Team}},
    organization = {DeepHyper Team},
    year = {2018},
    url = {https://github.com/deephyper/deephyper}
}
```
deephyper.evaluator
- The `@profile(memory=True)` decorator can now profile memory using `tracemalloc` (adding some overhead); a sketch follows this list.
- `RayStorage` is now available for the `ray` parallel backend. It is based on remote actors and is a wrapper around the base `MemoryStorage`. This makes it possible to use `deephyper.stopper` in parallel with only the `ray` backend requirements.
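For illustration, here is a minimal sketch of the memory-profiling decorator; the run function below and its hyperparameter name are hypothetical:

```python
from deephyper.evaluator import profile

# Memory is tracked with tracemalloc, which adds some overhead
# to each evaluation.
@profile(memory=True)
def run(job):
    x = job.parameters["x"]  # "x" is a hypothetical hyperparameter
    return -x**2
```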
- Multi-objective optimization (MOO) has been upgraded for better optimization performance. A new tutorial introducing this feature is available at Multi-Objective Optimization - 101. (A brief usage sketch follows this list.)
  - A minimum lower bound on performance can be specified to avoid exploring uninteresting trade-offs: `moo_lower_bounds=...`.
  - A new objective scaler is available to normalize objectives (e.g., accuracy and latency) more effectively: `objective_scaler="quantile-uniform"`.
  - The `results.csv` file (or DataFrame) now contains a new column, `pareto_efficient`, which indicates the optimal solutions of a multi-objective problem (i.e., the Pareto set/front).
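A minimal sketch of passing these options to a centralized Bayesian optimization search; the `problem` and `evaluator` objects, the bound values, and the two-objective setup are assumptions for illustration:

```python
from deephyper.search.hps import CBO

# `problem` and `evaluator` are assumed to be defined elsewhere, with a
# run function returning two objectives (e.g., accuracy and -latency).
search = CBO(
    problem,
    evaluator,
    moo_lower_bounds=[0.5, None],         # assumed: require accuracy >= 0.5, no bound on latency
    objective_scaler="quantile-uniform",  # normalize objectives of different scales
)

results = search.search(max_evals=100)
pareto_set = results[results["pareto_efficient"]]  # keep only Pareto-optimal rows
```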
- Random-Forest (RF) surrogate model predictions are now about 1.5x faster, speeding up the Bayesian optimization process.
- Added a dynamic prior update for Bayesian optimization: `update_prior=..., update_prior_quantile=...`. This increases the density of sampling in areas of interest and makes "random"-sampling-based optimization of the surrogate model more competitive with more expensive optimizers (such as gradient-based or genetic algorithms). A sketch follows below.
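A minimal sketch of enabling the dynamic prior update in a CBO search; the quantile value and the surrounding objects are assumptions for illustration:

```python
from deephyper.search.hps import CBO

# `problem` and `evaluator` are assumed to be defined as in the previous sketch.
search = CBO(
    problem,
    evaluator,
    update_prior=True,           # re-fit the sampling prior as new results arrive
    update_prior_quantile=0.25,  # assumed value: focus sampling on the best-observed quantile
)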
- `SuccessiveHalvingStopper` is now compatible with failures. If a "failure" is observed during training (i.e., an observation starting with `"F"`), previous observations are replaced in shared memory to notify other competitors of the failure. See the sketch below.
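A minimal sketch of how a failure might be reported so the stopper can propagate it; the stopper argument and the training helper are hypothetical:

```python
from deephyper.stopper import SuccessiveHalvingStopper

# Passed to the search/evaluator setup (argument value assumed).
stopper = SuccessiveHalvingStopper(max_steps=100)

def run(job):
    try:
        score = train_model(job.parameters)  # hypothetical training helper
    except RuntimeError:
        # Observations starting with "F" mark a failure; competitors
        # sharing the same bracket are notified through shared memory.
        return "F_training_crashed"
    return score
```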
- A new module was created to provide utilities for the analysis of experiments.
Files
(529.3 kB)

Name | Checksum | Size
---|---|---
deephyper/deephyper-0.6.0.zip | md5:c7ded85c6e05beb6dc7ff349dc045ddc | 529.3 kB
Additional details
Related works
- Is supplement to: https://github.com/deephyper/deephyper/tree/0.6.0 (URL)