
deephyper/deephyper: Changelog - DeepHyper 0.2.5

Published June 10, 2021 | Version 0.2.5

Description

General

Full API documentation

The DeepHyper API is now fully documented at DeepHyper API.

TensorFlow Probability as a new dependency

TensorFlow Probability is now part of DeepHyper's default set of dependencies.
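As a quick sanity check (a minimal sketch; the exact version pinned by this release is not stated here), the package should now be importable in the same environment as DeepHyper:

# After installing DeepHyper 0.2.5, tensorflow-probability should be
# available in the same environment (pinned version not stated here).
import tensorflow_probability as tfp

print(tfp.__version__)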

Automatic submission with Ray at ALCF

It is now possible to submit DeepHyper jobs directly with deephyper ray-submit ... at the ALCF. This feature is only available on ThetaGPU for now, but it can be extended to other systems by following this script.

ThetaGPU at ALCF

New documentation for auto-sklearn search with DeepHyper

Access to the auto-sklearn features has moved to deephyper.sklearn, and new documentation for this feature is available at User guide: AutoSklearn.

New command lines for DeepHyper Analytics

The deephyper-analytics command was modified and enhanced with new features. For the full updated documentation, see DeepHyper Analytics Tools.

The topk command is now available to get quick feedback from the results of an experiment:

$ deephyper-analytics topk combo_8gpu_8_agebo/infos/results.csv -k 2
'0':
  arch_seq: '[229, 0, 22, 1, 1, 53, 29, 1, 119, 1, 0, 116, 123, 1, 273, 0, 1, 388]'
  batch_size: 59
  elapsed_sec: 10259.2741303444
  learning_rate: 0.0001614947
  loss: log_cosh
  objective: 0.9236862659
  optimizer: adam
  patience_EarlyStopping: 22
  patience_ReduceLROnPlateau: 10
'1':
  arch_seq: '[229, 0, 22, 0, 1, 235, 29, 1, 313, 1, 0, 116, 123, 1, 37, 0, 1, 388]'
  batch_size: 51
  elapsed_sec: 8818.2674164772
  learning_rate: 0.0001265946
  loss: mae
  objective: 0.9231553674
  optimizer: nadam
  patience_EarlyStopping: 23
  patience_ReduceLROnPlateau: 14
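
The same ranking can also be reproduced directly from the results file with pandas (a sketch of an equivalent analysis, not the deephyper-analytics implementation):

# Sort the search results by objective and keep the two best configurations
# (equivalent to `deephyper-analytics topk ... -k 2`).
import pandas as pd

df = pd.read_csv("combo_8gpu_8_agebo/infos/results.csv")
topk = df.nlargest(2, "objective")
print(topk.to_string(index=False))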
Neural architecture search

New documentation for the problem definition

New documentation for the neural architecture search problem setup can be found here.

It is now possible to define auto-tuned hyperparameters in addition to the architecture in a NAS Problem, as sketched below.
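
For illustration, here is a minimal sketch of such a problem definition. The helper names load_data and create_search_space are hypothetical placeholders, and the add_hyperparameter calls and value ranges are assumptions for this sketch; refer to the problem-definition documentation linked above for the exact API of this release.

# Sketch of a NAS problem mixing fixed and auto-tuned hyperparameters.
from deephyper.problem import NaProblem

from my_package.load_data import load_data                # hypothetical data loader
from my_package.search_space import create_search_space   # hypothetical search-space builder

Problem = NaProblem()

Problem.load_data(load_data)
Problem.search_space(create_search_space)

# Fixed hyperparameters can be mixed with auto-tuned ones (ranges are illustrative):
Problem.hyperparameters(
    batch_size=Problem.add_hyperparameter((16, 256), "batch_size"),
    learning_rate=Problem.add_hyperparameter((1e-4, 1e-2, "log-uniform"), "learning_rate"),
    optimizer=Problem.add_hyperparameter(["adam", "nadam", "rmsprop"], "optimizer"),
    num_epochs=20,
)

Problem.loss("log_cosh")
Problem.metrics(["r2"])
Problem.objective("val_r2")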

New Algorithms for Joint Hyperparameter and Neural Architecture Search

Three new algorithms are available to run a joint hyperparameter and neural architecture search, where hyperparameter optimisation is abbreviated HPO and neural architecture search NAS:

  • agebo (Aging Evolution for NAS with Bayesian Optimisation for HPO)
  • ambsmixed (an extension of Asynchronous Model-Based Search for HPO + NAS)
  • regevomixed (an extension of regularised evolution for HPO + NAS)
A run function to use data-parallelism with TensorFlow

A new run function to use data-parallelism during neural architecture search is available (link to code).

To use this function, pass it to the --run argument of the command line, for example:

deephyper nas agebo ... --run deephyper.nas.run.tf_distributed.run ... --num-cpus-per-task 2 --num-gpus-per-task 2 --evaluator ray --address auto ...

This function adds support for new hyperparameters in Problem.hyperparameters(...):

...
Problem.hyperparameters(
    ...
    lsr_batch_size=True,     # scale the batch size with the number of data-parallel workers (linear scaling rule)
    lsr_learning_rate=True,  # scale the learning rate in the same way
    warmup_lr=True,          # warm up the learning rate at the start of training
    warmup_epochs=5,         # number of warm-up epochs
    ...
)
...
Optimization of the input pipeline for training

The data-ingestion pipeline was optimised to reduce overheads on GPU instances:

self.dataset_train = (
    self.dataset_train.cache()                                  # keep preprocessed samples in memory after the first pass
    .shuffle(self.train_size, reshuffle_each_iteration=True)    # reshuffle the full training set every epoch
    .batch(self.batch_size)
    .prefetch(tf.data.AUTOTUNE)                                 # overlap data preparation with training on the GPU
    .repeat(self.num_epochs)
)
Easier model generation from Neural Architecture Search results

A new method, Problem.get_keras_model(arch_seq), is now available on the Problem object to easily build a Keras model instance from an arch_seq (a list encoding a neural network).
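
For example, the best architecture from the topk output above could be rebuilt as follows (a minimal sketch; the problem module name is a hypothetical placeholder):

# Rebuild the best model found by the search from its arch_seq encoding.
from my_package.problem import Problem  # hypothetical module defining the NAS Problem

arch_seq = [229, 0, 22, 1, 1, 53, 29, 1, 119, 1, 0, 116, 123, 1, 273, 0, 1, 388]
model = Problem.get_keras_model(arch_seq)
model.summary()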

Files

deephyper/deephyper-0.2.5.zip (2.0 MB)
md5:7d6ea8da807dcf792f3973e6495d37ea
