Published December 15, 2023 | Version v1
Dataset Open

Neural Field Arena - Classification

  • 1. University of Amsterdam
  • 2. Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital

Description

Neural fields (NeFs) have recently emerged as a versatile method for modeling signals of various modalities, including images, shapes, and scenes. Subsequently, many works have explored the use of NeFs as representations for downstream tasks, e.g. classifying an image based on the parameters of a NeF that has been fit to it. However, the impact of NeF hyperparameters on their quality as downstream representations remains largely unexplored. This is partly caused by the large amount of time required to fit datasets of neural fields.
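To make the idea concrete, the following is a minimal sketch in plain JAX (not the fit-a-nef API) of fitting a single SIREN to a small synthetic "image": the network maps pixel coordinates to a pixel value, and after fitting, its parameters serve as the image's representation. The network sizes, learning rate, and target signal are illustrative choices, not values from the dataset.

```python
import jax
import jax.numpy as jnp

W0 = 30.0  # SIREN frequency scale

def init_siren(key, sizes):
    """Initialize SIREN layers with the standard uniform scheme."""
    params = []
    for i, (n_in, n_out) in enumerate(zip(sizes[:-1], sizes[1:])):
        key, sub = jax.random.split(key)
        # First layer uses a wider range; later layers scale by 1/W0.
        bound = 1.0 / n_in if i == 0 else jnp.sqrt(6.0 / n_in) / W0
        W = jax.random.uniform(sub, (n_in, n_out), minval=-bound, maxval=bound)
        params.append((W, jnp.zeros(n_out)))
    return params

def siren_apply(params, coords):
    x = coords
    for W, b in params[:-1]:
        x = jnp.sin(W0 * (x @ W + b))  # sinusoidal hidden activations
    W, b = params[-1]
    return x @ W + b  # linear output layer

# Coordinates of an 8x8 grid in [-1, 1]^2 and a synthetic stand-in "image".
grid = jnp.linspace(-1.0, 1.0, 8)
coords = jnp.stack(jnp.meshgrid(grid, grid), axis=-1).reshape(-1, 2)
target = jnp.sin(3.0 * coords[:, :1]) * jnp.cos(3.0 * coords[:, 1:])

def loss(params):
    return jnp.mean((siren_apply(params, coords) - target) ** 2)

grad_fn = jax.jit(jax.value_and_grad(loss))
params = init_siren(jax.random.PRNGKey(0), [2, 32, 32, 1])
loss_before = float(loss(params))
for _ in range(300):  # plain gradient descent on the reconstruction loss
    _, grads = grad_fn(params)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-4 * g, params, grads)
loss_after = float(loss(params))
```

The trained `params` pytree, flattened into a vector, is the kind of NeF representation that downstream classifiers in this benchmark operate on.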

Thanks to fit-a-nef, a JAX-based library that leverages parallelization to enable fast optimization of large-scale NeF datasets, we performed a comprehensive study that investigates the effects of different hyperparameters, including initialization, network architecture, and optimization strategies, on fitting NeFs for downstream tasks.
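The parallelization idea can be sketched in a few lines of plain JAX (illustrative only; the names, shapes, and hyperparameters here are assumptions, not fit-a-nef's actual API): `jax.vmap` maps one gradient step over a batch of independent parameter sets, so many small NeFs are optimized in a single device call.

```python
import jax
import jax.numpy as jnp

def init_net(key, n_hidden=16):
    """One tiny sinusoidal NeF: 1-D coordinate -> signal value."""
    k1, k2 = jax.random.split(key)
    return {
        "W1": jax.random.normal(k1, (1, n_hidden)) * 0.5,
        "b1": jnp.zeros(n_hidden),
        "W2": jax.random.normal(k2, (n_hidden, 1)) * 0.1,
        "b2": jnp.zeros(1),
    }

def apply_net(params, x):
    h = jnp.sin(x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

def mse(params, coords, target):
    return jnp.mean((apply_net(params, coords) - target) ** 2)

def step(params, coords, target):
    l, g = jax.value_and_grad(mse)(params, coords, target)
    return jax.tree_util.tree_map(lambda p, gr: p - 1e-2 * gr, params, g), l

# One shared coordinate grid, four independent 1-D "signals",
# and four independent parameter sets.
coords = jnp.linspace(-1.0, 1.0, 64)[:, None]
targets = jnp.sin(jnp.arange(1, 5)[:, None, None] * 2.0 * coords[None])
batch_params = jax.vmap(init_net)(jax.random.split(jax.random.PRNGKey(0), 4))

# vmap over the batch axis of params/targets; coords are shared (None).
parallel_step = jax.jit(jax.vmap(step, in_axes=(0, None, 0)))
first_losses = None
for _ in range(200):
    batch_params, losses = parallel_step(batch_params, coords, targets)
    if first_losses is None:
        first_losses = losses
```

Because each NeF is small, batching the fits this way keeps the accelerator saturated, which is what makes producing datasets of thousands of fitted NeFs tractable.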
Based on the proposed library and our analysis, we propose Neural Field Arena, a benchmark consisting of neural field variants of popular vision datasets, including MNIST, CIFAR, variants of ImageNet, and ShapeNetv2.
Our library and the Neural Field Arena will be open-sourced to introduce standardized benchmarking and promote further research on neural fields.

The datasets that are currently available are the following:

  1. MNIST, SIREN.
  2. CIFAR10, SIREN.
  3. MicroImageNet, SIREN.
  4. ShapeNet, SIREN.

More datasets will be added in the future.

Files

neural_field_arena.zip (1.0 GB)
md5:b4d259ab4955b5b267e741b1e3815868

Additional details

Related works

Is part of
Publication: arXiv:2312.10531 (arXiv)

Dates

Created
2023-12-19
The datasets for SIREN MNIST, CIFAR10, ShapeNet, and MicroImageNet were created and added to the Neural Field Arena.