---
title: Optimizers
keywords: fastai
sidebar: home_sidebar
summary: "This contains a set of optimizers."
description: "This contains a set of optimizers."
nb_path: "nbs/053_optimizer.ipynb"
---
You can natively use any of the optimizers included in the fastai library. You just need to pass the one you want to the learner as the `opt_func`.
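For example, a minimal sketch (the `dls` DataLoaders object and the `model` here are placeholders you would create first):

```python
from fastai.learner import Learner
from fastai.optimizer import ranger

# `dls` and `model` are assumed to exist already
learn = Learner(dls, model, opt_func=ranger)  # any fastai optimizer function works here
```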
In addition, you will be able to use any of the optimizers from:

* Pytorch (`torch.optim`)
* torch-optimizer (you need to install it first)

If you want to use an optimizer from either of these two libraries, you can use the `wrap_optimizer` function. Examples of use:
```python
import torch

# Wrap a Pytorch optimizer so it can be used as a fastai opt_func
adamw = wrap_optimizer(torch.optim.AdamW)
```
```python
import torch_optimizer as optim

# Wrap a torch-optimizer optimizer in the same way
adabelief = wrap_optimizer(optim.AdaBelief)
```
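The wrapped optimizer can then be passed to the learner like any native fastai optimizer. A minimal sketch, again assuming `dls` and `model` already exist:

```python
from fastai.learner import Learner
import torch_optimizer as optim

# `dls` and `model` are assumed to exist already
adabelief = wrap_optimizer(optim.AdaBelief)
learn = Learner(dls, model, opt_func=adabelief)
learn.fit_one_cycle(1, lr_max=1e-3)
```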