The package provides extra flexibility for Flax using ideas that originated in Trax.
Extras include:
- Support for combinators (serial, parallel, branch, etc.) backed by Redex.
- A modular training loop that works seamlessly with Optax optimizers and metrics.
- Pluggable logging to stdout.
- Parallel execution of training and evaluation tasks.
- Regular and best-metric checkpointing.
- TensorBoard integration.
- Additional Flax Linen modules.
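To illustrate what the combinators do, here is a minimal sketch in plain Python; the names serial, parallel, and branch match the list above, but the function signatures are hypothetical and the real flax-extra/Redex API may differ.

```python
# Hypothetical sketch of combinator semantics (not the actual flax-extra API).

def serial(*layers):
    """Compose layers left to right: the output of one feeds the next."""
    def combined(x):
        for layer in layers:
            x = layer(x)
        return x
    return combined

def parallel(*layers):
    """Apply each layer to the corresponding element of a tuple input."""
    def combined(inputs):
        return tuple(layer(x) for layer, x in zip(layers, inputs))
    return combined

def branch(*layers):
    """Apply every layer to the same input, returning a tuple of outputs."""
    def combined(x):
        return tuple(layer(x) for layer in layers)
    return combined

double = lambda x: 2 * x
inc = lambda x: x + 1

model = serial(double, inc)           # x -> 2x + 1
print(model(3))                       # → 7
print(parallel(double, inc)((3, 3)))  # → (6, 4)
print(branch(double, inc)(3))         # → (6, 4)
```

The same three building blocks compose recursively, so arbitrary model topologies can be expressed by nesting them.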
Check out the documentation, introductory tutorials, and examples, which include:
- Perceiver IO.
- Classification model pretrained on images from ImageNet.
- Masked-language model pretrained using a large text corpus obtained by combining English Wikipedia and C4.
- Autoencoder model pretrained on multimodal input (audio, video, and label) of the Kinetics-700-2020 dataset.