
Enabling Visualization for Real-Time and Post-Hoc Monitoring of Training and Evaluation Metrics #4187

Open
giladrubin1 opened this issue May 9, 2024 · 0 comments
Labels
enhancement New feature or request module: tabular priority: 2 Medium priority: Nice-to-have
Description

I am proposing the addition of a feature to visualize training and evaluation metrics during or after AutoGluon training, similar to the functionality provided by TensorBoard in TensorFlow. This feature would be particularly beneficial for the tabular module but could potentially be extended to the multimodal and timeseries modules.

The ideal implementation would allow users to access training and evaluation metrics in a structured format (e.g., a DataFrame) post-training, and to visualize these metrics through dynamically updating graphs during the training process. This would not only enhance user understanding of model behavior but also aid in quicker debugging and optimization of model parameters.

Proposed API Changes

from autogluon.tabular import TabularPredictor
from autogluon.visualization import TrainingMonitor

predictor = TabularPredictor(label='target').fit(train_data)
monitor = TrainingMonitor(predictor)
monitor.plot_metrics()  # Generates dynamic plots for training and validation metrics over epochs
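To make the shape of the proposal concrete, here is a minimal, dependency-free sketch of what the hypothetical `TrainingMonitor` could record internally. The class name comes from the proposed API above; the `log` and `to_rows` methods are assumptions for illustration only, standing in for the "metrics in a structured format (e.g., a DataFrame)" access described in the issue.

```python
from collections import defaultdict

class TrainingMonitor:
    """Hypothetical sketch of the proposed monitor: records per-epoch
    metric values and exposes them as tabular rows, a stand-in for the
    DataFrame access described above. Not an existing AutoGluon API."""

    def __init__(self):
        # metric name -> list of (epoch, value) pairs
        self._history = defaultdict(list)

    def log(self, metric, epoch, value):
        # Would be called from the training loop (or a fit() callback).
        self._history[metric].append((epoch, value))

    def to_rows(self):
        # Flatten into rows suitable for building a DataFrame post-training.
        return [
            {"metric": name, "epoch": epoch, "value": value}
            for name, series in self._history.items()
            for epoch, value in series
        ]
```

A plotting method like `plot_metrics()` would then simply group these rows by metric name and draw one line per metric against the epoch axis.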

References

TensorBoard: Visualization toolkit for machine learning experimentation. https://www.tensorflow.org/tensorboard
Scikit-learn plotting API: Provides model visualization utilities for inspection. https://scikit-learn.org/stable/visualizations.html

Open-Source Implementations
TensorBoard provides an excellent example of real-time plotting and logging capabilities integrated into TensorFlow training routines. It serves as a benchmark for what could be implemented in AutoGluon.
MLflow's Tracking API is another relevant tool that allows logging metrics, parameters, and artifacts to help visualize the machine learning lifecycle. https://mlflow.org/docs/latest/tracking.html
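Both tools share the same core pattern: step-indexed scalar values appended to a sink during training, then read back for plotting. As a rough, stdlib-only illustration of that pattern (the `ScalarLogger` name and CSV sink are assumptions, not part of either library), a minimal logger might look like:

```python
import csv
import io

class ScalarLogger:
    """Minimal sketch of the append-only scalar logging pattern shared by
    TensorBoard and MLflow: each call records (step, metric, value).
    Hypothetical helper for illustration; not a real library API."""

    def __init__(self, sink):
        # sink is any writable text stream (file, StringIO, ...).
        self._writer = csv.writer(sink)
        self._writer.writerow(["step", "metric", "value"])

    def log_metric(self, metric, value, step):
        # Append one scalar observation; a reader can later rebuild the
        # full time series for each metric and plot it.
        self._writer.writerow([step, metric, value])
```

An AutoGluon integration could emit records like these from its per-model training loops, which is all a TensorBoard- or MLflow-style frontend needs to render live curves.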

@giladrubin1 giladrubin1 added the enhancement New feature or request label May 9, 2024
@Innixma Innixma added this to the 2024 Tracker milestone May 10, 2024
@Innixma Innixma added module: tabular priority: 2 Medium priority: Nice-to-have labels May 10, 2024