How to Train and Tune Your Models with Amazon SageMaker – AWS Online Tech Talks
Training Machine Learning (ML) models entails many iterations to isolate and measure the impact of changing data sets, algorithm versions, and model parameters. Across these iterations you produce hundreds of artifacts, such as models, training data, platform configurations, and parameter settings. Amazon SageMaker makes it much easier to train ML models by providing everything you need to tune and debug models and run training experiments. During this tech talk, we explain how to use Amazon SageMaker Experiments and how Amazon SageMaker Debugger improves model quality through better model training and tuning. You will see how to manage iterations by automatically capturing the input parameters, configurations, and results, and how to automatically capture real-time metrics during training, such as training and validation metrics and confusion matrices.
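As a rough illustration of the Experiments workflow described above, the sketch below (using the SageMaker Python SDK together with the smexperiments library) creates an experiment and a trial, then links a training job to the trial so its parameters, configuration, and results are captured automatically. The experiment, trial, image, role, bucket, and hyperparameter values are placeholders, not details from the talk.

```python
import sagemaker
from sagemaker.estimator import Estimator
from smexperiments.experiment import Experiment
from smexperiments.trial import Trial

session = sagemaker.Session()

# Create an experiment and a trial to group related training runs.
experiment = Experiment.create(
    experiment_name="mnist-tuning",            # placeholder name
    description="Compare hyperparameter settings",
)
trial = Trial.create(
    trial_name="mnist-tuning-lr-0-01",         # placeholder name
    experiment_name=experiment.experiment_name,
)

# Any built-in or custom training container works here; values are illustrative.
estimator = Estimator(
    image_uri="<training-image-uri>",
    role="<execution-role-arn>",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    hyperparameters={"learning_rate": 0.01, "epochs": 10},
    sagemaker_session=session,
)

# Linking the job to the trial makes SageMaker Experiments record its
# input parameters, configuration, and results automatically.
estimator.fit(
    inputs="s3://<your-bucket>/train/",        # placeholder S3 path
    experiment_config={
        "ExperimentName": experiment.experiment_name,
        "TrialName": trial.trial_name,
        "TrialComponentDisplayName": "Training",
    },
)
```

Each run recorded this way appears as a trial component in SageMaker Studio, where runs can be compared side by side.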

Learning Objectives:
*Learn how to use Amazon SageMaker to accelerate ML model training and improve model quality
*See how Amazon SageMaker Experiments helps you manage training iterations by automatically capturing the input parameters, configurations, and results
*Understand how Amazon SageMaker Debugger makes the training process more transparent by automatically capturing real-time metrics, such as training and validation metrics, confusion matrices, and learning gradients, to help improve model accuracy (a minimal sketch follows below)
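The following minimal sketch, assuming the SageMaker Python SDK, shows how built-in Debugger rules and a debugger hook might be attached to a training job so tensors such as losses and gradients are saved and monitored in real time; the image URI, role, and S3 paths are placeholders.

```python
from sagemaker.estimator import Estimator
from sagemaker.debugger import (
    Rule,
    rule_configs,
    DebuggerHookConfig,
    CollectionConfig,
)

# Built-in rules flag common training problems while the job is running.
rules = [
    Rule.sagemaker(rule_configs.loss_not_decreasing()),
    Rule.sagemaker(rule_configs.vanishing_gradient()),
    Rule.sagemaker(rule_configs.overfit()),
]

# The hook config tells Debugger which tensor collections to save and where.
hook_config = DebuggerHookConfig(
    s3_output_path="s3://<your-bucket>/debugger-output/",   # placeholder path
    collection_configs=[
        CollectionConfig(name="losses", parameters={"save_interval": "50"}),
        CollectionConfig(name="gradients", parameters={"save_interval": "50"}),
    ],
)

estimator = Estimator(
    image_uri="<training-image-uri>",          # placeholder values
    role="<execution-role-arn>",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    rules=rules,
    debugger_hook_config=hook_config,
)

estimator.fit("s3://<your-bucket>/train/")     # placeholder S3 path
```

When a rule such as loss_not_decreasing triggers, the job's rule evaluation status changes, which you can inspect in SageMaker Studio or use to stop the job early.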

***To learn more about the services featured in this talk, please visit: https://console.aws.amazon.com/sagemaker
