
LG introduces Auptimizer, an open-source ML model optimization tool for efficient hyperparameter tuning at scale

  • 4 min read
  • 12 Nov 2019


Last week, researchers from LG’s Advanced AI team open-sourced Auptimizer, a general hyperparameter optimization (HPO) framework designed to help data scientists speed up machine learning model tuning.

What challenges Auptimizer aims to address


Hyperparameters are adjustable parameters that govern the training process of a machine learning model. They represent important properties of the model, for instance, the regularization penalty in a logistic regression classifier or the learning rate for training a neural network.

Tuning hyperparameters can be a very tedious task, especially when model training is computationally intensive. There are both open-source and commercial automated HPO solutions, such as Google AutoML, Amazon SageMaker, and Optunity, but using them at scale still poses challenges. In a paper explaining the motivation and system design behind Auptimizer, the team wrote, “But, in all cases, adopting new algorithms or accommodating new computing resources is still challenging.”

To address these challenges, the team built Auptimizer, which aims to automate the tedious tasks involved in building a machine learning model. The initial open-sourced version provides the following advantages:

Easily switch among different Hyperparameter Optimization algorithms without rewriting the training script


Getting started with Auptimizer only requires adding a few lines of code to your training script; Auptimizer then guides you through setting up all other experiment-related configurations. This lets users switch among different HPO algorithms and computing resources without rewriting their training script, which is one of the key hurdles in HPO adoption. Once set up, Auptimizer runs and records sophisticated HPO experiments for you.
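To give a sense of what “a few lines of code” means in practice, here is a minimal sketch of an instrumented training script. The names aup, BasicConfig, and print_result follow the examples in the project’s repository, but treat the exact API as an assumption for your version; the objective function here is a hypothetical stand-in for real model training.

```python
#!/usr/bin/env python
# Minimal sketch of an Auptimizer-instrumented training script.
# Assumes the aup package exposes BasicConfig and print_result as in
# the project's examples; verify against the version you install.
import sys

from aup import BasicConfig, print_result

# Auptimizer invokes the script with a config file holding the
# proposed hyperparameter values for this trial.
config = BasicConfig().load(sys.argv[1])

# Hypothetical objective: replace with real training and evaluation.
score = (config.learning_rate - 0.01) ** 2

# Report the result back so the HPO algorithm can update its history.
print_result(score)
```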

Orchestrating compute resources for faster hyperparameter tuning


Users can specify the resources to be used in the experiment configuration, including processors, GPUs, nodes, and public cloud instances such as Amazon Web Services EC2. Auptimizer keeps track of these resources in a persistent database and queries it to check whether the resources specified by the user are available. If a resource is available, Auptimizer takes it for job execution; if not, the system waits until it is free. Auptimizer is also compatible with existing resource management tools such as Boto3.
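The sketch below illustrates what such an experiment configuration might look like. The field names are assumptions based on the paper’s description, not the project’s verbatim schema, so check the Auptimizer documentation for the exact keys.

```python
import json

# Hypothetical experiment configuration illustrating how resources and
# the HPO algorithm might be declared; field names are assumptions.
experiment = {
    "script": "train.py",      # the instrumented training script
    "proposer": "random",      # HPO algorithm to use
    "resource": "gpu",         # CPUs, GPUs, nodes, or AWS EC2 instances
    "n_parallel": 2,           # jobs Auptimizer may run concurrently
    "n_samples": 20,           # total hyperparameter configurations to try
    "parameter_config": [
        {"name": "learning_rate", "range": [1e-4, 1e-1], "type": "float"}
    ],
}

with open("experiment.json", "w") as f:
    json.dump(experiment, f, indent=2)
```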

A single interface to various sophisticated HPO algorithms


The current Auptimizer implementation provides a “single seamless access point to top-notch HPO algorithms” such as Spearmint, Hyperopt, Hyperband, and BOHB, and also supports simple random search and grid search. Users can integrate their own proprietary solutions as well, and switch between different HPO algorithms with minimal changes to their existing code.
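In practice, switching algorithms comes down to changing a single entry in the experiment configuration while the training script stays untouched. The snippet below sketches that idea, continuing the hypothetical configuration from above; the "proposer" key and the module-style launch command are assumptions based on the project’s documentation.

```python
import json

# Load the existing experiment configuration and swap the HPO algorithm;
# only the "proposer" field changes, the training script is untouched.
with open("experiment.json") as f:
    experiment = json.load(f)

experiment["proposer"] = "hyperband"  # was "random"; "spearmint",
                                      # "hyperopt", or "bohb" would work too

with open("experiment.json", "w") as f:
    json.dump(experiment, f, indent=2)

# The experiment is then relaunched the same way as before, e.g.:
#   python -m aup experiment.json
```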

The following table shows the techniques currently supported by Auptimizer:

[Table: HPO techniques currently supported by Auptimizer. Source: LG]



How Auptimizer works


The following figure shows the system design for Auptimizer:

[Figure: Auptimizer system design. Source: LG]


The key components of Auptimizer are the Proposer and the Resource Manager. The Proposer interface defines two functions: ‘get_param()’, which returns new hyperparameter values, and ‘update()’, which updates the history. The Resource Manager is responsible for automatically connecting compute resources to model training when they become available. Its ‘get_available()’ function acts as the interface between Auptimizer and typical resource management and job scheduling tools, while its ‘run()’ function, as the name suggests, executes the provided code.
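To make the Proposer interface concrete, here is a sketch of a custom proposer implementing the two functions described above. The class name, constructor, and any registration mechanism are assumptions for illustration; consult the Auptimizer source for the real extension points.

```python
import random


class RandomIntervalProposer:
    """Sketch of a proposer that samples a learning rate uniformly
    at random from a fixed interval."""

    def __init__(self, low=1e-4, high=1e-1):
        self.low = low
        self.high = high
        self.history = []  # (hyperparameters, result) pairs seen so far

    def get_param(self):
        # Return the next hyperparameter values to evaluate.
        return {"learning_rate": random.uniform(self.low, self.high)}

    def update(self, params, result):
        # Record the outcome; a random proposer only stores it, but a
        # smarter algorithm would use the history to adapt its proposals.
        self.history.append((params, result))
```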

To enable reproducibility, Auptimizer tracks all experiment history in a user-specified database. Users can visualize results from this history with a basic visualization tool that comes integrated with Auptimizer, and for further analysis they can directly access the results stored in the database.
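Direct access could look like the following sketch, assuming the database backend is SQLite; the file path, table name, and column here are hypothetical placeholders, so inspect your own database schema before querying.

```python
import sqlite3

# Hedged sketch of reading recorded experiment results directly.
# The path ".aup/sqlite3.db" and the table/column names are assumptions.
conn = sqlite3.connect(".aup/sqlite3.db")
for row in conn.execute("SELECT * FROM job ORDER BY jid"):
    print(row)  # one recorded trial per row
conn.close()
```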

Sharing the future vision for Auptimizer, the team wrote, “As development progress, Auptimizer will support the end-to-end development cycle for building models for edge devices including robust support for model compression and neural architecture search.”

This article gave you a basic introduction to Auptimizer. Check out the paper, Auptimizer - an Extensible, Open-Source Framework for Hyperparameter Tuning, and the GitHub repository for more details.

Facebook open-sources Hyperparameter autotuning for fastText to automatically find best hyperparameters for your dataset

PyTorch announces the availability of PyTorch Hub for improving machine learning research reproducibility

Training Deep Convolutional GANs to generate Anime Characters [Tutorial]