LG has created Auptimizer, a tool for optimising AI models. It is intended to support experimentation with AI models and the organisation of the associated data.
The tool is available on GitHub. In an accompanying research paper, the company explains that the tool simplifies producing multiple models with different configurations. Specifically, it tunes hyperparameters: values such as the learning rate that are fixed before a model is trained, rather than learned from the data. Auptimizer searches for good hyperparameter values by training and comparing models with different settings.
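To make that distinction concrete, here is a minimal sketch in plain Python (not Auptimizer's API) of hyperparameters being fixed up front, with each combination defining one candidate model:

```python
import itertools

# Hyperparameters are chosen before training starts; each combination
# defines one candidate model configuration. Names are illustrative.
search_space = {
    "learning_rate": [0.1, 0.01],
    "hidden_units": [32, 64],
}

# Enumerate every configuration in the grid: 2 x 2 = 4 candidate models,
# each of which would then be trained and compared.
keys = list(search_space)
configs = [dict(zip(keys, values))
           for values in itertools.product(*search_space.values())]

for config in configs:
    print(config)
```

Training then runs once per configuration, and the configuration whose trained model scores best is kept.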
Auptimizer uses a gradient-based architecture search, in which a controller generates a series of "child models". These models are then evaluated for accuracy, and the architectures that score highest are selected.
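The propose-evaluate-select loop can be illustrated with a toy sketch; the controller and accuracy function below are stand-ins, not Auptimizer's actual search code:

```python
import random

random.seed(0)

def propose_child():
    """Stand-in controller: samples a child architecture at random."""
    return {"layers": random.choice([2, 4, 8]),
            "width": random.choice([64, 128, 256])}

def evaluate(arch):
    """Stand-in for training and validating a child model; returns a
    mock accuracy that mildly favours larger architectures."""
    return 0.5 + 0.04 * arch["layers"] + 0.0005 * arch["width"]

# Generate a series of child models, score each, keep the most accurate.
children = [propose_child() for _ in range(8)]
best = max(children, key=evaluate)
```

A real search would train each child model to get its accuracy, and the controller would be updated to favour architectures like the winners.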
According to VentureBeat, the tool is easy to use: Auptimizer requires only a few lines of code, and users are guided through setting up configurations step by step. They can also switch between different hyperparameter algorithms and computing resources without rewriting their training scripts.
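The key point is that the training script stays fixed while the search algorithm is swapped around it. The sketch below shows that separation with hypothetical proposer functions; the names and interface are assumptions, not Auptimizer's real plug-in API:

```python
import random

random.seed(1)

def train(learning_rate):
    """The user's training script: unchanged regardless of which
    hyperparameter algorithm drives it. Returns a mock score."""
    return 1.0 - abs(learning_rate - 0.01)

# Interchangeable stand-ins for hyperparameter search algorithms.
def random_search(n):
    return [10 ** random.uniform(-4, 0) for _ in range(n)]

def grid_search(n):
    return [10 ** (-4 + 4 * i / (n - 1)) for i in range(n)]

# Swapping algorithms is a one-line change; train() is untouched.
algorithm = grid_search          # or: random_search
scores = [train(lr) for lr in algorithm(5)]
```

Because `train()` only receives hyperparameter values, any proposer with the same calling convention can drive it.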
The tool continuously checks for available resources and hyperparameter proposals, and runs jobs aimed at selecting the best model from a group. When a job completes, Auptimizer invokes a function that captures and stores the results.
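A dispatch loop of this shape can be sketched as follows; the resource pool, job function, and records are toy stand-ins for what Auptimizer's scheduler does internally:

```python
# Toy resource pool and proposal queue for the dispatch loop described
# above; this is an illustration, not Auptimizer's internal scheduler.
free_resources = ["cpu0", "cpu1"]
proposals = [{"lr": 0.1}, {"lr": 0.01}, {"lr": 0.001}]
results = []

def run_job(resource, params):
    """Stand-in training job: returns a mock score for the proposal."""
    return {"resource": resource, "params": params,
            "score": 1.0 - params["lr"]}

# Dispatch while a resource and a proposal are both available; when a
# job finishes, capture and store its result and free the resource.
while proposals:
    resource = free_resources.pop(0)
    record = run_job(resource, proposals.pop(0))
    results.append(record)           # capture-and-store step
    free_resources.append(resource)  # resource becomes available again

best = max(results, key=lambda r: r["score"])
```

In a real deployment the jobs would run asynchronously across the pool, but the bookkeeping, matching proposals to free resources and recording each finished job, is the same.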
For advanced algorithms whose scores must be matched with the specific input hyperparameters that produced them, Auptimizer performs this mapping automatically. The hyperparameter values are also stored in a file so they can be reused for certain tasks, and the supporting data is recorded for further tasks, such as the fine-tuning of certain models.
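The score-to-hyperparameter mapping and the store-for-reuse step can be sketched like this; the file name and record layout are illustrative assumptions, not Auptimizer's actual storage format:

```python
import json
import os
import tempfile

# Map each score back to the hyperparameter values that produced it.
trials = [({"lr": 0.1}, 0.82), ({"lr": 0.01}, 0.91)]
mapping = [{"hyperparameters": hp, "score": score} for hp, score in trials]

# Persist the best entry so it can be reloaded for later tasks.
path = os.path.join(tempfile.mkdtemp(), "best_hyperparameters.json")
best = max(mapping, key=lambda t: t["score"])
with open(path, "w") as f:
    json.dump(best, f)

# Reload later, e.g. when fine-tuning a model with the same settings.
with open(path) as f:
    reloaded = json.load(f)
```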
Finally, users can specify the resources to be used in test configurations, if necessary: processors, graphics chips, nodes, and cloud instances such as Amazon Web Services EC2. Auptimizer also works with existing tooling such as Boto3, the AWS SDK for Python.
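A resource section of an experiment configuration might look roughly like the dictionary below; the key names and values are hypothetical and chosen only to illustrate the idea, not Auptimizer's actual configuration schema:

```python
# Hypothetical resource section of an experiment configuration; key
# names are illustrative, not Auptimizer's actual schema.
experiment_config = {
    "name": "tune_lr",
    "resource": {
        "type": "aws",                 # alternatives: "cpu", "gpu", "node"
        "instance_type": "p2.xlarge",  # an EC2 GPU instance type
        "n_parallel": 2,               # jobs to run concurrently
    },
}
```

Pointing the same experiment at local GPUs instead of EC2 would then be a matter of editing this block, not the training code.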