Hyperparameter optimization is the process of choosing a set of optimal hyperparameters for a learning algorithm. Hyperparameters control the learning process and define aspects such as the model architecture and the data preprocessing pipeline. Examples include the learning rate, batch size, choice of optimizer, and number of layers.
There are several strategies for searching the hyperparameter space, such as grid search and random search.
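To make the two strategies concrete, here is a minimal, self-contained Python sketch. Grid search enumerates every combination in the search space, while random search samples a fixed budget of configurations at random; the train_and_score function is a hypothetical stand-in for an actual training run, not code from this example.

# Minimal sketch of grid search vs. random search over a small
# hyperparameter space. train_and_score is a hypothetical stand-in
# for a real training-and-validation run.
import itertools
import random

SPACE = {
    "learning_rate": [0.1, 0.01, 0.001],
    "batch_size": [32, 64, 128],
}

def train_and_score(config):
    # Placeholder: a real implementation would train a model with
    # this config and return its validation metric.
    return random.random()

# Grid search: evaluate every combination (3 x 3 = 9 runs here).
grid = [dict(zip(SPACE, values)) for values in itertools.product(*SPACE.values())]
best_grid = max(grid, key=train_and_score)

# Random search: sample a fixed budget of configurations instead.
samples = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(4)]
best_random = max(samples, key=train_and_score)

print("grid search best:", best_grid)
print("random search best:", best_random)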
We ran hyperparameter optimization using two GPU instances, with two concurrent pods and a total of four completions:
runai submit hpo1 -i gcr.io/run-ai-demo/quickstart-hpo -g 2 \
  --parallelism 2 --completions 4 -v /root/runai-robin-test:/nfs
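The --parallelism and --completions flags mirror the Kubernetes Job fields of the same names: two pods run at a time until four runs have completed. Each pod then needs to pick a hyperparameter configuration no other pod has tried; one common pattern coordinates through the shared volume mounted at /nfs. The sketch below is a hypothetical illustration of that pattern, not the logic inside the quickstart-hpo image: each pod atomically creates a marker file to claim an untried grid point.

# Hypothetical sketch: each pod claims one untried configuration by
# atomically creating a marker file on the shared /nfs mount. Names,
# paths, and the grid itself are illustrative only.
import itertools
import json
import os

GRID = list(itertools.product([0.1, 0.01], [32, 64]))  # (lr, batch_size)
CLAIM_DIR = "/nfs/hpo-claims"  # shared volume from the -v flag above

def claim_config():
    os.makedirs(CLAIM_DIR, exist_ok=True)
    for idx, (lr, bs) in enumerate(GRID):
        marker = os.path.join(CLAIM_DIR, f"trial-{idx}")
        try:
            # O_CREAT | O_EXCL fails if another pod already created the
            # marker (atomicity over NFS depends on the NFS version/client).
            fd = os.open(marker, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        except FileExistsError:
            continue  # already claimed by another pod; try the next one
        with os.fdopen(fd, "w") as f:
            json.dump({"lr": lr, "batch_size": bs}, f)
        return lr, bs
    return None  # all four configurations already claimed

config = claim_config()
if config is not None:
    lr, batch_size = config
    print(f"training with lr={lr}, batch_size={batch_size}")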
Figure 5. Hyperparameter tuning with Run:ai Atlas