Title: Multi-objective Hyperparameter Optimization of Deep Neural Networks

Abstract: Deep neural network models are full of hyperparameters. To obtain a good model, one must carefully experiment with hyperparameters such as the number of layers, the number of hidden nodes, the type of non-linearity, the learning rate, and the dropout rate, to name just a few. I will discuss general hyperparameter optimization algorithms, based on evolutionary strategies or Bayesian techniques, that automate this laborious process. Further, I will argue for the necessity of a multi-objective approach: we want models that are not only optimized for accuracy but also respect practical computational constraints such as model size and run-time. In the experiments, I will demonstrate how our algorithm discovers accurate and compact neural language models. This is joint work with Tomohiro Tanaka, Takafumi Moriya, and Takahiro Shinozaki at the Tokyo Institute of Technology, and Shinji Watanabe and Takaaki Hori at MERL.
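
To give a flavor of what a multi-objective hyperparameter search involves, the sketch below is a minimal, hypothetical illustration, not the algorithm presented in the talk: random candidate configurations are scored on two objectives (a toy validation-error surface and a parameter-count proxy for model size), and only the Pareto-optimal candidates, those not dominated on both objectives, are kept.

```python
# Minimal sketch of multi-objective hyperparameter search (assumed, illustrative only).
# Each candidate is scored on two objectives and the Pareto-optimal set is reported.
import random

def evaluate(config):
    """Placeholder objectives: (toy validation error, parameter-count proxy)."""
    hidden, layers, dropout = config["hidden"], config["layers"], config["dropout"]
    params = layers * hidden * hidden                          # rough proxy for model size
    error = 1.0 / (1.0 + 0.001 * params) + abs(dropout - 0.3)  # toy error surface
    return error, params

def dominates(a, b):
    """True if a is no worse than b in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """Keep candidates whose objective vectors are not dominated by any other candidate."""
    return [(c, s) for c, s in scored
            if not any(dominates(other, s) for _, other in scored if other != s)]

random.seed(0)
candidates = [{"hidden": random.choice([128, 256, 512, 1024]),
               "layers": random.choice([1, 2, 3]),
               "dropout": random.uniform(0.0, 0.5)}
              for _ in range(50)]
scored = [(c, evaluate(c)) for c in candidates]
for config, (err, size) in pareto_front(scored):
    print(f"error={err:.3f}  params={size:>8}  {config}")
```

In this toy setup the search is plain random sampling; an evolutionary or Bayesian strategy, as mentioned in the abstract, would instead propose new configurations informed by previously evaluated ones, while the Pareto-filtering step stays conceptually the same.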