Parallelization of Automatic Tuning for Hyperparameter Optimization of Pedestrian Route Prediction Applications using Machine Learning

Sorataro Fujika, Yuga Yajima, Teruo Tanaka, A. Fujii, Yuka Kato, S. Ohshima, T. Katagiri
{"title":"Parallelization of Automatic Tuning for Hyperparameter Optimization of Pedestrian Route Prediction Applications using Machine Learning","authors":"Sorataro Fujika, Yuga Yajima, Teruo Tanaka, A. Fujii, Yuka Kato, S. Ohshima, T. Katagiri","doi":"10.1145/3578178.3578235","DOIUrl":null,"url":null,"abstract":"We study software automatic tuning. Automatic tuning tools using iterative one-dimensional search estimate hyperparameters of machine learning programs. Iterative one-dimensional search searches the parameter space consisting of possible values of the parameters to be tuned by repeatedly measuring and evaluating the target program. Since it takes time to train a machine learning program, estimating the optimal hyperparameters is time-consuming. Therefore, we propose a method to reduce the time required for automatic tuning by parallelization of iterative one-dimensional search. For parallelization, we use multiple job execution on a supercomputer that can utilize multiple GPUs, which is effective for machine learning. In this method, each job measures different hyperparameters. The next search point is determined by referring to the data obtained from each job. The target program is a pedestrian path prediction application. This program predicts future routes and arrival points based on past pedestrian trajectory data. The program is intended to be used in a variety of locations, and the locations and movement patterns will vary depending on the dataset used for training. We hypothesized that the estimation results of one dataset could be used for automatic tuning of another dataset, thereby reducing the time required for automatic tuning. Experimental results confirm that the parallelized iterative one-dimensional search reduces the estimation time from 89.5 hours to 4 hours compared to the sequential search. We also show that the iterative one-dimensional search efficiently investigates the point at which the performance index improves. 
Moreover, the hyperparameters estimated for one data set are used as the initial point for the search and automatic tuning for another data set. Compared to the results of automatic tuning with the currently used hyperparameters as the initial values, both the number of executions and execution time were reduced.","PeriodicalId":314778,"journal":{"name":"Proceedings of the International Conference on High Performance Computing in Asia-Pacific Region","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the International Conference on High Performance Computing in Asia-Pacific Region","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3578178.3578235","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

We study automatic software tuning. Automatic tuning tools based on iterative one-dimensional search estimate the hyperparameters of machine learning programs by repeatedly measuring and evaluating the target program over the parameter space of possible values for the parameters to be tuned. Because training a machine learning program takes time, estimating the optimal hyperparameters is time-consuming. We therefore propose a method that reduces the time required for automatic tuning by parallelizing the iterative one-dimensional search. For parallelization, we use multiple-job execution on a supercomputer that can utilize multiple GPUs, which is effective for machine learning: each job measures a different hyperparameter setting, and the next search point is determined from the data obtained across all jobs. The target program is a pedestrian route prediction application that predicts future routes and arrival points from past pedestrian trajectory data. Because the application is intended for use at a variety of locations, the locations and movement patterns vary with the dataset used for training. We hypothesized that the estimation results for one dataset could be used for the automatic tuning of another dataset, thereby further reducing tuning time. Experimental results confirm that the parallelized iterative one-dimensional search reduces the estimation time from 89.5 hours to 4 hours compared with the sequential search. We also show that the iterative one-dimensional search efficiently investigates the points at which the performance index improves. Moreover, when the hyperparameters estimated for one dataset are used as the initial search point for automatic tuning on another dataset, both the number of executions and the execution time are reduced compared with tuning that starts from the currently used hyperparameters as initial values.
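The paper itself gives no code, but the idea of a parallelized iterative one-dimensional search can be illustrated with a minimal sketch. All names below (`evaluate`, `SPACE`, `iterative_1d_search`) are hypothetical stand-ins: `evaluate` substitutes a toy analytic surface for actually training the route prediction model, and a thread pool stands in for the supercomputer's multiple-job execution, where each candidate value would really be submitted as a separate GPU batch job.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical objective: stands in for training the pedestrian route
# prediction model with a given config and returning its performance
# index (lower is better). Here it is a toy quadratic surface.
def evaluate(config):
    return (config["lr"] - 0.01) ** 2 + (config["batch"] - 64) ** 2

# Candidate values for each hyperparameter (the one-dimensional axes
# that the search sweeps one at a time).
SPACE = {
    "lr": [0.001, 0.005, 0.01, 0.05, 0.1],
    "batch": [16, 32, 64, 128],
}

def iterative_1d_search(space, init, n_workers=4, max_sweeps=5):
    """Coordinate-wise search: fix all parameters but one, sweep that
    parameter's candidate values, keep the best, and repeat for each
    parameter until a full sweep yields no improvement. The candidate
    evaluations within each sweep are independent, so they are run
    concurrently (one "job" per candidate value)."""
    best = dict(init)
    best_score = evaluate(best)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(max_sweeps):
            improved = False
            for name, candidates in space.items():
                # One config per candidate value of this parameter.
                configs = [{**best, name: v} for v in candidates]
                scores = list(pool.map(evaluate, configs))
                i = min(range(len(scores)), key=scores.__getitem__)
                if scores[i] < best_score:
                    best, best_score = configs[i], scores[i]
                    improved = True
            if not improved:
                break
    return best, best_score
```

Passing `init` also sketches the paper's warm-start idea: hyperparameters estimated on one dataset can be supplied as the initial point when tuning another dataset, so the search starts near a good region and needs fewer executions.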