Time Series Ensembling with Optimized Learning Schedule

Zeyu Wang (王 澤宇)

(Advisor: Associate Professor Tomonari Sei / Mathematical Informatics 4th Laboratory)

Thesis PDF (WangZeyu.pdf)
Research Summary

Model ensembling is a powerful technique for increasing accuracy on a variety of machine learning tasks by combining several base prediction models. Current approaches to ensembling time series prediction models are mostly blending or simple out-of-sample stacking, i.e., an initial part of the base models' output is used to fit the meta-model and the remaining part is held out for testing. In this paper we discuss the importance of selecting a proper "look back" (how far back we look when fitting the stacking meta-model) and propose a new stacking framework that can optimize the learning schedule of stacked generalization.
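
As an illustration, here is a minimal sketch of out-of-sample stacking with an explicit look-back window. The function name, the ridge meta-model, and all parameter names are illustrative assumptions, not the thesis's actual implementation:

```python
import numpy as np
from sklearn.linear_model import Ridge

def stacked_forecast(past_base_preds, past_y, next_base_preds, look_back):
    """Fit a meta-model on the last `look_back` points, then combine the
    base models' next-step forecasts. (Illustrative sketch only.)

    past_base_preds : (T, M) out-of-sample predictions from M base models
    past_y          : (T,)  realized target values
    next_base_preds : (M,)  the base models' forecasts for the next step
    look_back       : how many recent points to use when fitting the meta-model
    """
    # Restrict the meta-model's training data to the look-back window.
    start = max(0, len(past_y) - look_back)
    meta = Ridge(alpha=1.0)  # simple linear meta-model (an assumed choice)
    meta.fit(past_base_preds[start:], past_y[start:])
    # Combine the base models' forecasts for the next step.
    return meta.predict(np.asarray(next_base_preds).reshape(1, -1))[0]
```

Setting `look_back = len(past_y)` recovers the usual "use all history" stacking; smaller values let the meta-model track recent regime changes, which is the trade-off the look-back choice controls.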
Reflections on the Thesis

My thesis is an empirical study of model ensembling for sequential data. In it I want to answer two main questions: for time series ensembling, is more training data always better, and is it possible to optimize the learning schedule? The empirical results can give us basic guidelines for dealing with time series ensembling in real production; a sketch of one such selection procedure follows below.
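
One straightforward way to probe both questions is to pick the look-back by rolling validation. The sketch below builds on the hypothetical `stacked_forecast` above and is only one possible procedure; the thesis's actual optimization may differ:

```python
def choose_look_back(past_base_preds, past_y, candidates, n_val=50):
    """Pick the look-back whose rolling one-step forecasts achieve the
    lowest squared error on the last `n_val` points. (Illustrative.)"""
    T = len(past_y)
    best_lb, best_err = None, float("inf")
    for lb in candidates:
        err = 0.0
        # Roll through the validation segment one step at a time.
        for t in range(T - n_val, T):
            pred = stacked_forecast(past_base_preds[:t], past_y[:t],
                                    past_base_preds[t], lb)
            err += (pred - past_y[t]) ** 2
        if err < best_err:
            best_lb, best_err = lb, err
    return best_lb
```

If the best candidate is consistently shorter than the full history, that is direct evidence that more training data is not always better for the meta-model.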

