{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,17]],"date-time":"2026-02-17T12:13:09Z","timestamp":1771330389671,"version":"3.50.1"},"reference-count":84,"publisher":"Association for Computing Machinery (ACM)","issue":"11","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Proc. VLDB Endow."],"published-print":{"date-parts":[[2021,7]]},"abstract":"<jats:p>End-to-end AutoML, which automatically searches for ML pipelines in a space induced by feature engineering, algorithm\/model selection, and hyper-parameter tuning, has attracted intensive interest from both academia and industry. Existing AutoML systems, however, suffer from scalability issues when applied to application domains with large, high-dimensional search spaces. We present VOLCANOML, a scalable and extensible framework that facilitates systematic exploration of large AutoML search spaces. VOLCANOML introduces and implements basic building blocks that decompose a large search space into smaller ones, and allows users to utilize these building blocks to compose an <jats:italic>execution plan<\/jats:italic> for the AutoML problem at hand. VOLCANOML further supports a Volcano-style <jats:italic>execution model<\/jats:italic>, akin to the one supported by modern database systems, to execute the plan constructed. Our evaluation demonstrates that VOLCANOML not only raises the level of expressiveness for search space decomposition in AutoML, but also leads to decomposition strategies that are significantly more efficient than those employed by state-of-the-art AutoML systems such as auto-sklearn.<\/jats:p>","DOI":"10.14778\/3476249.3476270","type":"journal-article","created":{"date-parts":[[2021,10,27]],"date-time":"2021-10-27T16:46:23Z","timestamp":1635353183000},"page":"2167-2176","source":"Crossref","is-referenced-by-count":20,"title":["VolcanoML"],"prefix":"10.14778","volume":"14","author":[{"given":"Yang","family":"Li","sequence":"first","affiliation":[{"name":"Peking University and ETH Z\u00fcrich"}]},{"given":"Yu","family":"Shen","sequence":"additional","affiliation":[{"name":"Peking University"}]},{"given":"Wentao","family":"Zhang","sequence":"additional","affiliation":[{"name":"Peking University"}]},{"given":"Jiawei","family":"Jiang","sequence":"additional","affiliation":[{"name":"ETH Z\u00fcrich"}]},{"given":"Bolin","family":"Ding","sequence":"additional","affiliation":[{"name":"Alibaba Group"}]},{"given":"Yaliang","family":"Li","sequence":"additional","affiliation":[{"name":"Alibaba Group"}]},{"given":"Jingren","family":"Zhou","sequence":"additional","affiliation":[{"name":"Alibaba Group"}]},{"given":"Zhi","family":"Yang","sequence":"additional","affiliation":[{"name":"Peking University"}]},{"given":"Wentao","family":"Wu","sequence":"additional","affiliation":[{"name":"Microsoft Research"}]},{"given":"Ce","family":"Zhang","sequence":"additional","affiliation":[{"name":"ETH Z\u00fcrich"}]},{"given":"Bin","family":"Cui","sequence":"additional","affiliation":[{"name":"Peking University"}]}],"member":"320","published-online":{"date-parts":[[2021,10,27]]},"reference":[{"key":"e_1_2_1_1_1","doi-asserted-by":"publisher","DOI":"10.5555\/3042817.3042916"},{"key":"e_1_2_1_2_1","volume-title":"Azure machine learning. Microsoft Azure Essentials","author":"Barnes Jeff","year":"2015","unstructured":"Jeff Barnes. 2015. Azure machine learning. Microsoft Azure Essentials. 1st ed, Microsoft (2015).","edition":"1"},{"key":"e_1_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1145\/3097983.3098021"},{"key":"e_1_2_1_4_1","doi-asserted-by":"publisher","DOI":"10.5555\/2503308.2188395"},{"key":"e_1_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.5555\/2986459.2986743"},{"key":"e_1_2_1_6_1","unstructured":"Matthias Boehm, Iulian Antonov, Sebastian Baunsgaard, Mark Dokter, Robert Ginth\u00f6r, Kevin Innerebner, Florijan Klezin, Stefanie Lindstaedt, Arnab Phani, Benjamin Rath, et al. 2019. SystemDS: A declarative machine learning system for the end-to-end data science lifecycle. arXiv preprint arXiv:1909.02976 (2019)."},{"key":"e_1_2_1_7_1","unstructured":"Eric Breck, Neoklis Polyzotis, Sudip Roy, Steven Whang, and Martin Zinkevich. 2019. Data Validation for Machine Learning. In MLSys."},{"key":"e_1_2_1_8_1","doi-asserted-by":"publisher","DOI":"10.1016\/S0167-6377(98)00050-9"},{"key":"e_1_2_1_9_1","doi-asserted-by":"publisher","DOI":"10.1145\/1015330.1015432"},{"key":"e_1_2_1_10_1","doi-asserted-by":"publisher","DOI":"10.1145\/3205455.3205586"},{"key":"e_1_2_1_11_1","volume-title":"Pappa","author":"de S\u00e1 Alex G.C.","year":"2017","unstructured":"Alex G.C. de S\u00e1, Walter Jos\u00e9 G.S. Pinto, Luiz Otavio V.B. Oliveira, and Gisele L. Pappa. 2017. RECIPE: A grammar-based framework for automatically evolving classification pipelines. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)."},{"key":"e_1_2_1_12_1","doi-asserted-by":"publisher","DOI":"10.1145\/2949741.2949756"},{"key":"e_1_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.5555\/299068.299074"},{"key":"e_1_2_1_14_1","volume-title":"AutoML Workshop at ICML","author":"Drori Iddo","year":"2018","unstructured":"Iddo Drori, Yamuna Krishnamurthy, Remi Rampin, Raoni De, Paula Lourenco, Jorge Piazentin Ono, Kyunghyun Cho, Claudio Silva, and Juliana Freire. 2018. AlphaD3M: Machine Learning Pipeline Synthesis. AutoML Workshop at ICML (2018)."},{"key":"e_1_2_1_15_1","volume-title":"International Conference on Machine Learning AutoML Workshop.","author":"Efimova Valeria","year":"2017","unstructured":"Valeria Efimova, Andrey Filchenkov, and Viacheslav Shalamov. 2017. Fast Automated Selection of Learning Algorithm And its Hyperparameters by Reinforcement Learning. In International Conference on Machine Learning AutoML Workshop."},{"key":"e_1_2_1_16_1","volume-title":"NIPS workshop on Bayesian Optimization in Theory and Practice","volume":"10","author":"Eggensperger Katharina","year":"2013","unstructured":"Katharina Eggensperger, Matthias Feurer, Frank Hutter, James Bergstra, Jasper Snoek, Holger Hoos, and Kevin Leyton-Brown. 2013. Towards an empirical foundation for assessing bayesian optimization of hyperparameters. In NIPS workshop on Bayesian Optimization in Theory and Practice, Vol. 10. 3."},{"key":"e_1_2_1_17_1","volume-title":"International Conference on Machine Learning. PMLR, 1437--1446","author":"Falkner Stefan","year":"2018","unstructured":"Stefan Falkner, Aaron Klein, and Frank Hutter. 2018. BOHB: Robust and efficient hyperparameter optimization at scale. In International Conference on Machine Learning. PMLR, 1437--1446."},{"key":"e_1_2_1_18_1","doi-asserted-by":"publisher","DOI":"10.5555\/2969442.2969547"},{"key":"e_1_2_1_19_1","volume-title":"AutoML Workshop at ICML.","author":"Feurer Matthias","year":"2018","unstructured":"Matthias Feurer, Benjamin Letham, and Eytan Bakshy. 2018. Scalable meta-learning for bayesian optimization using ranking-weighted gaussian process ensembles. In AutoML Workshop at ICML."},{"key":"e_1_2_1_20_1","doi-asserted-by":"publisher","DOI":"10.5555\/1450931"},{"key":"e_1_2_1_21_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDE.2011.5767930"},{"key":"e_1_2_1_22_1","doi-asserted-by":"publisher","DOI":"10.1145\/3097983.3098043"},{"key":"e_1_2_1_23_1","unstructured":"Google. 2020. Google Prediction API. https:\/\/developers.google.com\/prediction."},{"key":"e_1_2_1_24_1","doi-asserted-by":"publisher","DOI":"10.1109\/69.273032"},{"key":"e_1_2_1_25_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.knosys.2020.106622"},{"key":"e_1_2_1_26_1","volume-title":"Multi-Fidelity Automatic Hyper-Parameter Tuning via Transfer Series Expansion. AAAI","author":"Hu Yi-Qi","year":"2019","unstructured":"Yi-Qi Hu, Yang Yu, Wei-Wei Tu, Qiang Yang, Yuqiang Chen, and Wenyuan Dai. 2019. Multi-Fidelity Automatic Hyper-Parameter Tuning via Transfer Series Expansion. AAAI (2019)."},{"key":"e_1_2_1_27_1","doi-asserted-by":"publisher","DOI":"10.5555\/3044805.3044891"},{"key":"e_1_2_1_28_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-642-25566-3_40"},{"key":"e_1_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.5555\/3360092"},{"key":"e_1_2_1_30_1","doi-asserted-by":"publisher","DOI":"10.1007\/s13218-015-0381-0"},{"key":"e_1_2_1_31_1","unstructured":"IBM. 2020. IBM Watson Studio AutoAI. https:\/\/www.ibm.com\/cloud\/watson-studio\/autoai."},{"key":"e_1_2_1_32_1","unstructured":"Kevin Jamieson and Ameet Talwalkar. 2016. Non-stochastic best arm identification and hyperparameter optimization. In Artificial Intelligence and Statistics. 240--248."},{"key":"e_1_2_1_33_1","doi-asserted-by":"publisher","DOI":"10.1023\/A:1008306431147"},{"key":"e_1_2_1_34_1","doi-asserted-by":"publisher","DOI":"10.5555\/3305381.3305567"},{"key":"e_1_2_1_35_1","doi-asserted-by":"publisher","DOI":"10.1109\/DSAA.2015.7344858"},{"key":"e_1_2_1_36_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDM.2016.0123"},{"key":"e_1_2_1_37_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDM.2017.31"},{"key":"e_1_2_1_38_1","volume-title":"32nd AAAI Conf. Artif. Intell. AAAI","author":"Khurana Udayan","year":"2018","unstructured":"Udayan Khurana, Horst Samulowitz, and Deepak Turaga. 2018. Feature engineering for predictive modeling using reinforcement learning. In 32nd AAAI Conf. Artif. Intell. AAAI 2018."},{"key":"e_1_2_1_39_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDMW.2016.0190"},{"key":"e_1_2_1_40_1","volume-title":"Proceedings of the 20th International Conference on Artificial Intelligence and Statistics. 528--536","author":"Klein Aaron","year":"2017","unstructured":"Aaron Klein, Stefan Falkner, Simon Bartels, Philipp Hennig, and Frank Hutter. 2017. Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics. 528--536."},{"key":"e_1_2_1_41_1","doi-asserted-by":"publisher","DOI":"10.25080\/Majora-14bd3278-006"},{"key":"e_1_2_1_42_1","doi-asserted-by":"publisher","DOI":"10.14778\/3229863.3240493"},{"key":"e_1_2_1_43_1","doi-asserted-by":"publisher","DOI":"10.14778\/2994509.2994514"},{"key":"e_1_2_1_44_1","doi-asserted-by":"publisher","DOI":"10.1007\/s41019-020-00149-7"},{"key":"e_1_2_1_45_1","volume-title":"Proceedings of the AutoML Workshop at ICML","volume":"2020","author":"LeDell Erin","year":"2020","unstructured":"Erin LeDell and S Poirier. 2020. H2o automl: Scalable automatic machine learning. In Proceedings of the AutoML Workshop at ICML, Vol. 2020."},{"key":"e_1_2_1_46_1","unstructured":"Nir Levine, Koby Crammer, and Shie Mannor. 2017. Rotting bandits. In Advances in NIPS. 3074--3083."},{"key":"e_1_2_1_47_1","volume-title":"Proceedings of the International Conference on Learning Representations","author":"Li Lisha","year":"2018","unstructured":"Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, and Ameet Talwalkar. 2018. Hyperband: A novel bandit-based approach to hyperparameter optimization. Proceedings of the International Conference on Learning Representations (2018), 1--48."},{"key":"e_1_2_1_48_1","doi-asserted-by":"publisher","DOI":"10.1145\/3187009.3177737"},{"key":"e_1_2_1_49_1","doi-asserted-by":"crossref","unstructured":"Yang Li, Jiawei Jiang, Jinyang Gao, Yingxia Shao, Ce Zhang, and Bin Cui. 2020. Efficient Automatic CASH via Rising Bandits. In AAAI. 4763--4771.","DOI":"10.1609\/aaai.v34i04.5910"},{"key":"e_1_2_1_50_1","volume-title":"Proceedings of the AAAI Conference on Artificial Intelligence","volume":"35","author":"Li Yang","year":"2021","unstructured":"Yang Li, Yu Shen, Jiawei Jiang, Jinyang Gao, Ce Zhang, and Bin Cui. 2021. MFESHB: Efficient Hyperband with Multi-Fidelity Quality Measurements. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35. 8491--8500."},{"key":"e_1_2_1_51_1","unstructured":"Yang Li, Yu Shen, Wentao Zhang, Yuanwei Chen, Huaijun Jiang, Mingchao Liu, Jiawei Jiang, Jinyang Gao, Wentao Wu, Zhi Yang, et al. 2021. OpenBox: A Generalized Black-box Optimization Service. arXiv preprint arXiv:2106.00421 (2021)."},{"key":"e_1_2_1_52_1","volume-title":"Tune: A Research Platform for Distributed Model Selection and Training. arXiv preprint arXiv:1807.05118","author":"Liaw Richard","year":"2018","unstructured":"Richard Liaw, Eric Liang, Robert Nishihara, Philipp Moritz, Joseph E Gonzalez, and Ion Stoica. 2018. Tune: A Research Platform for Distributed Model Selection and Training. arXiv preprint arXiv:1807.05118 (2018)."},{"key":"e_1_2_1_53_1","doi-asserted-by":"publisher","DOI":"10.1145\/3318464.3386126"},{"key":"e_1_2_1_54_1","doi-asserted-by":"crossref","unstructured":"Sijia Liu, Parikshit Ram, Djallel Bouneffouf, Gregory Bramble, Andrew R Conn, Horst Samulowitz, and Alexander G Gray. 2020. An ADMM Based Framework for AutoML Pipeline Configuration. (2020), 4892--4899.","DOI":"10.1609\/aaai.v34i04.5926"},{"key":"e_1_2_1_55_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10994-018-5735-z"},{"key":"e_1_2_1_56_1","doi-asserted-by":"publisher","DOI":"10.5555\/3291168.3291210"},{"key":"e_1_2_1_57_1","doi-asserted-by":"publisher","DOI":"10.1145\/3299869.3319874"},{"key":"e_1_2_1_58_1","doi-asserted-by":"publisher","DOI":"10.14778\/3407790.3407816"},{"key":"e_1_2_1_59_1","doi-asserted-by":"publisher","DOI":"10.5555\/3172077.3172240"},{"key":"e_1_2_1_60_1","volume-title":"TPOT: A tree-based pipeline optimization tool for automating machine learning. In Automated Machine Learning","author":"Olson Randal S","year":"2019","unstructured":"Randal S Olson and Jason H Moore. 2019. TPOT: A tree-based pipeline optimization tool for automating machine learning. In Automated Machine Learning. Springer, 151--160."},{"key":"e_1_2_1_61_1","doi-asserted-by":"publisher","DOI":"10.5555\/3294996.3295183"},{"key":"e_1_2_1_62_1","doi-asserted-by":"publisher","DOI":"10.14778\/3157794.3157797"},{"key":"e_1_2_1_63_1","doi-asserted-by":"publisher","DOI":"10.14778\/3137628.3137631"},{"key":"e_1_2_1_64_1","unstructured":"Microsoft Research. 2020. Microsoft NNI. https:\/\/github.com\/Microsoft\/nni."},{"key":"e_1_2_1_65_1","doi-asserted-by":"crossref","unstructured":"Kevin Schawinski et al. 2017. Generative Adversarial Networks recover features in astrophysical images of galaxies beyond the deconvolution limit. MNRAS Letters (2017).","DOI":"10.1093\/mnrasl\/slx008"},{"key":"e_1_2_1_66_1","volume-title":"Noisy blackbox optimization with multi-fidelity queries: A tree search approach. arXiv preprint arXiv:1810.10482","author":"Sen Rajat","year":"2018","unstructured":"Rajat Sen, Kirthevasan Kandasamy, and Sanjay Shakkottai. 2018. Noisy blackbox optimization with multi-fidelity queries: A tree search approach. arXiv preprint arXiv:1810.10482 (2018)."},{"key":"e_1_2_1_67_1","doi-asserted-by":"publisher","DOI":"10.1109\/JPROC.2015.2494218"},{"key":"e_1_2_1_68_1","doi-asserted-by":"publisher","DOI":"10.5555\/2999325.2999464"},{"key":"e_1_2_1_69_1","doi-asserted-by":"publisher","DOI":"10.5555\/2999792.2999836"},{"key":"e_1_2_1_70_1","volume-title":"International Conference on Machine Learning. PMLR, 9334--9345","author":"Takeno Shion","year":"2020","unstructured":"Shion Takeno, Hitoshi Fukuoka, Yuhki Tsukada, Toshiyuki Koyama, Motoki Shiga, Ichiro Takeuchi, and Masayuki Karasuyama. 2020. Multi-fidelity Bayesian optimization with max-value entropy search and its parallelization. In International Conference on Machine Learning. PMLR, 9334--9345."},{"key":"e_1_2_1_71_1","doi-asserted-by":"publisher","DOI":"10.1145\/2487575.2487629"},{"key":"e_1_2_1_72_1","doi-asserted-by":"publisher","DOI":"10.1145\/3219819.3220058"},{"key":"e_1_2_1_73_1","doi-asserted-by":"publisher","DOI":"10.1145\/2641190.2641198"},{"key":"e_1_2_1_74_1","doi-asserted-by":"publisher","DOI":"10.1145\/2939502.2939516"},{"key":"e_1_2_1_75_1","doi-asserted-by":"publisher","DOI":"10.5555\/2540128.2540383"},{"key":"e_1_2_1_76_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-319-46128-1_13"},{"key":"e_1_2_1_77_1","unstructured":"Jian Wu, Saul Toscano-Palmerin, Peter I Frazier, and Andrew Gordon Wilson. 2020. Practical multi-fidelity bayesian optimization for hyperparameter tuning. In Uncertainty in Artificial Intelligence. PMLR, 788--798."},{"key":"e_1_2_1_78_1","doi-asserted-by":"publisher","DOI":"10.1145\/3318464.3389743"},{"key":"e_1_2_1_79_1","doi-asserted-by":"publisher","DOI":"10.1145\/3318464.3389696"},{"key":"e_1_2_1_80_1","volume-title":"Taking human out of learning applications: A survey on automated machine learning. arXiv preprint arXiv:1810.13306","author":"Yao Quanming","year":"2018","unstructured":"Quanming Yao, Mengshuo Wang, Yuqiang Chen, Wenyuan Dai, Yu-Feng Li, Wei-Wei Tu, Qiang Yang, and Yang Yu. 2018. Taking human out of learning applications: A survey on automated machine learning. arXiv preprint arXiv:1810.13306 (2018)."},{"key":"e_1_2_1_81_1","unstructured":"M. Zaharia et al. 2018. Accelerating the Machine Learning Lifecycle with MLflow. IEEE Data Eng. Bull. (2018)."},{"key":"e_1_2_1_82_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDE48307.2020.00014"},{"key":"e_1_2_1_83_1","first-page":"1","article-title":"Snapshot boosting: a fast ensemble framework for deep neural networks","volume":"63","author":"Zhang Wentao","year":"2020","unstructured":"Wentao Zhang, Jiawei Jiang, Yingxia Shao, and Bin Cui. 2020. Snapshot boosting: a fast ensemble framework for deep neural networks. Science China Information Sciences 63, 1 (2020), 1--12.","journal-title":"Science China Information Sciences"},{"key":"e_1_2_1_84_1","volume-title":"Survey on automated machine learning. arXiv preprint arXiv:1904.12054","author":"Z\u00f6ller Marc-Andr\u00e9","year":"2019","unstructured":"Marc-Andr\u00e9 Z\u00f6ller and Marco F Huber. 2019. Survey on automated machine learning. arXiv preprint arXiv:1904.12054 (2019)."}],"container-title":["Proceedings of the VLDB Endowment"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.14778\/3476249.3476270","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,12,28]],"date-time":"2022-12-28T10:00:04Z","timestamp":1672221604000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.14778\/3476249.3476270"}},"subtitle":["speeding up end-to-end AutoML via scalable search space decomposition"],"short-title":[],"issued":{"date-parts":[[2021,7]]},"references-count":84,"journal-issue":{"issue":"11","published-print":{"date-parts":[[2021,7]]}},"alternative-id":["10.14778\/3476249.3476270"],"URL":"https:\/\/doi.org\/10.14778\/3476249.3476270","relation":{},"ISSN":["2150-8097"],"issn-type":[{"value":"2150-8097","type":"print"}],"subject":[],"published":{"date-parts":[[2021,7]]}}}