2021-AIAC-QQ-Browser-Hyperparameter-Optimization-Rank6

Overview

2021 AIAC QQ Browser AI Algorithm Competition, Track 2 (Hyperparameter Optimization): Rank 3 in the preliminary round, Rank 6 in the final.

Competition website: https://algo.browser.qq.com/

Problem statement: In information-feed recommendation scenarios, the performance of a model or strategy commonly depends on hyperparameters, and setting those hyperparameters usually relies on manual, experience-driven tuning, which is inefficient, costly to maintain, and hard to push to better results. This competition therefore takes hyperparameter optimization as its theme: starting from a real business scenario, it evaluates each team's hyperparameter optimization algorithm on anonymized datasets. The task is a hyperparameter optimization (black-box optimization) problem: given the hyperparameter search space, each round the algorithm can query the reward of one set of hyperparameters, and within a limited number of iterations it must find hyperparameters whose reward is as large as possible. Teams are ranked by the maximum reward found.

The baseline comes mainly from Huawei's HEBO, with some parameter and code changes made for this competition. The officially provided code was also restructured slightly to make offline debugging easier.

Environment: Windows 10, Python 3.6, PyCharm 2020.1.1, and Git Bash for running the packaging script.

Main changes to the official code:

1. In thpo/run_search.py, the subprocess launch was replaced with a direct in-process call (added/modified code below):

# Original subprocess launch, commented out:
# run_cmd = common.PYTHONX + " ./thpo/run_search_one_time.py " + common.args_to_str(cur_args)
# Replaced with a direct in-process call so the search can be stepped through in a debugger:
args = common.parse_args(common.experiment_parser("description"))  # parse CLI arguments
searcher_root = args[common.CmdArgs.searcher_root]                 # directory of the searcher implementation
searcher = get_implement_searcher(searcher_root)                   # load the competitor's searcher
eva_func_list = args[common.CmdArgs.data]                          # evaluation functions (datasets)
repeat_num = args[common.CmdArgs.repear_num]                       # repeats per function (spelling follows the kit's CmdArgs)
err_code, err_msg = run_search_one_time(args, searcher, eva_func_list[0], repeat_num)

2. For the preliminary round, n_iteration was reduced to 10, i.e. 50 parameter configurations in total (5 per iteration). HEBO easily reaches 0.99+ offline, so shrinking the iteration budget leaves room for further optimization and keeps offline gains aligned with online score gains. A minimal sketch of the resulting loop follows.
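
For context, the evaluation protocol is a batched suggest/observe loop. The sketch below illustrates that loop under the reduced budget; Searcher and evaluate are illustrative stand-ins, not the official kit's API.

# Minimal illustration of the suggest/observe protocol under the reduced budget.
# All names here (Searcher, evaluate) are hypothetical stand-ins, not the kit's API.
import random

def evaluate(x):                                   # stand-in black-box reward
    return -sum((v - 0.3) ** 2 for v in x)

class Searcher:                                    # stand-in random searcher
    def suggest(self, n):
        return [[random.random() for _ in range(2)] for _ in range(n)]
    def observe(self, xs, rewards):
        pass                                       # a real searcher updates its model here

N_ITERATION = 10    # reduced budget; HEBO hits 0.99+ offline too easily otherwise
N_SUGGESTION = 5    # 10 iterations x 5 suggestions = 50 parameter sets in total

searcher = Searcher()
best_reward = float("-inf")
for _ in range(N_ITERATION):
    suggestions = searcher.suggest(N_SUGGESTION)   # propose a batch of hyperparameters
    rewards = [evaluate(x) for x in suggestions]   # query the black-box reward
    searcher.observe(suggestions, rewards)         # feed results back to the optimizer
    best_reward = max(best_reward, max(rewards))   # ranking uses the maximum reward found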

Changes to the HEBO code:

1. The code structure was reworked to fit this competition; see searcher.py for details.

2. In searcher.py, with name='gpy', the MACE acquisition was replaced by MOMeanSigmaLCB, and the iters parameter of EvolutionOpt was set to 25. For the final round, the deduplication code in check_unique was optimized. After a batch of candidate optima is obtained, a distance-based method selects a subset of them, which outperforms HEBO's original random selection; see the distance-related code, and the sketch after this list.

3. In bo/models/gp/gpy_wgp.py, Matern32 was changed to Matern52, the linear kernel was removed, and optimize_restarts was cut to a third of the original, with restarts set to one, i.e. a single optimization pass (see the kernel sketch below).
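
The distance-based selection in point 2 can be sketched as a greedy max-min pick: start from the top candidate, then repeatedly add the point farthest from everything already chosen. This is a minimal illustration of the idea, not HEBO's actual code; the function and variable names are made up.

import numpy as np

def select_by_distance(candidates, n_select):
    # Greedy max-min selection: keep the batch spread out instead of
    # sampling it at random. `candidates` is an (n, d) array, assumed
    # sorted so that index 0 is the best point by acquisition value.
    X = np.asarray(candidates, dtype=float)
    chosen = [0]                                   # always keep the top candidate
    while len(chosen) < min(n_select, len(X)):
        diff = X[:, None, :] - X[chosen][None, :, :]
        dist = np.linalg.norm(diff, axis=-1)       # (n, len(chosen)) pairwise distances
        nearest = dist.min(axis=1)                 # distance to the closest chosen point
        nearest[chosen] = -np.inf                  # never re-pick a chosen point
        chosen.append(int(nearest.argmax()))       # farthest-from-set candidate next
    return X[chosen]

batch = select_by_distance(np.random.rand(100, 4), n_select=5)  # toy usage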
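
Point 3 amounts to a simpler, faster GP surrogate. Below is a minimal GPy sketch of the resulting configuration; the toy data and hyperparameters are illustrative, and only the kernel choice and the single restart mirror the change described above.

import GPy
import numpy as np

X = np.random.rand(20, 3)                               # toy inputs
y = np.sin(X.sum(axis=1, keepdims=True))                # toy targets
kern = GPy.kern.Matern52(input_dim=3, ARD=True)         # Matern52, no added linear kernel
model = GPy.models.GPRegression(X, y, kern)
model.optimize_restarts(num_restarts=1, verbose=False)  # a single optimization pass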

Summary

The points above cover the main changes from the preliminary and final rounds; anything missed will be added if I remember it. Since I had not worked on hyperparameter optimization before, a great deal of time went into reading papers and code, and just as much into tuning; try.txt records that tuning process and its results. During the preliminary round I also tried the open-sourced entries from the NeurIPS 2020 black-box optimization challenge, spending the most time on TuRBO, which looked promising but performed poorly in practice. The changes above were driven mostly by one problem: when the HEBO code first ran, it scored 0.99 offline but 0.001 online, which turned out to be a timeout. Most of the tuning work was therefore about cutting runtime without losing accuracy, which gradually raised the score from 0.7+ to 0.95+. At the final leaderboard switch of the preliminary round, however, the run timed out again, dropping the online score to 0.899+ and finishing at rank 3.
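
Since both score drops came from timeouts, a per-round timer is a cheap offline guard. The wrapper below is a hypothetical sketch (it works with any object exposing a suggest method, such as the stand-in Searcher above), not part of the competition kit.

import time

def timed_suggest(searcher, n):
    # Wrap the suggest step with a timer; the score drops were timeouts,
    # so checking per-round latency offline catches the problem early.
    t0 = time.perf_counter()
    suggestions = searcher.suggest(n)
    elapsed = time.perf_counter() - t0
    print(f"suggest({n}) took {elapsed:.2f}s")  # keep well under the round time limit
    return suggestions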

In the final round the code was left largely unchanged, since many of the strategies tried did not work well. In the end no early-stop strategy was used. Online score: 0.712+.

The references below link to the open-source code used; the corresponding papers can be found from those repositories, and the details are covered in the papers.

References:

1. https://github.com/huawei-noah/HEBO/tree/master/HEBO

2. https://bbochallenge.com/leaderboard/

3. https://github.com/uber-research/TuRBO
