Congratulations to Eduardo Segredo, whose algorithm has won two prizes at the annual GENOPT competition.
While comparing results on benchmark functions is a widely used way to demonstrate the competitiveness of global optimization algorithms, fixed benchmarks can encourage a negative data-mining process: a motivated researcher can "persecute" the algorithmic choices and parameters until the final designed algorithm "confesses" positive results on the specific benchmark.
To avoid this negative data-mining effect, the GENOPT contest makes a set of function generators available to participants for testing offline and online tuning schemes, then runs a final competition based on random seeds that are communicated only in the final phase.
A dashboard reflects the current ranking of the participants, who are encouraged to exchange preliminary results and opinions.
The High Jump prize is awarded to the submission that typically jumps higher than the others, according to the "Record at Checkpoints" criterion.
The Target Shooting prize is awarded to the submission whose runs typically achieve higher success rates in the quest for the true global minimum.
The Biathlon prize is awarded to the submission achieving the highest cumulative ranking across the two previous categories.
The winners will be celebrated at LION 2016 - the Learning and Intelligent OptimizatioN Conference - on Ischia Island (Napoli), Italy, 29 May - 1 June 2016.
The leaderboard is available at the GenOpt site. Eduardo's algorithm won the Biathlon and High Jump categories, as well as the overall Gold Medal.