Commit 9ffa6599 authored by Antoine Cully's avatar Antoine Cully

[ci skip] minor improvement basic example

parent e39c3842
...@@ -2,7 +2,7 @@ This page presents benchmarks in which we compare the Bayesian optimization perf
Each library is given 200 evaluations (10 random samples + 190 function evaluations) to find the optimum of the hidden function. We compare both the accuracy of the obtained solution (difference from the actual optimum) and the time (wall-clock time) required by the library to run the optimization process. The results show that while the libraries generate solutions with similar accuracy (they are based on the same algorithm), **Limbo** generates these solutions significantly faster than BayesOpt.
In addition to comparing the performance of the libraries with their default parameter values (and evaluating **Limbo** with the same parameters as BayesOpt, see variant: limbo/bench_bayes_def), we also evaluate the performance of multiple variants of **Limbo**, including different acquisition functions (UCB or EI), different inner optimizers (CMAES or DIRECT), and whether or not the hyper-parameters of the model are optimized. In all these comparisons, **Limbo** is faster than BayesOpt (for similar results), even when BayesOpt does not optimize the hyper-parameters of the Gaussian processes.
Details
......
.. _bayesian_optimization:
Introduction to Bayesian Optimization (BO)
==========================================
......
...@@ -2,6 +2,8 @@
Basic Example
=================================================
If you are not familiar with the main concepts of Bayesian Optimization, a quick introduction is available :ref:`here <bayesian_optimization>`.
In this tutorial, we will explain how to create a new experiment in which a simple function ( :math:`-{(5 * x - 2.5)}^2 + 5`) is maximized.
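As a quick sanity check (a standalone sketch, not part of the tutorial's ``main.cpp``), the maximum of this function can be verified analytically: the parabola peaks where :math:`5x - 2.5 = 0`, i.e. at :math:`x = 0.5`, where it reaches :math:`5`:

.. code:: c++

    #include <cassert>
    #include <cmath>

    // The objective from this tutorial: f(x) = -(5x - 2.5)^2 + 5.
    // Its maximum is at x = 0.5, where f(0.5) = 5.
    double eval(double x) { return -std::pow(5.0 * x - 2.5, 2.0) + 5.0; }

    int main() {
        assert(eval(0.5) == 5.0);        // exact maximum
        assert(eval(0.0) < eval(0.25));  // the function increases toward x = 0.5
        assert(eval(0.500014) > 4.9999); // a point very close to the optimum
        return 0;
    }
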
Let's say we want to create an experiment called "myExp". The first thing to do is to create the folder ``exp/myExp`` under the limbo root. Then add two files:
...@@ -22,6 +24,8 @@ Next, copy the following content to the ``wscript`` file:
.. code:: python
from waflib.Configure import conf
    def options(opt):
        pass
...@@ -70,16 +74,26 @@ With this, we can declare the main function:
   :linenos:
   :lines: 114-123
The full ``main.cpp`` can be found `here <../../src/tutorials/basic_example.cpp>`_
Finally, from the root of limbo, run a build command, with the additional switch ``--exp myExp``: ::
   ./waf build --exp myExp
Then, an executable named ``myExp`` should be produced under the folder ``build/exp/myExp``.
When running this executable, you should see something similar to this:
.. literalinclude:: ./example_run_basic_example/print_test.dat

These lines show the result of each sample evaluation over the :math:`40` iterations (after the random initialization). In particular, we can see that the algorithm progressively converges toward the maximum of the function (:math:`5`) and that the maximum found is located at :math:`x = 0.500014`.
Running the executable also created a folder with a name composed of YOUCOMPUTERHOSTNAME-DATE-HOUR-PID. This folder should contain two files: ::
   limbo
   |-- YOUCOMPUTERHOSTNAME-DATE-HOUR-PID
       +-- samples.dat
       +-- aggregated_observations.dat
The file ``samples.dat`` contains the coordinates of the samples that have been evaluated during each iteration, while the file ``aggregated_observations.dat`` contains the corresponding observed values.
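As an illustration, the two-column format of ``aggregated_observations.dat`` (one ``<iteration> <observation>`` pair per line, with iteration ``-1`` marking the random initialization samples) can be processed with a few lines of code. This is a hypothetical helper, not part of limbo:

.. code:: c++

    #include <cassert>
    #include <sstream>
    #include <string>

    // Hypothetical parser for aggregated_observations.dat:
    // "<iteration> <observation>" per line, iteration -1 for the
    // random initialization samples. Returns the best (maximum)
    // observation recorded after initialization.
    double best_observation(std::istream& in) {
        std::string header;
        std::getline(in, header); // skip "#iteration aggregated_observation"
        double best = -1e30, obs;
        int iter;
        while (in >> iter >> obs)
            if (iter >= 0 && obs > best)
                best = obs;
        return best;
    }

    int main() {
        // A few lines in the same format as the file shown below.
        std::istringstream data(
            "#iteration aggregated_observation\n"
            "-1 2.70602\n"
            "0 4.99986\n"
            "1 4.99977\n"
            "19 5\n");
        assert(best_observation(data) == 5.0);
        return 0;
    }
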
#iteration aggregated_observation
-1 2.70602
-1 2.01091
-1 3.63208
-1 1.53741
-1 4.78237
-1 3.13115
-1 -1.21201
-1 4.44618
-1 -0.9999
-1 4.15864
0 4.99986
1 4.99977
2 4.99984
3 4.99984
4 4.99984
5 4.99983
6 4.99983
7 4.99983
8 4.99983
9 4.99982
10 4.99986
11 4.99987
12 4.99989
13 4.99991
14 4.99993
15 4.99995
16 4.99997
17 4.99999
18 4.99999
19 5
20 5
21 5
22 5
23 5
24 5
25 5
26 5
27 5
28 5
29 5
30 5
31 5
32 5
33 5
34 5
35 5
36 5
37 5
38 5
39 5
0 new point: 0.502378 value: 4.99986 best:4.99986
1 new point: 0.503035 value: 4.99977 best:4.99986
2 new point: 0.502521 value: 4.99984 best:4.99986
3 new point: 0.502533 value: 4.99984 best:4.99986
4 new point: 0.502556 value: 4.99984 best:4.99986
5 new point: 0.502585 value: 4.99983 best:4.99986
6 new point: 0.502618 value: 4.99983 best:4.99986
7 new point: 0.502643 value: 4.99983 best:4.99986
8 new point: 0.502646 value: 4.99983 best:4.99986
9 new point: 0.502673 value: 4.99982 best:4.99986
10 new point: 0.502383 value: 4.99986 best:4.99986
11 new point: 0.502262 value: 4.99987 best:4.99987
12 new point: 0.502111 value: 4.99989 best:4.99989
13 new point: 0.501921 value: 4.99991 best:4.99991
14 new point: 0.501679 value: 4.99993 best:4.99993
15 new point: 0.501383 value: 4.99995 best:4.99995
16 new point: 0.501055 value: 4.99997 best:4.99997
17 new point: 0.500751 value: 4.99999 best:4.99999
18 new point: 0.500517 value: 4.99999 best:4.99999
19 new point: 0.500358 value: 5 best:5
20 new point: 0.500256 value: 5 best:5
21 new point: 0.500189 value: 5 best:5
22 new point: 0.500145 value: 5 best:5
23 new point: 0.500114 value: 5 best:5
24 new point: 0.500092 value: 5 best:5
25 new point: 0.500075 value: 5 best:5
26 new point: 0.500063 value: 5 best:5
27 new point: 0.500054 value: 5 best:5
28 new point: 0.500046 value: 5 best:5
29 new point: 0.500039 value: 5 best:5
30 new point: 0.500035 value: 5 best:5
31 new point: 0.50003 value: 5 best:5
32 new point: 0.500027 value: 5 best:5
33 new point: 0.500024 value: 5 best:5
34 new point: 0.500022 value: 5 best:5
35 new point: 0.50002 value: 5 best:5
36 new point: 0.500018 value: 5 best:5
37 new point: 0.500016 value: 5 best:5
38 new point: 0.500015 value: 5 best:5
39 new point: 0.500014 value: 5 best:5
Best sample: 0.500014 - Best observation: 5
#iteration sample
-1 0.197082
-1 0.15422
-1 0.266084
-1 0.127839
-1 0.593302
-1 0.226588
-1 0.998478
-1 0.351162
-1 0.0101061
-1 0.316549
0 0.502378
1 0.503035
2 0.502521
3 0.502533
4 0.502556
5 0.502585
6 0.502618
7 0.502643
8 0.502646
9 0.502673
10 0.502383
11 0.502262
12 0.502111
13 0.501921
14 0.501679
15 0.501383
16 0.501055
17 0.500751
18 0.500517
19 0.500358
20 0.500256
21 0.500189
22 0.500145
23 0.500114
24 0.500092
25 0.500075
26 0.500063
27 0.500054
28 0.500046
29 0.500039
30 0.500035
31 0.50003
32 0.500027
33 0.500024
34 0.500022
35 0.50002
36 0.500018
37 0.500016
38 0.500015
39 0.500014
...@@ -70,7 +70,7 @@ Create a new experiment
   ./waf --create test
For more information about experiments in Limbo, see the :ref:`Framework guide <framework-guide>`
Edit the "Eval" function to define the function that you want to optimize
-------------------------------------------------------------------------
......