Commit d481bab6 authored by Konstantinos Chatzilygeroudis

Quick n' dirty way to assert GP+HP Opt

parent 5991f6b6
@@ -39,6 +39,11 @@ Why am I getting "'NoLFOpt' should never be called!" assertion failure?
 Most probably, you are using the `BOptimizer` class and you have set an `hp_period` (the rate at which the hyperparameters are optimized) greater than 0, but you are using a Gaussian Process model with no hyperparameter optimization. This should never happen. So, if you do not want to optimize any hyperparameters, set the `hp_period` parameter to -1. On the other hand, if you want to use a Gaussian Process model that does optimize the hyperparameters, check :ref:`here <gp-hpopt>` for the available hyperparameter optimization options.
 
+Why am I getting "'XXXLFOpt' was never called!" assertion failure at the end of my program execution?
+-------------------------------------------------------------------------------------------------------
+
+Most probably, you are using the `BOptimizer` class and you have set an `hp_period` (the rate at which the hyperparameters are optimized) less than 1, but you are using a Gaussian Process model that optimizes the hyperparameters. This should never happen. If you want to use a Gaussian Process model that does optimize the hyperparameters, set the `hp_period` parameter to a value greater than 0. On the other hand, if you do not want to optimize any hyperparameters, set the `hp_period` parameter to -1 and use a Gaussian Process model that does not optimize the hyperparameters. Check :ref:`here <gp-hpopt>` for the available hyperparameter optimization options.
+
 Why C++11? (and not <insert your favorite language>)?
 -----------------------------------------------------
 
 We have specific needs that mainly revolve around high performance, minimizing boilerplate code, and easy interfacing with hardware and existing libraries:
......
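The two FAQ answers above describe the same consistency requirement from both directions: `hp_period` and the GP's hyper-parameter optimizer must agree. Below is a minimal sketch of the two valid pairings, following limbo's `BO_PARAM` conventions; the header paths, the kernel/mean choices, and the omitted `Params` sub-structs are assumptions to adapt from the documentation:

    #include <limbo/bayes_opt/boptimizer.hpp>
    #include <limbo/kernel/squared_exp_ard.hpp>
    #include <limbo/mean/data.hpp>
    #include <limbo/model/gp.hpp>
    #include <limbo/model/gp/kernel_lf_opt.hpp>

    struct Params {
        struct bayes_opt_boptimizer : public limbo::defaults::bayes_opt_boptimizer {
            BO_PARAM(int, hp_period, 10); // > 0: re-optimize the hyper-parameters
                                          // every 10 iterations; -1 disables this
        };
        // ... the remaining sub-structs (kernel, acquisition function, inner
        // optimizer, stopping criteria) are omitted here ...
    };

    // hp_period > 0 must be paired with a GP that optimizes its
    // hyper-parameters, e.g. through KernelLFOpt:
    using kernel_t = limbo::kernel::SquaredExpARD<Params>;
    using mean_t = limbo::mean::Data<Params>;
    using gp_opt_t = limbo::model::GP<Params, kernel_t, mean_t,
        limbo::model::gp::KernelLFOpt<Params>>;

    // hp_period == -1 must be paired with a GP that does not: the default
    // hyper-parameter optimizer is NoLFOpt (i.e. none).
    using gp_plain_t = limbo::model::GP<Params, kernel_t, mean_t>;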
@@ -71,7 +71,7 @@ namespace limbo {
         /// Do not forget to call this if you use hyper-parameters optimization!
         void optimize_hyperparams()
         {
-            HyperParamsOptimizer()(*this);
+            _hp_optimize(*this);
         }
 
         /// add sample and update the GP. This code uses an incremental implementation of the Cholesky
@@ -290,6 +290,8 @@ namespace limbo {
         double _lik;
 
+        HyperParamsOptimizer _hp_optimize;
+
         void _compute_obs_mean()
         {
             _mean_vector.resize(_samples.size(), _dim_out);
......
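The gp.hpp change replaces the temporary `HyperParamsOptimizer()` with the stored `_hp_optimize` member, and that is what makes the new "was never called" destructor checks below meaningful: a temporary only ever exists inside `optimize_hyperparams()`, so if that method is never invoked, no functor destructor would ever run to report it. With a member, exactly one functor lives and dies with the GP. A self-contained sketch of the idea (all names here are hypothetical stand-ins, not limbo's API):

    #include <cassert>
    #include <iostream>

    struct HpOptimizer {
        bool _called = false;

        template <typename Model>
        void operator()(Model&) { _called = true; }

        ~HpOptimizer()
        {
            if (!_called) {
                std::cerr << "'HpOptimizer' was never called!" << std::endl;
                assert(false); // compiled out under -DNDEBUG
            }
        }
    };

    struct Model {
        HpOptimizer _hp_optimize; // one optimizer per model, destroyed with it

        void optimize_hyperparams() { _hp_optimize(*this); }
    };

    int main()
    {
        Model used;
        used.optimize_hyperparams(); // check passes when `used` is destroyed

        Model unused; // never optimized: its destructor prints the warning
                      // and trips the assert (in debug builds) at end of scope
    }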
@@ -15,9 +15,19 @@ namespace limbo {
         template <typename Params, typename Optimizer = opt::ParallelRepeater<Params, opt::Rprop<Params>>>
         struct KernelLFOpt {
         public:
+            KernelLFOpt() : _called(false) {}
+            ~KernelLFOpt()
+            {
+                if (!_called) {
+                    std::cerr << "'KernelLFOpt' was never called!" << std::endl;
+                    assert(false);
+                }
+            }
+
             template <typename GP>
-            void operator()(GP& gp) const
+            void operator()(GP& gp)
             {
+                _called = true;
                 KernelLFOptimization<GP> optimization(gp);
                 Optimizer optimizer;
                 auto params = optimizer(optimization, (gp.kernel_function().h_params().array() + 6.0) / 7.0, true);
@@ -82,6 +92,8 @@ namespace limbo {
             protected:
                 const GP& _original_gp;
             };
+
+            bool _called;
         };
     }
 }
......
@@ -15,9 +15,19 @@ namespace limbo {
         template <typename Params, typename Optimizer = opt::ParallelRepeater<Params, opt::Rprop<Params>>>
         struct KernelMeanLFOpt {
         public:
+            KernelMeanLFOpt() : _called(false) {}
+            ~KernelMeanLFOpt()
+            {
+                if (!_called) {
+                    std::cerr << "'KernelMeanLFOpt' was never called!" << std::endl;
+                    assert(false);
+                }
+            }
+
             template <typename GP>
-            void operator()(GP& gp) const
+            void operator()(GP& gp)
             {
+                _called = true;
                 KernelMeanLFOptimization<GP> optimization(gp);
                 Optimizer optimizer;
                 int dim = gp.kernel_function().h_params_size() + gp.mean_function().h_params_size();
@@ -93,6 +103,8 @@ namespace limbo {
             protected:
                 const GP& _original_gp;
             };
+
+            bool _called;
         };
     }
 }
......
@@ -15,9 +15,19 @@ namespace limbo {
         template <typename Params, typename Optimizer = opt::ParallelRepeater<Params, opt::Rprop<Params>>>
         struct MeanLFOpt {
         public:
+            MeanLFOpt() : _called(false) {}
+            ~MeanLFOpt()
+            {
+                if (!_called) {
+                    std::cerr << "'MeanLFOpt' was never called!" << std::endl;
+                    assert(false);
+                }
+            }
+
             template <typename GP>
-            void operator()(GP& gp) const
+            void operator()(GP& gp)
             {
+                _called = true;
                 MeanLFOptimization<GP> optimization(gp);
                 Optimizer optimizer;
                 auto params = optimizer(optimization, (gp.mean_function().h_params().array() + 6.0) / 7.0, true);
@@ -72,6 +82,8 @@ namespace limbo {
             protected:
                 const GP& _original_gp;
             };
+
+            bool _called;
         };
     }
 }
......
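The ctor/dtor/flag boilerplate is repeated verbatim in `KernelLFOpt`, `KernelMeanLFOpt`, and `MeanLFOpt`, which fits the "quick n' dirty" label in the commit message. A hedged sketch of how the check could be factored into one helper (purely illustrative, not part of this commit):

    #include <cassert>
    #include <iostream>
    #include <string>
    #include <utility>

    // Hypothetical helper: each optimizer functor could hold one of these
    // instead of repeating the flag and destructor logic three times.
    struct CalledCheck {
        explicit CalledCheck(std::string name) : _name(std::move(name)), _called(false) {}
        ~CalledCheck()
        {
            if (!_called) {
                std::cerr << "'" << _name << "' was never called!" << std::endl;
                assert(false);
            }
        }
        void mark() { _called = true; }

        std::string _name;
        bool _called;
    };

    // Usage inside one of the functors (sketch):
    // struct KernelLFOpt {
    //     template <typename GP>
    //     void operator()(GP& gp) { _check.mark(); /* ... existing body ... */ }
    //     CalledCheck _check{"KernelLFOpt"};
    // };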