
Errors when setting optimizer to None and varying hyperparameters in scikit-learn GP

I am working on a project that uses scikit-learn Gaussian process regression to make predictions. I first computed the predictions using the library's default optimizer, with a list of simple kernels whose hyperparameter bounds are fixed, as shown here:

from sklearn.gaussian_process.kernels import (RBF, RationalQuadratic,
                                              ExpSineSquared, DotProduct, Matern)

kernels = [1.0 * RBF(length_scale=1.0, length_scale_bounds=(1e-4, 10.0)),
           1.0 * RationalQuadratic(length_scale=1.0, alpha=0.1,
                                   length_scale_bounds=(1e-4, 100.0),
                                   alpha_bounds=(1e-3, 1e15)),
           1.0 * ExpSineSquared(length_scale=1.0, periodicity=3.0,
                                length_scale_bounds=(1e-4, 10.0),
                                periodicity_bounds=(1e-3, 1e5)),
           1.0 * (DotProduct(sigma_0=1.0, sigma_0_bounds=(1.1e-3, 100.0)) ** 2),
           1.0 * Matern(length_scale=1.0, length_scale_bounds=(1e-4, 100.0),
                        nu=1.5)]

And then:

from sklearn.gaussian_process import GaussianProcessRegressor

yvalues = []
for kernel in kernels:
    gp = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=30,
                                  alpha=0.1, normalize_y=True)
    gp.fit(etraininga, ctraining)                   # training inputs and targets
    y_pred, sigma = gp.predict(energyiter, return_std=True)
    yvalues.append(y_pred)

However, I really want to fix the hyperparameters manually. To do that I set the optimizer to None and then made lists of hyperparameter values to search for the combination with the highest log-marginal likelihood (LML). Since I had previously run the automated (optimized) search, I knew which parameters I was expecting to find. To my surprise, the manual search returned different parameters. Is there anything I haven't done properly when searching for the optimized hyperparameters manually?
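Roughly, the manual search I'm attempting looks like the sketch below (simplified to the RBF kernel only; the grids amplitudes and length_scales are placeholder values, not my actual lists):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Fit once with optimizer=None so the kernel hyperparameters stay at their
# initial values; this also caches the training data needed to evaluate the LML.
kernel = 1.0 * RBF(length_scale=1.0, length_scale_bounds=(1e-4, 10.0))
gp = GaussianProcessRegressor(kernel=kernel, optimizer=None,
                              alpha=0.1, normalize_y=True)
gp.fit(etraininga, ctraining)

# Candidate hyperparameters (placeholder grids): constant amplitude and RBF
# length scale.  log_marginal_likelihood() expects theta in *log* space,
# in the same order as kernel.theta.
amplitudes = np.logspace(-1, 2, 20)
length_scales = np.logspace(-3, 1, 40)

best_lml, best_params = -np.inf, None
for amp in amplitudes:
    for ls in length_scales:
        theta = np.log([amp, ls])          # log-transformed, like kernel.theta
        lml = gp.log_marginal_likelihood(theta)
        if lml > best_lml:
            best_lml, best_params = lml, (amp, ls)

print(best_params, best_lml)

(The log transform of theta is one place I could imagine a mismatch with the values reported by the built-in optimizer, but I'm not sure that's the issue here.)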

These are the forums for PythonAnywhere, an online development and hosting environment -- I think you might have better luck getting an answer to general Python development questions over at Stack Overflow.