Zero predicted variance while prediction differs from ground truth #486

Open
The-Alley opened this issue Nov 29, 2023 · 1 comment

@The-Alley

Hi,

First, I want to apologize in advance: the reason I am asking this question might simply be that I do not understand kriging properly.

I have been using Kriging, KPLS, and KPLSK for multi-output regression. I have noticed that as the number of training samples increases (more than 400 in my case), the predicted variances become zero even though the predictions do not completely match the ground truth. I know that the variance is zero at the training points, but the points I am trying to predict are not in the training dataset. Even when the prediction error is quite large, the predicted variance is still zero. I would appreciate it if someone could help me understand the reason for this. Thanks in advance.
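
For context, here is a minimal sketch of this kind of setup, assuming SMT's `KRG` API (`smt.surrogate_models.KRG`); the test function, dimensions, and sample sizes are illustrative, not the actual data behind this report:

```python
import numpy as np
from smt.surrogate_models import KRG

rng = np.random.default_rng(0)
xt = rng.uniform(-5.0, 5.0, size=(500, 1))   # more than 400 training points
yt = np.sin(xt) + 0.05 * xt**2               # illustrative stand-in for the true model

sm = KRG(print_global=False)
sm.set_training_values(xt, yt)
sm.train()

xv = rng.uniform(-5.0, 5.0, size=(100, 1))   # validation points, not in the training set
pred = sm.predict_values(xv)
var = sm.predict_variances(xv)

# Check whether the predicted variance collapses to zero away from the
# training data while the prediction error stays non-negligible.
print("max predicted variance:", float(var.max()))
print("max abs prediction error:", float(np.abs(pred - (np.sin(xv) + 0.05 * xv**2)).max()))
```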

The-Alley reopened this Nov 29, 2023
@relf
Member

relf commented Nov 29, 2023

Hi. First, what is the dimension of your training samples?
Your surrogates may suffer from overfitting, since you see the degradation as the number of samples grows.

If you have numerous samples, you may want to treat them as noisy, or even use a sparse GP surrogate, to avoid fitting the training data exactly.
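
A minimal sketch of the first suggestion, assuming the `eval_noise` option of SMT's `KRG` (the sparse GP alternative corresponds to the `SGP` surrogate available in recent SMT versions):

```python
from smt.surrogate_models import KRG

# Treat the training data as noisy: estimate a noise variance instead of
# interpolating every training sample exactly.
sm_noisy = KRG(eval_noise=True, print_global=False)
sm_noisy.set_training_values(xt, yt)        # xt, yt as in the snippet above
sm_noisy.train()

var_noisy = sm_noisy.predict_variances(xv)  # re-check the predicted variances with the noise-aware model
```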

relf added the question label Nov 30, 2023