Multi-task learning of GP meta-parameters #396
etienne-thuillier started this conversation in General
Hi,
I have a question which might be naive, as I am relatively new to Gaussian Processes.
I have a task in which acquiring any data point is costly, so we can only afford very sparse sampling. For this reason, I would like to leverage the relatively densely sampled records I have from past, related tasks to learn the meta-parameters of the Gaussian Process. More specifically, I would like to obtain a single set of meta-parameter values from the past tasks taken as a whole (all tasks at once). My understanding is that I should minimise the total negative log marginal likelihood, obtained by summing the individual negative log marginal likelihoods of the tasks.
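In symbols, with past tasks $t = 1, \dots, T$, per-task data $\mathcal{D}_t = (X_t, \mathbf{y}_t)$, and a single shared set of meta-parameters $\theta$, I believe this amounts to

$$
\theta^{\star} = \arg\min_{\theta} \sum_{t=1}^{T} \left[ -\log p\left(\mathbf{y}_t \mid X_t, \theta\right) \right].
$$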
A practical issue with this approach is that even though I have a single prior, it seems I have to define several posteriors in GPJax:
```python
posterior_1 = prior * likelihood_1
posterior_2 = prior * likelihood_2
...
```
where each likelihood is a Gaussian object specified for a distinct number of observed data points, e.g.
```python
likelihood_1 = gpx.Gaussian(num_datapoints=dataset_1.n)
```
where `dataset_1` is specific to task 1 and has a different data point count than `dataset_2`. This is puzzling to me because I now have several models (`posterior_1`, `posterior_2`, ...) to initialise, even though I want the meta-parameters to be shared across all tasks...
Am I missing something here?
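For concreteness, here is a minimal sketch of the joint objective I have in mind, written in plain JAX rather than through GPJax's model objects. The RBF kernel, the parameter names, the toy data, and the plain gradient-descent loop are all illustrative assumptions on my part, not GPJax API:

```python
import jax
import jax.numpy as jnp

def rbf_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential kernel on 1-D inputs.
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * jnp.exp(-0.5 * sq_dists / lengthscale**2)

def nlml(params, x, y):
    # Negative log marginal likelihood of a zero-mean GP with Gaussian noise.
    n = x.shape[0]
    K = rbf_kernel(x, x, params["lengthscale"], params["variance"])
    K = K + params["noise"] * jnp.eye(n)
    L = jnp.linalg.cholesky(K)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    # 0.5 * log|K| equals the sum of the log-diagonal of the Cholesky factor.
    return 0.5 * y @ alpha + jnp.sum(jnp.log(jnp.diag(L))) + 0.5 * n * jnp.log(2.0 * jnp.pi)

def total_nlml(params, tasks):
    # One shared parameter set; per-task objectives are simply summed.
    return sum(nlml(params, x, y) for (x, y) in tasks)

# Toy stand-ins for the past tasks, with different numbers of observations.
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
tasks = [
    (jnp.linspace(0.0, 1.0, 30), jax.random.normal(k1, (30,))),
    (jnp.linspace(0.0, 1.0, 50), jax.random.normal(k2, (50,))),
]

# Optimise unconstrained log-parameters so the kernel parameters stay positive.
log_params = {"lengthscale": jnp.log(0.5), "variance": jnp.log(1.0), "noise": jnp.log(0.1)}

def objective(log_params, tasks):
    params = {k: jnp.exp(v) for k, v in log_params.items()}
    return total_nlml(params, tasks)

grad_fn = jax.jit(jax.grad(objective))
for _ in range(200):
    grads = grad_fn(log_params, tasks)
    log_params = {k: v - 1e-2 * grads[k] for k, v in log_params.items()}
```

The point is that `nlml` is evaluated with the same `params` for every task, so the differing dataset sizes only enter through each task's own kernel matrix. I would like to achieve this same sharing with GPJax's posterior objects.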
With thanks!