Prior Parameters with MCMC + Extra Optimization Step #84
jejjohnson started this conversation in General · Replies: 1 comment, 2 replies
-
Priors are set here: pyextremes/src/pyextremes/models/distribution.py, lines 225 to 252 in ce530c4. A KDE kernel is used to find the maximum a posteriori (MAP) estimate for each parameter. Once the sampler is done, we have a large number of estimates for each of the distribution parameters (shape, loc, scale), and we need a representative number (the "central fit") for each. I use a Gaussian KDE to find the mode of each parameter's samples. These are shown as orange lines on the corner plot.
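The mode-finding step described above can be sketched as follows. This is a hedged illustration of the idea (fit a Gaussian KDE to one parameter's posterior samples, then locate the density peak with a derivative-free optimizer), not pyextremes' exact implementation; the function name `kde_mode` is made up for the example.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import minimize

def kde_mode(samples):
    """Estimate the mode ("central fit") of 1-D posterior samples.

    Sketch of the approach described above: fit a Gaussian KDE to the
    samples, then maximize the density with a derivative-free method
    (Nelder-Mead) by minimizing its negative.
    """
    kde = gaussian_kde(samples)
    result = minimize(
        lambda x: -kde(x)[0],   # negative density, so the minimum is the mode
        x0=np.median(samples),  # start from a robust central value
        method="Nelder-Mead",
    )
    return result.x[0]

# Example: skewed samples where mean, median, and mode all differ.
# The true mode of Gamma(shape=2, scale=1) is (shape - 1) * scale = 1.
rng = np.random.default_rng(0)
samples = rng.gamma(shape=2.0, scale=1.0, size=5000)
mode = kde_mode(samples)
```

In practice this would be run once per distribution parameter (shape, loc, scale) over the flattened sampler chain.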
-
I wanted to know where/how you set the prior parameters for the distribution when using the emcee algorithm? I could not find anything there except a note about an uninformative prior.
After the emcee sampler runs, I also noticed that you perform an optimization step on the parameters using a KDE kernel and a derivative-free optimization scheme. I'm curious whether there is a reason for this? I've tried fitting GEVD/GPD distributions before with numpyro, but I've always noticed that the fits with emcee seem to better follow the return-period observations. I wonder if it has something to do with this extra step… Any thoughts/comments?
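For context on the "uninformative prior" part of the question: with a flat prior, the log-posterior an emcee-style sampler sees reduces to the log-likelihood wherever the parameters are admissible. The sketch below shows that general pattern for a GEV fit; it is an assumption about the typical setup, not pyextremes' actual code, and `log_posterior` is a hypothetical name.

```python
import numpy as np
from scipy.stats import genextreme

def log_posterior(theta, data):
    """Log-posterior for a GEV fit under a flat (uninformative) prior.

    The flat prior contributes only a support constraint: parameter
    combinations with non-positive scale, or that put observations
    outside the distribution's support, get log-probability -inf.
    Otherwise the log-posterior equals the log-likelihood.
    """
    shape, loc, scale = theta
    if scale <= 0:
        return -np.inf
    logpdf = genextreme.logpdf(data, c=shape, loc=loc, scale=scale)
    if not np.all(np.isfinite(logpdf)):
        return -np.inf  # some observation fell outside the GEV support
    return logpdf.sum()

# Example: evaluate on synthetic block maxima drawn from a known GEV.
rng = np.random.default_rng(1)
data = genextreme.rvs(c=0.1, loc=10.0, scale=2.0, size=200, random_state=rng)
lp_good = log_posterior((0.1, 10.0, 2.0), data)   # finite log-likelihood
lp_bad = log_posterior((0.1, 10.0, -1.0), data)   # rejected: scale <= 0
```

A function of this shape is what would be handed to `emcee.EnsembleSampler` as the log-probability callable.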