I'm trying to implement a piecewise linear regression model. I've tried various variations on the model, but I haven't been able to get good results. My approach is to sample change points from a Categorical and use a MaskedMixture with a linear regression as one component distribution, and another MaskedMixture as the other component (i.e. the MaskedMixtures are nested as many times as there are pieces):

```python
def sample_slope_and_noise_std(piece):
    noise_std = pyro.sample('noise_std_{}'.format(piece), dist.LogNormal(0., 1.))
    ...

def piecewise_regression(x, y, n_pieces=3):
    """Piecewise regression where pieces are connected and the last piece has slope 0."""
    lines = []
    line_distributions = []
    for piece in pyro.plate('pieces', n_pieces):
        slope, noise_std = sample_slope_and_noise_std(piece)
        if piece == 0:
            line_distributions.append(
                dist.Normal(lines[-1][0] * x + lines[-1][1], lines[-1][2]))
        else:
            # choose the intercept so this piece connects to the previous one
            intercept = (prev_slope * x + prev_intercept) - slope * x
            lines.append((slope, intercept, noise_std))
            line_distributions.append(
                dist.MaskedMixture(mask, prev_dist,
                                   dist.Normal(lines[-1][0] * x + lines[-1][1], lines[-1][2])))
        # if all of x is masked, sample_change_point returns the last x
        change_point = sample_change_point(x, piece)
        change_point = min(change_point + change_points, N)
        ...
```

The guide has all the same distributions as the model. I want to enumerate the change points out with `TraceEnum_ELBO`, though it looks like that won't scale very well with more change points. But even with just one change point, I don't get a good fit, even though the ELBO converges:

```python
pyro.clear_param_store()
elbo = TraceEnum_ELBO(max_plate_nesting=1, num_particles=10)
svi = SVI(piecewise_regression, piecewise_regressionGuide, optim, loss=elbo)
...
pred = Predictive(piecewise_regression, guide=piecewise_regressionGuide, num_samples=100)
...
fit_obs_mean = fit.mean(0).detach().numpy()
fit_obs_std = fit.std(0).detach().numpy()
plt.errorbar(x, fit_obs_mean, yerr=fit_obs_std)
# plt.plot(x, slope_fit * x.numpy() + intercept_fit)
```

What can I do to make the fit more accurate? Is there a better way to implement this model? I've tried putting Dirichlet priors on the change points, and multiple restarts (which helps a little, but the best of 10 restarts still isn't great).

---

Hi, I'm not sure why your discrete model fails to converge, but I would guess a continuous parameterization would converge faster and more reliably, and would additionally scale to multiple pieces. Here's an attempt at a continuous parameterization:

```python
def piecewise_eval(knot_x, knot_y, x):
    ...

knot_y = knot_y  # this might not play well with batching
y_pred = piecewise_eval(knot_x, knot_y, x)
y_scale = pyro.sample("y_scale", dist.LogNormal(0, 5))
pyro.sample("obs", dist.Normal(y_pred, y_scale), obs=y)
```

---

Thanks for the answer! I tried implementing your suggestion:

```python
def piecewise_eval(knot_x, knot_y, x):
    ...

y_scale = pyro.sample("y_scale", dist.LogNormal(0, 1))
y = pyro.sample("obs", dist.Normal(y_pred, y_scale), obs=y)
```

and trained with an `AutoDiagonalNormal` guide.