Note

This tutorial was generated from an IPython notebook that can be downloaded here.

# Gaussian process models for stellar variability

theano version: 1.0.4
pymc3 version: 3.7
exoplanet version: 0.2.1.dev0


When fitting exoplanets, we also need to fit for the stellar variability, and Gaussian Processes (GPs) are often a good descriptive model for this variation. PyMC3 has support for all sorts of general GP models, but exoplanet includes support for scalable 1D GPs (see Scalable Gaussian processes in PyMC3 for more info) that can work with large datasets. In this tutorial, we go through the process of modeling the light curve of a rotating star observed by Kepler using exoplanet.

First, let’s download and plot the data:

import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

url = "https://archive.stsci.edu/missions/kepler/lightcurves/0058/005809890/kplr005809890-2012179063303_llc.fits"
with fits.open(url) as hdus:
    data = hdus[1].data
    hdr = hdus[1].header

# Work out the exposure time
texp = hdr["FRAMETIM"] * hdr["NUM_FRM"]
texp /= 60.0 * 60.0 * 24.0
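As a sanity check on the conversion above: for typical Kepler long-cadence values (FRAMETIM ≈ 6.02 seconds per frame and NUM_FRM = 270 co-added frames; illustrative numbers, not read from this particular file), the integration time comes out to roughly 27 minutes:

```python
# Illustrative Kepler long-cadence header values (not from this FITS file)
frametim = 6.02  # seconds per frame
num_frm = 270    # frames co-added per cadence

texp = frametim * num_frm   # integration time in seconds
texp /= 60.0 * 60.0 * 24.0  # convert seconds -> days

print(texp)  # 0.0188125 days, i.e. about 27 minutes
```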

x = data["TIME"]
y = data["PDCSAP_FLUX"]
yerr = data["PDCSAP_FLUX_ERR"]
m = (data["SAP_QUALITY"] == 0) & np.isfinite(x) & np.isfinite(y)

x = np.ascontiguousarray(x[m], dtype=np.float64)
y = np.ascontiguousarray(y[m], dtype=np.float64)
yerr = np.ascontiguousarray(yerr[m], dtype=np.float64)
mu = np.mean(y)
y = (y / mu - 1) * 1e3
yerr = yerr * 1e3 / mu
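The last few lines normalize the light curve: divide by the mean flux, subtract one, and scale to parts per thousand (ppt). A quick check with invented numbers:

```python
import numpy as np

# Synthetic flux values for illustration only
y = np.array([1000.0, 1001.0, 999.0, 1002.0, 998.0])
yerr = np.full_like(y, 0.5)

mu = np.mean(y)             # mean flux level
y_ppt = (y / mu - 1) * 1e3  # fractional deviation from the mean, in ppt
yerr_ppt = yerr * 1e3 / mu  # uncertainties scaled the same way

print(y_ppt)  # [ 0.  1. -1.  2. -2.] -- mean-subtracted, in ppt
```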

plt.plot(x, y, "k")
plt.xlim(x.min(), x.max())
plt.xlabel("time [days]")
plt.ylabel("relative flux [ppt]")
plt.title("KIC 5809890");

## A Gaussian process model for stellar variability

This looks like the light curve of a rotating star, and it has been shown that this kind of variability is well described by a quasiperiodic Gaussian process. To start, let’s estimate the rotation period using the Lomb-Scargle periodogram:
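As background, the core idea of a periodogram can be sketched in a few lines of plain numpy: evaluate the power |Σ y exp(2πi f t)|² on a grid of trial periods and look for the peak. This is a classical (Schuster) periodogram on synthetic data, a simplified stand-in for the Lomb-Scargle estimator used below, with all values invented for illustration:

```python
import numpy as np

# Synthetic signal with a 30-day period, unevenly sampled (illustrative)
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 300, 500))  # times [days]
y = np.sin(2 * np.pi * t / 30.0) + 0.1 * rng.standard_normal(len(t))

# Classical (Schuster) periodogram: power ~ |sum_n y_n exp(2*pi*i*t_n/P)|^2
periods = np.linspace(5, 100, 2000)
power = np.abs(np.exp(2j * np.pi * t / periods[:, None]) @ y) ** 2

print(periods[np.argmax(power)])  # close to the injected 30-day period
```

Lomb-Scargle refines this idea with a least-squares sinusoid fit that handles uneven sampling and measurement uncertainties more carefully.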

import exoplanet as xo

results = xo.estimators.lomb_scargle_estimator(
    x, y, max_peaks=1, min_period=5.0, max_period=100.0,
    samples_per_peak=50)

peak = results["peaks"][0]
freq, power = results["periodogram"]
plt.plot(1/freq, power, "k")
plt.axvline(peak["period"], color="k", lw=4, alpha=0.3)
plt.xlim((1/freq).min(), (1/freq).max())
plt.yticks([])
plt.xlabel("period [days]")
plt.ylabel("power");

Now, using this initialization, we can set up the GP model in exoplanet. We’ll use the exoplanet.gp.terms.RotationTerm kernel, which is a mixture of two simple harmonic oscillators with periods separated by a factor of two. As you can see from the periodogram above, this might be a good model for this light curve, and I’ve found that it works well in many cases.
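To see why a mixture of two SHO terms is a sensible match for a periodogram like the one above, it helps to look at the kernel’s power spectrum. The sketch below evaluates the standard power spectral density of a stochastically driven simple harmonic oscillator (the celerite convention) for two terms with periods P and P/2; the numbers are illustrative, not the tutorial’s fitted values:

```python
import numpy as np

def sho_psd(omega, S0, w0, Q):
    """PSD of a stochastically driven SHO (celerite convention)."""
    return (np.sqrt(2 / np.pi) * S0 * w0**4
            / ((omega**2 - w0**2)**2 + (w0 * omega / Q)**2))

period = 30.0  # days (illustrative)
f = np.linspace(0.005, 0.2, 5000)  # frequency grid [1/days]
omega = 2 * np.pi * f

# Two SHO terms with periods P and P/2, the structure of RotationTerm
psd = (sho_psd(omega, S0=1.0, w0=2 * np.pi / period, Q=10.0)
       + sho_psd(omega, S0=0.5, w0=4 * np.pi / period, Q=10.0))

print(f[np.argmax(psd)])  # near 1/30 per day
```

For large quality factors each term contributes a peak near its own frequency, so the model puts power at both the rotation period and its first harmonic, just like the periodogram above.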

import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:

    # The mean flux of the time series
    mean = pm.Normal("mean", mu=0.0, sd=10.0)

    # A jitter term describing excess white noise
    logs2 = pm.Normal("logs2", mu=2*np.log(np.min(yerr)), sd=5.0)

    # The parameters of the RotationTerm kernel
    logamp = pm.Normal("logamp", mu=np.log(np.var(y)), sd=5.0)
    BoundedNormal = pm.Bound(pm.Normal, lower=0.0, upper=np.log(50))
    logperiod = BoundedNormal("logperiod", mu=np.log(peak["period"]), sd=5.0)
    logQ0 = pm.Normal("logQ0", mu=1.0, sd=10.0)
    logdeltaQ = pm.Normal("logdeltaQ", mu=2.0, sd=10.0)
    mix = pm.Uniform("mix", lower=0, upper=1.0)

    # Track the period as a deterministic
    period = pm.Deterministic("period", tt.exp(logperiod))

    # Set up the Gaussian Process model
    kernel = xo.gp.terms.RotationTerm(
        log_amp=logamp,
        period=period,
        log_Q0=logQ0,
        log_deltaQ=logdeltaQ,
        mix=mix
    )
    gp = xo.gp.GP(kernel, x, yerr**2 + tt.exp(logs2), J=4)

    # Compute the Gaussian Process likelihood and add it into the
    # PyMC3 model as a "potential"
    pm.Potential("loglike", gp.log_likelihood(y - mean))

    # Compute the mean model prediction for plotting purposes
    pm.Deterministic("pred", gp.predict())

    # Optimize to find the maximum a posteriori parameters
    map_soln = xo.optimize(start=model.test_point)

optimizing logp for variables: ['mix_interval__', 'logdeltaQ', 'logQ0', 'logperiod_interval__', 'logamp', 'logs2', 'mean']
message: Optimization terminated successfully.
logp: 263.10222197490816 -> 692.0354844282873
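For reference, the quantity gp.log_likelihood computes is a GP marginal likelihood. The sketch below evaluates the same quantity the direct way, with a dense Cholesky factorization that costs O(N³); the point of exoplanet’s celerite solver is to get this in O(N) for its special kernel family. The kernel here is a generic quasi-periodic one as a stand-in, not the actual RotationTerm, and all values are synthetic:

```python
import numpy as np

def quasiperiodic_kernel(tau, amp, ell, period):
    """A generic quasi-periodic kernel (illustrative, not RotationTerm)."""
    return amp**2 * np.exp(-0.5 * (tau / ell)**2) * np.cos(2 * np.pi * tau / period)

def gp_log_likelihood(t, y, yerr, amp, ell, period):
    """Dense O(N^3) GP marginal log-likelihood via Cholesky factorization."""
    tau = t[:, None] - t[None, :]
    K = quasiperiodic_kernel(tau, amp, ell, period)
    K[np.diag_indices_from(K)] += yerr**2      # add measurement variance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)              # L^{-1} y
    return (-0.5 * (alpha @ alpha)
            - np.sum(np.log(np.diag(L)))       # -0.5 * log|K|
            - 0.5 * len(t) * np.log(2 * np.pi))

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 50))
y = np.sin(2 * np.pi * t / 30.0) + 0.1 * rng.standard_normal(len(t))
print(gp_log_likelihood(t, y, np.full(len(t), 0.1), amp=1.0, ell=50.0, period=30.0))
```

A kernel whose period matches the data yields a much higher marginal likelihood than a badly mismatched one, which is exactly what the sampler exploits.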


Now that we have the model set up, let’s plot the maximum a posteriori model prediction.

plt.plot(x, y, "k", label="data")
plt.plot(x, map_soln["pred"], color="C1", label="model")
plt.xlim(x.min(), x.max())
plt.legend(fontsize=10)
plt.xlabel("time [days]")
plt.ylabel("relative flux [ppt]")
plt.title("KIC 5809890; map model");

That looks pretty good! Now let’s sample from the posterior using exoplanet’s get_dense_nuts_step, which tunes a dense mass matrix for PyMC3’s NUTS sampler.

np.random.seed(42)
with model:
    trace = pm.sample(
        tune=2000, draws=2000, start=map_soln,
        step=xo.get_dense_nuts_step(target_accept=0.9))

Multiprocess sampling (4 chains in 4 jobs)
NUTS: [mix, logdeltaQ, logQ0, logperiod, logamp, logs2, mean]
Sampling 4 chains: 100%|██████████| 16000/16000 [04:09<00:00, 64.02draws/s]
There were 3 divergences after tuning. Increase target_accept or reparameterize.
There were 2 divergences after tuning. Increase target_accept or reparameterize.
There were 2 divergences after tuning. Increase target_accept or reparameterize.
There was 1 divergence after tuning. Increase target_accept or reparameterize.
The acceptance probability does not match the target. It is 0.818181889442642, but should be close to 0.9. Try to increase the number of tuning steps.
The number of effective samples is smaller than 25% for some parameters.


Now we can do the usual convergence checks:

pm.summary(trace, varnames=["mix", "logdeltaQ", "logQ0", "logperiod", "logamp", "logs2", "mean"])

                 mean        sd  mc_error     hpd_2.5   hpd_97.5        n_eff      Rhat
mix          0.645586  0.239069  0.004391    0.208831   0.999778  2553.935195  1.000761
logdeltaQ    1.796429  9.379053  0.180243  -16.062965  21.197027  2755.473848  1.000478
logQ0        0.575926  0.519596  0.008320   -0.408937   1.645487  3390.782583  1.000067
logperiod    3.340409  0.106580  0.002492    3.130645   3.584340  1868.531182  1.001348
logamp       0.385146  0.569892  0.010741   -0.570982   1.554741  2598.262177  1.002484
logs2       -4.965833  0.124512  0.001686   -5.213504  -4.734801  4475.435982  1.000166
mean        -0.023139  0.214047  0.003142   -0.425226   0.436275  4595.784068  1.000821
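The Rhat column is the Gelman-Rubin potential scale reduction factor, which compares between-chain and within-chain variance: values near 1 suggest the chains have converged to the same distribution. A minimal numpy version (the basic variant; PyMC3 uses a split-chain refinement) run on synthetic chains:

```python
import numpy as np

def gelman_rubin(chains):
    """R-hat for an (n_chains, n_samples) array (basic, non-split version)."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    chain_vars = chains.var(axis=1, ddof=1)
    B = n * chain_means.var(ddof=1)        # between-chain variance
    W = chain_vars.mean()                  # within-chain variance
    var_hat = (n - 1) / n * W + B / n      # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(1)
good = rng.standard_normal((4, 2000))   # four well-mixed chains
bad = good + np.arange(4)[:, None]      # chains stuck at different levels

print(gelman_rubin(good), gelman_rubin(bad))  # ~1.0 vs well above 1
```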

And plot the posterior distribution over rotation period:

period_samples = trace["period"]
bins = np.linspace(20, 45, 40)
plt.hist(period_samples, bins, histtype="step", color="k")
plt.yticks([])
plt.xlim(bins.min(), bins.max())
plt.xlabel("rotation period [days]")
plt.ylabel("posterior density"); 
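To quote a number from samples like these, a common choice is the posterior median with a percentile-based credible interval. The sketch below uses synthetic samples centered near exp(3.34) ≈ 28 days (the logperiod mean from the summary table above) as a stand-in for trace["period"]:

```python
import numpy as np

# Synthetic stand-in for trace["period"]; the real samples come from the trace
rng = np.random.default_rng(3)
period_samples = rng.normal(28.2, 3.0, 8000)

med = np.median(period_samples)
lo, hi = np.percentile(period_samples, [16, 84])  # ~1-sigma credible interval
print(f"P = {med:.1f} +{hi - med:.1f} -{med - lo:.1f} days")
```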