{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "

# Parameter uncertainties with bootstrap resampling

" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib\n", "%matplotlib inline\n", "import matplotlib.pyplot as plt\n", "from scipy.stats import chi2" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Imagine we want to assess something about uncertainties in parameters using the bootstrap method. Here we will create a number of different data sets of the same length as the original data by drawing random sets from these data, with replacement, and doing the fits on these samples. For an example, we'll use a general polynomial function. You can write your own, but you can also use numpy.polyval() ; note that for polyval, the highest order term comes first in the parameter array" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# simulate a cubic function with equal uncertainties on all points\n", "npar= # how many parameters for a cubic?\n", "x= # set independent variable\n", "sig= # set uncertainties on points\n", "truepars= # set polynomial parameters\n", "y=np.polyval(truepars,x)+np.random.normal(0,sig,size=len(x)) # simulated data set = model + random error\n", "plt.errorbar(x,y,sig,fmt='o') # plot it\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We want to do a least squares fit to the data. What does the design matrix for a polynomial look like?\n", "\n", "Note that the least squares solution that accounts for uncertainties on the input points is given by:\n", "$$ (A^T\\Sigma^{-1}A) par = A^T\\Sigma^{-1} y$$\n", "Solving for the parameter vector:\n", "$$par = (A^T\\Sigma^{-1} A)^{-1} A^T\\Sigma^{-1} y$$\n", "where $A$ is the design matrix, and $\\Sigma$ is the covariance matrix with uncertainties on the points (not a sum!),\n", "which is a diagonal matrix if there are no covariances, with $\\sigma_i^2$ on the diagonal elements. 
When you did this before, you didn't use the covariance matrix, but instead put the data uncertainties into the design matrix; you can do that here too, as long as there are no covariances in the data, but I thought I'd give the general expression and encourage you to code it, since it is easy!\n", "
\n", "You can code the routine for the derivatives with respect to the parameters as you did before (simple for a polynomial), or use the numpy.vander() routine, which gives this for a polynomial function." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def deriv(x) :\n", " ''' Derivatives of a third-order polynomial with respect to its parameters;\n", " or just use the numpy.vander() routine, which gives the Vandermonde matrix for arbitrary polynomial order!\n", " '''\n", " return \n", "\n", "# get the least squares solution\n", "design= # design matrix \n", "# covariance matrix just has sig**2 on the diagonal if there are no covariances\n", "cov=np.diag(sig**2)\n", "# invert the covariance matrix\n", "invcov=np.linalg.inv(cov)\n", "# calculate the left-hand side matrix\n", "ATA=np.dot(np.dot(design.T,invcov),design)\n", "# invert this matrix\n", "err=np.linalg.inv(ATA)\n", "# now solve for the parameters\n", "pars=np.dot(err,np.dot(np.dot(design.T,invcov),y))\n", "\n", "# note that the inverse matrix gives the covariances of the parameters\n", "\n", "# print the fitted parameters, their uncertainties, and the true values, then plot the fit\n", "for i in range(len(pars)) : print('par, error, true: ',pars[i],np.sqrt(err[i,i]),truepars[i])\n", "plt.plot(x,y,'o')\n", "xmod=np.arange(0,10,0.01)\n", "plt.plot(xmod,np.polyval(pars,xmod))\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Does your fit look OK? Make sure it does before proceeding." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we will do some bootstrap resampling. To do this, we can make use of the [numpy.random.randint()](https://docs.scipy.org/doc/numpy-1.15.0/reference/generated/numpy.random.randint.html) function, which will return a set of $n$ integers drawn from a specified range, with replacement; for us the range will be the number of points, and we will draw this same number of samples."
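] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The returned integers can be used directly as a fancy index to build a resampled data set; here is a minimal sketch (assuming x and y are defined as above):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# use the drawn indices as a fancy index; duplicated indices give duplicated points\n", "j=np.random.randint(0,len(x),size=len(x))\n", "print(x[j][:5],y[j][:5])  # first few resampled points"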
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# example of randint: draws n integers from 0 to n-1, with replacement, so note\n", "# that some values will be duplicated\n", "n = len(x)\n", "print(np.random.randint(0,n,size=n))\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# now do a bootstrap resampling\n", "n=len(x)\n", "\n", "# loop over the number of bootstrap samples\n", "# save the parameters in allpar\n", "nboot=1000\n", "allpar=[]\n", "for iboot in range(nboot) :\n", " # create the bootstrap data set\n", " j=np.random.randint(0,n,size=n)\n", " xboot= # independent variable values of resampled data\n", " yboot= # dependent variable of resampled data\n", " sigboot= # uncertainties of resampled data\n", " \n", " # solve for the parameters using this data set \n", " design=np.vander(xboot,npar) \n", " invcov=np.linalg.inv(np.diag(sigboot**2))\n", " ATA=np.dot(np.dot(design.T,invcov),design)\n", " booterr=np.linalg.inv(ATA)\n", " bootpars=np.dot(booterr,np.dot(np.dot(design.T,invcov),yboot))\n", "\n", " plt.plot(xboot,yboot,'o')\n", " plt.plot(xmod,np.polyval(bootpars,xmod))\n", " allpar.append(bootpars)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's look at the distributions of the derived parameters." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "allpar=np.array(allpar)\n", "# allpar[:,i] now holds the nboot bootstrap values of parameter i\n", "\n", "# here's a gaussian routine so we can plot the least squares fit uncertainties for comparison\n", "def gauss(x,m,s) :\n", " return 1./np.sqrt(2*np.pi)/s*np.exp(-0.5*(x-m)**2/s**2)\n", "\n", "# plot the bootstrap distribution of each parameter\n", "# overplot the gaussian with the least squares fit uncertainty\n", "for i in range(npar) :\n", " plt.subplot(2,2,i+1)\n", " plt.hist(allpar[:,i],bins=nboot//10,density=True)\n", " 
xg=np.linspace(pars[i]-5*np.sqrt(err[i,i]),pars[i]+5*np.sqrt(err[i,i]),100)\n", " plt.plot(xg,gauss(xg,pars[i],np.sqrt(err[i,i])))\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "How do the bootstrap parameter distributions compare with the least squares fit values?\n", "
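"\n", "For a quantitative check (a sketch, assuming allpar, pars, and err from the cells above), compare the scatter of the bootstrap parameters with the least squares uncertainties:\n", "```python\n", "print('bootstrap std: ',np.std(allpar,axis=0))\n", "print('least squares: ',np.sqrt(np.diag(err)))\n", "```\n",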
"\n", " ANSWER HERE " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.11" } }, "nbformat": 4, "nbformat_minor": 2 }