Representation of a kernel-density estimate using Gaussian kernels.
Kernel density estimation is a way to estimate the probability density function (PDF) of a random variable in a non-parametric way. gaussian_kde works for both uni-variate and multi-variate data. It includes automatic bandwidth determination. The estimation works best for a unimodal distribution; bimodal or multi-modal distributions tend to be oversmoothed.
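As a quick illustration of the basic interface, the sketch below builds a univariate estimate from synthetic data; the sample, grid, and variable names are illustrative assumptions, not part of the original example:
>>> import numpy as np
>>> from scipy import stats
>>> sample = np.random.normal(size=500)        # hypothetical 1-D data
>>> kde = stats.gaussian_kde(sample)           # bandwidth chosen automatically
>>> grid = np.linspace(-4, 4, 200)             # points at which to evaluate the pdf
>>> density = kde(grid)                        # estimated pdf values on the grid
>>> density.shape
(200,)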
Parameters
dataset (array_like) -- Datapoints to estimate from. For univariate data this is a 1-D array, otherwise a 2-D array with shape (# of dims, # of data).
bw_method (str, scalar or callable, optional) -- The method used to calculate the estimator bandwidth. Defaults to Scott's Rule; see set_bandwidth for details.
Attributes
dataset (ndarray) -- The dataset with which gaussian_kde was initialized.
d (int) -- Number of dimensions.
n (int) -- Number of datapoints.
factor (float) -- The bandwidth factor, obtained from kde.covariance_factor, with which the covariance matrix is multiplied.
covariance (ndarray) -- The covariance matrix of dataset, scaled by the calculated bandwidth (kde.factor).
inv_cov (ndarray) -- The inverse of covariance.
Methods
evaluate(points) -- Evaluate the estimated pdf on a provided set of points.
__call__(points) -- Same as kde.evaluate(points).
integrate_gaussian(mean, cov) -- Multiply pdf with a specified Gaussian and integrate over the whole domain.
integrate_box_1d(low, high) -- Integrate pdf (1D only) between two bounds.
integrate_box(low_bounds, high_bounds) -- Integrate pdf over a rectangular space between low_bounds and high_bounds.
integrate_kde(other_kde) -- Integrate two kernel density estimates multiplied together.
resample(size=None) -- Randomly sample a dataset from the estimated pdf.
set_bandwidth(bw_method=None) -- Compute the estimator bandwidth, i.e. the coefficient that multiplies the data covariance matrix to obtain the kernel covariance matrix. New in version 0.11.0.
covariance_factor() -- Compute the coefficient (kde.factor) that multiplies the data covariance matrix to obtain the kernel covariance matrix. The default is scotts_factor. A subclass can overwrite this method to provide a different method, or set it through a call to kde.set_bandwidth.
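A minimal sketch of these methods on a univariate estimate; the data and the chosen bounds are illustrative assumptions:
>>> import numpy as np
>>> from scipy import stats
>>> data = np.random.normal(size=1000)                  # hypothetical 1-D sample
>>> kde = stats.gaussian_kde(data)
>>> total = kde.integrate_box_1d(-np.inf, np.inf)       # the estimated pdf integrates to 1
>>> prob = kde.integrate_box_1d(-1.0, 1.0)              # probability mass between -1 and 1
>>> new_points = kde.resample(size=200)                 # 200 draws from the estimated pdf
>>> new_points.shape
(1, 200)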
Notes
Bandwidth selection strongly influences the estimate obtained from the KDE (much more so than the actual shape of the kernel). Bandwidth selection can be done by a "rule of thumb", by cross-validation, by "plug-in methods" or by other means; see [3], [4] for reviews. gaussian_kde uses a rule of thumb; the default is Scott's Rule.
Scott's Rule [1], implemented as scotts_factor, is:
n**(-1./(d+4)),
with n the number of data points and d the number of dimensions. Silverman's Rule [2], implemented as silverman_factor, is:
(n * (d + 2) / 4.)**(-1. / (d + 4)).
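As a sanity check, one might verify that the default factor follows Scott's Rule and then switch to Silverman's Rule via set_bandwidth; the data below is an illustrative assumption:
>>> import numpy as np
>>> from scipy import stats
>>> data = np.random.normal(size=500)                   # d = 1, n = 500
>>> kde = stats.gaussian_kde(data)                      # default: Scott's Rule
>>> n, d = kde.n, kde.d
>>> np.allclose(kde.factor, n**(-1./(d + 4)))           # matches scotts_factor
True
>>> kde.set_bandwidth(bw_method='silverman')            # switch to Silverman's Rule
>>> np.allclose(kde.factor, (n * (d + 2) / 4.)**(-1. / (d + 4)))
True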
Good general descriptions of kernel density estimation can be found in [1] and [2]; the mathematics for this multi-dimensional implementation can be found in [1].
References
[1] D.W. Scott, "Multivariate Density Estimation: Theory, Practice, and Visualization", John Wiley & Sons, New York, Chichester, 1992.
[2] B.W. Silverman, "Density Estimation for Statistics and Data Analysis", Vol. 26, Monographs on Statistics and Applied Probability, Chapman and Hall, London, 1986.
[3] B.A. Turlach, "Bandwidth Selection in Kernel Density Estimation: A Review", CORE and Institut de Statistique, Vol. 19, pp. 1-33, 1993.
[4] D.M. Bashtannyk and R.J. Hyndman, "Bandwidth selection for kernel conditional density estimation", Computational Statistics & Data Analysis, Vol. 36, pp. 279-298, 2001.
Examples
Generate some random two-dimensional data:
>>> import numpy as np
>>> from scipy import stats
>>> def measure(n):
...     "Measurement model, return two coupled measurements."
...     m1 = np.random.normal(size=n)
...     m2 = np.random.normal(scale=0.5, size=n)
...     return m1+m2, m1-m2
>>> m1, m2 = measure(2000)
>>> xmin = m1.min()
>>> xmax = m1.max()
>>> ymin = m2.min()
>>> ymax = m2.max()
Perform a kernel density estimate on the data:
>>> X, Y = np.mgrid[xmin:xmax:100j, ymin:ymax:100j]
>>> positions = np.vstack([X.ravel(), Y.ravel()])
>>> values = np.vstack([m1, m2])
>>> kernel = stats.gaussian_kde(values)
>>> Z = np.reshape(kernel(positions).T, X.shape)
Plot the results:
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> ax = fig.add_subplot(111)
>>> ax.imshow(np.rot90(Z), cmap=plt.cm.gist_earth_r,
... extent=[xmin, xmax, ymin, ymax])
>>> ax.plot(m1, m2, 'k.', markersize=2)
>>> ax.set_xlim([xmin, xmax])
>>> ax.set_ylim([ymin, ymax])
>>> plt.show()
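Continuing the example above, the fitted kernel can also be queried directly; this brief sketch reuses the variables already defined, and the query point is an illustrative choice:
>>> density_at_origin = kernel([0.0, 0.0])                          # pdf value at the point (0, 0)
>>> mass_in_box = kernel.integrate_box([xmin, ymin], [xmax, ymax])  # close to 1: most mass lies in the plotted box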
Methods
__init__(dataset[, bw_method]) | Initialize a gaussian_kde estimator from a dataset, with an optional bandwidth method. |
covariance_factor() | Compute the coefficient (kde.factor) that multiplies the data covariance matrix to obtain the kernel covariance matrix. |
evaluate(points) | Evaluate the estimated pdf on a set of points. |
integrate_box(low_bounds, high_bounds[, maxpts]) | Computes the integral of a pdf over a rectangular interval. |
integrate_box_1d(low, high) | Computes the integral of a 1D pdf between two bounds. |
integrate_gaussian(mean, cov) | Multiply estimated density by a multivariate Gaussian and integrate over the whole space. |
integrate_kde(other) | Computes the integral of the product of this kernel density estimate with another. |
resample([size]) | Randomly sample a dataset from the estimated pdf. |
scotts_factor() | Compute Scott's factor. |
set_bandwidth([bw_method]) | Compute the estimator bandwidth with given method. |
silverman_factor() | Compute the Silverman factor. |
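Since covariance_factor can be overridden by a subclass (as noted above), a minimal sketch of fixing the bandwidth factor to a constant might look like this; the subclass name and the value 0.25 are illustrative only:
>>> import numpy as np
>>> from scipy import stats
>>> class fixed_factor_kde(stats.gaussian_kde):
...     """gaussian_kde with a hard-coded bandwidth factor (illustrative)."""
...     def covariance_factor(self):
...         return 0.25                      # constant factor instead of Scott's Rule
...
>>> data = np.random.normal(size=200)        # hypothetical 1-D sample
>>> kde = fixed_factor_kde(data)
>>> kde.factor
0.25
For a fixed scalar factor, the same effect can also be obtained without subclassing, via kde.set_bandwidth(bw_method=0.25).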