dstats ~master (2018-01-23T10:54:54Z)

class KernelDensity1D {}

- cdf
`double cdf(double x)` Compute the cumulative density, i.e. the integral of the density from -infinity to x.

- cdfr
`double cdfr(double x)` Compute the cumulative density from the right-hand side, i.e. the integral of the density from x to infinity.

- opCall
`double opCall(double x)` Compute the probability density at a given point.

- fromAlias
`KernelDensity1D fromAlias(R range, double edgeBuffer = double.nan)` Construct a kernel density estimator from a kernel function supplied as a template alias parameter.

- fromCallable
`KernelDensity1D fromCallable(scope C kernel, R range, double edgeBuffer = double.nan)` Construct a kernel density estimation object from a callable object. R must be a range of numeric types. C must be a kernel function, delegate, or a class or struct with an overloaded opCall. The kernel function is assumed to be symmetric about zero, to take its maximum value at zero, and to be unimodal.
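
The requirements on the kernel (symmetric about zero, maximum at zero, unimodal) are satisfied by the standard kernels from the KDE literature. As a language-agnostic sketch, assuming nothing about dstats' internals, two such kernels look like this:

```python
import math

def gaussian_kernel(x, h=1.0):
    """Gaussian kernel with bandwidth h: symmetric, peaks at 0, unimodal."""
    return math.exp(-0.5 * (x / h) ** 2) / (h * math.sqrt(2 * math.pi))

def epanechnikov_kernel(x, h=1.0):
    """Epanechnikov kernel: compactly supported on [-h, h],
    also symmetric, maximal at 0, and unimodal."""
    u = x / h
    return 0.75 * (1.0 - u * u) / h if abs(u) < 1.0 else 0.0
```

Any callable with the same shape properties can be passed to fromCallable; an asymmetric or multimodal function would silently produce incorrect estimates.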

- fromDefaultKernel
`KernelDensity1D fromDefaultKernel(R range, double edgeBuffer = double.nan)` Construct a kernel density estimator using the default kernel, which is a Gaussian kernel with the Scott bandwidth.
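
For reference, Scott's rule of thumb for 1-D data is commonly stated as h = sigma_hat * n^(-1/5), where sigma_hat is the sample standard deviation. A sketch under that assumption (dstats' exact formula may differ in its constants):

```python
import math

def scott_bandwidth(data):
    """Scott's rule of thumb for 1-D KDE: h = sigma_hat * n**(-1/5).
    Illustrative sketch only; not dstats' implementation."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)  # sample variance
    return math.sqrt(var) * n ** (-1.0 / 5.0)
```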

Estimates densities in the 1-dimensional case. The 1-D case is treated separately because it is very common and enables significant optimizations that are not feasible in the general case.

Under the hood, this works by binning the data into a large number of bins (currently 1,000), convolving the binned counts with the kernel function to smooth them, and then using linear interpolation to evaluate density estimates. This produces results that differ from the textbook definition of kernel density estimation, but to an extent that is negligible in most cases. It also means that constructing this object is relatively expensive, but evaluating a density estimate afterward takes O(1) time.
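
The bin/convolve/interpolate pipeline described above can be sketched as follows. This is an illustrative reconstruction, not dstats' code: the bandwidth rule, kernel truncation radius, and edge handling are assumptions, and dstats' internals differ in detail.

```python
import math

def binned_kde(data, n_bins=1000):
    """Build a binned KDE over `data` and return an O(1) evaluator.

    Sketch of the approach: histogram the data, smooth the counts by
    discrete convolution with a sampled Gaussian kernel, then evaluate
    by linear interpolation between bin centers.
    """
    n = len(data)
    lo, hi = min(data), max(data)
    width = (hi - lo) / n_bins

    # 1. Bin the data into counts.
    counts = [0.0] * n_bins
    for x in data:
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1.0

    # 2. Convolve with a Gaussian kernel sampled at bin-width spacing,
    #    truncated at ~4 bandwidths (Scott's rule used as an example).
    mean = sum(data) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    h = sigma * n ** (-0.2)
    radius = max(1, int(4 * h / width))
    kern = [math.exp(-0.5 * ((k * width) / h) ** 2)
            for k in range(-radius, radius + 1)]
    dens = [0.0] * n_bins
    for i, c in enumerate(counts):
        if c == 0.0:
            continue
        for k in range(-radius, radius + 1):
            j = i + k
            if 0 <= j < n_bins:
                dens[j] += c * kern[k + radius]

    # Normalize so the density integrates to ~1 over [lo, hi].
    total = sum(dens) * width
    dens = [d / total for d in dens]

    # 3. Evaluate by linear interpolation between bin centers: O(1).
    def density(x):
        t = (x - lo) / width - 0.5  # position in bin-center units
        if t <= 0:
            return dens[0]
        if t >= n_bins - 1:
            return dens[-1]
        i = int(t)
        frac = t - i
        return dens[i] * (1 - frac) + dens[i + 1] * frac

    return density
```

Construction is O(n + n_bins * kernel_width) because of the convolution, while each subsequent `density(x)` call is a constant-time lookup plus interpolation, mirroring the cost profile described above.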