KernelDensity malfunctions for some inputs
I am using KernelDensity with the following code to separate a 1D vector of numbers into groups based on their spatial distance:
import numpy as np
from numpy import array, linspace
from sklearn.neighbors import KernelDensity
import matplotlib.pyplot as plt
from scipy.signal import argrelextrema
v = [1, 10]  # example input vector (see below)
a = array(v).reshape(-1, 1)
kde = KernelDensity(kernel='gaussian', bandwidth=1).fit(a)
s = linspace(min(a), max(a))
e = kde.score_samples(s.reshape(-1, 1))  # log-density at the sample points
plt.plot(s, e)
mi = argrelextrema(e, np.less)[0]  # indices of strict local minima
print("Minima:", s[mi])
If I use v=[1,10] as the input vector, there is a minimum according to the figure below, and s[mi] returns a value:
Minima: [[5.40816327]]
However, for an input like v=[1000,2000], the figure still shows a minimum, yet s[mi] is empty, which is odd:
Minima: []
I also tested other inputs such as v=[10,20], v=[20,60], v=[10,60], and v=[30,60], and in each case s[mi] is empty even though there is clearly a minimum in the figure.
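For reference, here is a small diagnostic sketch (same setup as the code above, just with the failing input v=[1000,2000]) that prints the raw score_samples values. My assumption is that if the log-density between the two points collapses into a flat plateau (for example -inf after underflow), then argrelextrema with the strict comparator np.less cannot report a minimum there, whereas a non-strict comparator would flag those points:
import numpy as np
from numpy import array, linspace
from sklearn.neighbors import KernelDensity
from scipy.signal import argrelextrema
v = [1000, 2000]  # one of the inputs for which s[mi] comes back empty
a = array(v).reshape(-1, 1)
kde = KernelDensity(kernel='gaussian', bandwidth=1).fit(a)
s = linspace(min(a), max(a))
e = kde.score_samples(s.reshape(-1, 1))
print(e)  # inspect the log-density: does it flatten out between the two points?
print("Strict minima:", s[argrelextrema(e, np.less)[0]])
print("Non-strict minima:", s[argrelextrema(e, np.less_equal)[0]])
Note that np.less_equal is only used here as a check; on a plateau it flags every interior point, so it is not a drop-in fix.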
Any idea why this happens?
P.S.: The scikit-learn version is 1.0.2.
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.