Abstract
Let $X_1, X_2, \cdots$ be independent, identically distributed random variables having a common probability density function $f$. Since the so-called kernel class of estimates $f_n$ of $f$ based on $X_1, \cdots, X_n$ was introduced by Rosenblatt [7], various convergence properties of these estimates have been studied. The strongest result in this direction is due to Nadaraya [5], who proved that if $f$ is uniformly continuous, then for a large class of kernels the estimates $f_n$ converge uniformly on the real line to $f$ with probability one. For a very general class of kernels, we show that these assumptions on $f$ are also necessary for this type of convergence: if $f_n$ converges uniformly to a function $g$ with probability one, then $g$ must be uniformly continuous, and the distribution $F$ from which we are sampling must be absolutely continuous with $F'(x) = g(x)$ everywhere. When, in addition to the conditions above, $f$ and its first $r + 1$ derivatives are assumed to be bounded, we show how to construct estimates $f_n$ such that $f^{(s)}_n$ converges uniformly to $f^{(s)}$ at a given rate with probability one for $s = 0, 1, \cdots, r$.
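For context, the kernel estimates referred to above are usually written in the following standard form (the notation here is the customary one and is not taken from the abstract; the paper's own definition may differ in detail):

```latex
f_n(x) = \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right),
```

where $K$ is a fixed kernel (typically a bounded, integrable function with $\int K(u)\,du = 1$) and $h_n$ is a bandwidth sequence with $h_n \to 0$, usually also satisfying $n h_n \to \infty$. Estimates of the derivatives $f^{(s)}$ are then commonly obtained by differentiating this expression $s$ times with respect to $x$, assuming $K$ is sufficiently smooth.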