Let $X_1, \cdots, X_n$ be independent random variables, identically distributed with continuous distribution function $F$. Let $X_{(1)}, \cdots, X_{(n)}$ denote the corresponding order statistics and $F_n$ the empirical distribution function, which we take to be right continuous. Jung [4] found the asymptotic mean and variance of linear functions of the form
\begin{equation*}T_n = n^{-1} \sum_{i=1}^n J(i/n)X_{(i)} = \int_{-\infty}^{\infty} xJ(F_n(x))\, dF_n(x)\end{equation*}
when the function $J$ has four bounded derivatives. More recently, it has been shown ([1], [2]) that under suitable restrictions
\begin{equation*}\tag{1.1}\mathscr{L}\{n^{\frac{1}{2}}\lbrack T_n - \int_{-\infty}^{\infty} xJ(F(x))\, dF(x)\rbrack\} \rightarrow N(0, \sigma^2),\end{equation*}
where
\begin{equation*}\tag{1.2}\sigma^2 = 2\iint_{s < t} J(F(s))J(F(t))F(s)\lbrack 1 - F(t)\rbrack\, ds\, dt.\end{equation*}
Here $N(0, \sigma^2)$ denotes the normal distribution with mean zero and variance $\sigma^2$, and the arrow in (1.1) denotes convergence in distribution.

The purpose of this note is to give a self-contained proof of (1.1) under the assumptions that the $X_i$ have finite mean and that $J'$ exists and is continuous and of bounded variation, except at finitely many points where $J$ may jump. This theorem could be subsumed under a corrected version of Govindarajulu [2], which appeals to the results of [3]; the present proof is both more elementary and shorter. Chernoff, Gastwirth and Johns [1] do not require boundedness of $J$, but impose compensating assumptions on $F$; their methods of proof are quite different from those of [2] and of the present paper.
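For orientation, consider the simplest instance of (1.1), the case $J \equiv 1$, which reduces $T_n$ to the sample mean $\bar{X}_n$ and the centering constant $\int_{-\infty}^{\infty} xJ(F(x))\, dF(x)$ to the mean of $F$; by Hoeffding's covariance identity,
\begin{equation*}\sigma^2 = 2\iint_{s < t} F(s)\lbrack 1 - F(t)\rbrack\, ds\, dt = \iint \lbrack F(s \wedge t) - F(s)F(t)\rbrack\, ds\, dt = \operatorname{Var}(X_1),\end{equation*}
so that (1.1) becomes, when $\operatorname{Var}(X_1) < \infty$, the classical central limit theorem. The assumptions on $J$ also admit, for example, the $\alpha$-trimmed mean $(0 < \alpha < \frac{1}{2})$, for which $J(u) = (1 - 2\alpha)^{-1}$ on $(\alpha, 1 - \alpha)$ and $J(u) = 0$ elsewhere: such a $J$ has exactly two jumps, and away from them $J' = 0$ is trivially continuous and of bounded variation.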