Abstract

Given paired observations $(x_i, v_{1i}, v_{2i}, \ldots, v_{pi}, t_{1i}, t_{2i}, \ldots, t_{qi}, y_i)$, $i = 1, 2, \ldots, n$, that follow the additive semiparametric regression model $y_i = \mu(x_i, v_i, t_i) + \varepsilon_i$, where

$$\mu(x_i, v_i, t_i) = f(x_i) + \sum_{j=1}^{p} g_j(v_{ji}) + \sum_{s=1}^{q} h_s(t_{si}),$$

$v_i = (v_{1i}, v_{2i}, \ldots, v_{pi})'$, and $t_i = (t_{1i}, t_{2i}, \ldots, t_{qi})'$. The random errors $\varepsilon_i$ are normally distributed with mean $0$ and variance $\sigma^2$. To obtain a mixed estimator of $\mu(x_i, v_i, t_i)$, the regression curve $f(x_i)$ is approximated by a linear parametric function, $g_j(v_{ji})$ is approximated by a kernel estimator with bandwidths $\Phi = (\varphi_1, \varphi_2, \ldots, \varphi_p)'$, and the regression curve component $h_s(t_{si})$ is approximated by a Fourier series with oscillation parameter $N$. The Penalized Least Squares (PLS) method gives

$$\min_{c,\beta}\left\{ L(c) + L(\beta) + \sum_{s=1}^{q} \theta_s\, S\bigl(h_s(t_{si})\bigr) \right\}$$

with smoothing parameters $\theta = (\theta_1, \theta_2, \ldots, \theta_q)'$. Solving this problem yields the estimator of $f(x)$ together with the kernel and Fourier series component estimators, so that

$$\hat{\mu}_{\Phi,\theta,N}(x_i, v_i, t_i) = Z(\Phi, \theta, N)\, y$$

is the mixed estimator of $\mu(x_i, v_i, t_i)$, where $Z(\Phi, \theta, N) = C(\Phi, \theta, N) + V(\Phi) + E(\Phi, \theta, N)$. The matrices $C(\Phi, \theta, N)$, $V(\Phi)$, and $E(\Phi, \theta, N)$ depend on $\Phi$, $\theta$, and $N$. The optimal $\Phi$, $\theta$, and $N$ are obtained by minimizing the Generalized Cross Validation (GCV) criterion.
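
For reference, since the abstract does not reproduce the paper's GCV expression, a standard form of the GCV criterion for a linear smoother with hat matrix $Z(\Phi, \theta, N)$ is sketched below; the paper's exact definition may differ in normalization.

$$\mathrm{GCV}(\Phi, \theta, N) = \frac{n^{-1}\,\bigl\lVert \bigl(I - Z(\Phi, \theta, N)\bigr)\, y \bigr\rVert^{2}}{\Bigl[\, n^{-1}\, \mathrm{tr}\bigl(I - Z(\Phi, \theta, N)\bigr) \Bigr]^{2}}$$

Under this assumed form, the optimal $\Phi$, $\theta$, and $N$ are the values at which $\mathrm{GCV}(\Phi, \theta, N)$ attains its minimum over a search grid.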
