Abstract

Instead of the traditional (globally) adiabatic evolution algorithms for unstructured search proposed by Farhi or Van Dam, a high-efficiency search based on a nested local adiabatic evolution algorithm for structured search is introduced here into the quantum-like neurons of a Hopfield neural network: several local adiabatic quantum searches are performed and then nested together so that optimal or near-optimal solutions can be found efficiently. In particular, this approach is applied to the optimal training of support vector regression (SVR), so that the three free parameters of SVR are rapidly tuned toward an optimal regression, yielding a kind of adaptive support vector regression (ASVR). We therefore focus on structured adiabatic quantum search, nesting a partial search over a reduced set of variables inside a global search to solve the SVR optimization problem, which yields an average complexity of order N^α with α < 1, compared with the quadratic speedup of order √N that a naive Grover search offers over a classical search of order N. Finally, this technique is applied to regularize a designated hybrid prediction model, consisting of a BPNN-weighted Grey-C3LSP model and nonlinear autoregressive conditional heteroscedasticity, in experiments on non-periodic short-term forecasting of international stock price indices and typhoon moving paths.
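The scaling argument behind the abstract can be illustrated numerically. The sketch below, which is not the authors' implementation, integrates the standard local adiabatic schedule for an unstructured search over N items (evolution rate proportional to the square of the instantaneous gap) and compares the resulting evolution time with the constant-rate global schedule; the former grows roughly as π√N/(2ε) while the latter grows as N/ε. This √N building block is what the nested, structured search described above is meant to improve further, toward N^α with α < 1. The value of ε and the integration routine are illustrative assumptions.

```python
import numpy as np

def local_adiabatic_time(N, eps=0.1, steps=100_000):
    """Integrate the local adiabatic schedule ds/dt = eps * g(s)^2 for an
    unstructured search over N items, where the squared gap is
    g(s)^2 = 1 - 4*(1 - 1/N)*s*(1 - s).  Returns the total evolution time
    T = integral_0^1 ds / (eps * g(s)^2)."""
    s = np.linspace(0.0, 1.0, steps)
    gap_sq = 1.0 - 4.0 * (1.0 - 1.0 / N) * s * (1.0 - s)
    return np.trapz(1.0 / (eps * gap_sq), s)

if __name__ == "__main__":
    eps = 0.1
    for N in (10**2, 10**4, 10**6):
        T_local = local_adiabatic_time(N, eps)
        # A global (constant-rate) schedule is limited by the minimum gap
        # g_min = 1/sqrt(N), giving T ~ 1/(eps * g_min^2) = N/eps.
        T_global = N / eps
        print(f"N={N:>8}  local T ~ {T_local:10.1f} "
              f"(pi*sqrt(N)/(2*eps) = {np.pi*np.sqrt(N)/(2*eps):10.1f})  "
              f"global T ~ {T_global:12.1f}")
```

In the SVR application, the search space being swept is the grid of the three free parameters (commonly the penalty C, the tube width ε, and a kernel width), so the same complexity comparison governs how quickly a near-optimal parameter triple can be located.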
