Abstract

Multiple related tasks can be learned simultaneously by sharing information among tasks, avoiding tabula rasa learning and improving performance over the no-transfer case (i.e., when each task is learned in isolation). This study investigates multitask learning with conditional neural process (CNP) networks and proposes two CNP-based multitask learning network models, namely, the one-to-many multitask CNP (OMc-MTCNP) and the many-to-many MTCNP (MMc-MTCNP). Compared with existing multitask models, the proposed models add an extensible correlation learning layer that learns the correlations among tasks. Moreover, the proposed multitask CNP (MTCNP) networks are used as surrogate models within a Bayesian optimization framework, replacing the Gaussian process (GP) and thereby avoiding expensive covariance computations. The proposed framework infers multiple tasks simultaneously, exploiting possible dependencies among them to share knowledge across tasks. The proposed surrogate models augment the observed dataset with data from related tasks so that model parameters can be estimated confidently. Experimental studies under several scenarios indicate that the proposed algorithms are competitive with GP-based, single-task, and other multitask-model-based Bayesian optimization methods.
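
To make the surrogate-model idea concrete, the sketch below shows a minimal single-task CNP used in place of a GP inside a Bayesian optimization loop. It is an illustrative assumption, not the authors' implementation: the network sizes, the toy objective, and the UCB acquisition rule are chosen for demonstration only, and the paper's MTCNP variants would additionally include the correlation learning layer that couples tasks.

```python
# Minimal, illustrative sketch (assumed architecture, not the paper's code):
# a CNP predicts a Gaussian at target points from a context set, and its
# predictive mean/uncertainty drive a UCB-style Bayesian optimization loop.
import torch
import torch.nn as nn

class CNPSurrogate(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, hidden=64, repr_dim=64):
        super().__init__()
        # Encoder maps each observed (x, y) pair to a representation vector.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, repr_dim))
        # Decoder maps (aggregated representation, target x) to Gaussian parameters.
        self.decoder = nn.Sequential(
            nn.Linear(repr_dim + x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * y_dim))

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Permutation-invariant aggregation of the context set (mean pooling).
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=0)
        r = r.expand(x_tgt.shape[0], -1)
        out = self.decoder(torch.cat([r, x_tgt], dim=-1))
        mu, log_sigma = out.chunk(2, dim=-1)
        sigma = 0.01 + torch.nn.functional.softplus(log_sigma)  # keep scale positive
        return mu, sigma

def objective(x):
    # Toy objective standing in for an expensive black-box function.
    return torch.sin(3.0 * x) + 0.1 * torch.randn_like(x)

torch.manual_seed(0)
x_obs = torch.rand(5, 1) * 2 - 1
y_obs = objective(x_obs)
candidates = torch.linspace(-1, 1, 200).unsqueeze(-1)
model = CNPSurrogate()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(10):                       # Bayesian optimization iterations
    for _ in range(200):                     # refit the surrogate on observations
        mu, sigma = model(x_obs, y_obs, x_obs)
        loss = -torch.distributions.Normal(mu, sigma).log_prob(y_obs).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():
        mu, sigma = model(x_obs, y_obs, candidates)
        ucb = mu + 2.0 * sigma               # explore via predictive uncertainty
        x_next = candidates[ucb.argmax()].unsqueeze(0)
    y_next = objective(x_next)
    x_obs = torch.cat([x_obs, x_next]); y_obs = torch.cat([y_obs, y_next])
    print(f"iter {step}: best y = {y_obs.max().item():.3f}")
```

In a multitask setting, the same loop would maintain observations for several related objectives and let the (hypothetical) correlation layer share information across their context sets before decoding.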
