Abstract

The Internet of Things (IoT) is one of the latest evolutions of the internet. Cloud computing is a key technology that meets the computational demands of widely distributed IoT devices and sensors by employing various machine learning models. Gradient descent methods are widely used in cloud computing to find the optimal coefficients of a machine learning model. Commonly, the data are distributed among multiple data owners, while the target function is held by the model owner. The model owner can train its model over the data owners' data and provide predictions. However, the confidentiality of the dataset or of the target function may not be preserved during computation, giving rise to security threats and privacy risks. To address these data and model privacy concerns, we present two new outsourced privacy-preserving gradient descent (OPPGD) schemes over data that is horizontally or vertically partitioned among multiple parties, respectively. Compared to previously proposed solutions, our methods are more comprehensive and apply to a more general setting. Both data privacy and model privacy are preserved throughout the learning and prediction procedures. In addition, a performance evaluation demonstrates that our schemes help the model owner optimize its target function and provide exact predictions with high efficiency and accuracy.
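For context, the sketch below shows the plain (non-private) gradient descent primitive the abstract refers to: iteratively updating model coefficients against a loss gradient. It assumes a linear model with mean-squared-error loss and is purely illustrative; it does not implement the paper's OPPGD protocols, and the names `gradient_descent`, `lr`, and `epochs` are placeholders chosen for this example.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Fit coefficients w minimizing the MSE loss (1/2n) * ||Xw - y||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n   # gradient of the loss w.r.t. w
        w -= lr * grad                  # step opposite the gradient
    return w

# Toy usage: recover the coefficients of a noiseless linear model.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = X @ np.array([2.0, -3.0])
print(gradient_descent(X, y, lr=0.1, epochs=5000))  # approximately [ 2. -3.]
```

In the partitioned settings the paper targets, the rows of `X` (horizontal partitioning) or its columns (vertical partitioning) would be split across multiple data owners, and the gradient computation would be carried out under the privacy-preserving protocol rather than in the clear as above.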
