Abstract

Today’s deep neural networks (DNNs) achieve high accuracy when trained on large amounts of data. However, suitable training data might not be available or may require extensive collection effort. Data sharing is one option to address these issues, but it is generally impractical because of privacy concerns or the difficulty of reaching a sharing agreement. Instead, this work considers knowledge sharing: the weights of pretrained DNNs are first exchanged, and transfer learning (TL) is then applied. Specifically, it addresses the economics of knowledge sharing in AI services by taking a market-based approach. In detail, a model based on Fisher’s market is devised for optimal knowledge sharing, where knowledge is defined as the gain in inference accuracy from exchanging DNN weights. The proposed approach is shown to reach a market equilibrium and to satisfy important economic properties, including Pareto optimality. A weight-fusion technique is also introduced to merge the acquired knowledge with the existing one. Finally, an extensive evaluation is conducted in a distributed intelligence scenario. The results show that the proposed solution is efficient and that weight fusion with TL significantly increases inference accuracy compared to the original DNN, without the overhead of federated learning.
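The abstract does not specify the paper's actual weight-fusion rule. As an illustration only, a common baseline for merging two sets of pretrained weights is a per-layer convex combination; the function name `fuse_weights` and the mixing coefficient `alpha` below are assumptions, not the paper's method:

```python
# Illustrative sketch only: per-layer convex combination of two sets of
# pretrained DNN weights, represented as {layer_name: list_of_floats}.
# The abstract does not give the paper's fusion rule; `alpha` is a
# hypothetical mixing coefficient (1.0 keeps only the local weights).
def fuse_weights(local, acquired, alpha=0.5):
    """Merge two weight dictionaries layer by layer."""
    fused = {}
    for name, w_local in local.items():
        w_acquired = acquired[name]
        fused[name] = [alpha * a + (1 - alpha) * b
                       for a, b in zip(w_local, w_acquired)]
    return fused

# Example: two tiny "networks" with a single layer each.
w_existing = {"fc": [1.0, 2.0]}
w_received = {"fc": [3.0, 4.0]}
print(fuse_weights(w_existing, w_received, alpha=0.25))  # {'fc': [2.5, 3.5]}
```

In practice the fused weights would then be fine-tuned via transfer learning on local data, as the abstract describes.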
