IOTA is an emerging decentralized computing paradigm for developing blockchain-based Internet of Things (IoT) applications. It offers zero transaction fees, incremental scalability, and high transaction throughput. Despite these well-understood benefits, IOTA nodes must bear considerable resource costs to maintain the distributed ledger, mainly because IOTA abandons the original blockchain reward mechanism and charges no transaction fees. In this paper, we take the first step toward addressing the cost optimization problem of IOTA by developing a new optimization scheme based on Lyapunov optimization theory. The proposed scheme minimizes the total cost of IOTA nodes through a computational optimization algorithm. We then design an optimized transaction rate control algorithm based on large deviation theory to reduce orphan tangles, which waste computational resources. In addition, we define and derive the effective width of the tangle to monitor the total throughput and reduce the time spent on cost optimization, avoiding unnecessary waste of resources. Finally, comprehensive theoretical analysis and simulation experiments demonstrate that the proposed strategy is both efficient and practical.
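To make the role of Lyapunov optimization concrete, the following is a minimal sketch of a generic drift-plus-penalty formulation of the kind such schemes typically build on; the abstract does not give the paper's actual model, so the symbols here (assumed resource-queue backlogs Q_i(t), per-slot node cost c(t), and control weight V) are illustrative assumptions only.

% Illustrative sketch, not the paper's exact formulation.
% Q_i(t): assumed backlog of resource queue i at an IOTA node; c(t): per-slot cost;
% V >= 0: tunable weight trading off cost minimization against queue stability.
\begin{align}
  L(t) &\triangleq \tfrac{1}{2}\sum_i Q_i(t)^2 & \text{(quadratic Lyapunov function)}\\
  \Delta(t) &\triangleq \mathbb{E}\big[L(t+1)-L(t)\,\big|\,\mathbf{Q}(t)\big] & \text{(conditional Lyapunov drift)}\\
  \min_{\text{control action}} \; & \Delta(t) + V\,\mathbb{E}\big[c(t)\,\big|\,\mathbf{Q}(t)\big] & \text{(drift-plus-penalty per slot)}
\end{align}

Under this generic form, greedily minimizing the drift-plus-penalty bound in each slot yields an average cost within O(1/V) of optimal while keeping the queues stable, which is the standard trade-off such a computational optimization algorithm would exploit.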