Abstract
As Artificial Intelligence technologies are widely adopted, there are major concerns about the privacy, security and compression of data. Bulk amounts of data are stored in the cloud and transmitted to parties that offer AI software or platform services. Three key features must be considered when transferring data from the cloud to Artificial Intelligence as a Service (AIaaS) or Machine Learning as a Service (MLaaS): data compression, data integrity and data confidentiality. The high demand for data processing drives the need for data compression: whether the workload involves Artificial Intelligence, Cloud Computing or Machine Learning algorithms, such huge volumes of data, whether text or multimedia, cannot be stored uncompressed. In this paper we use an optimized lossless compression algorithm. When bulk data is transferred from platform services to the cloud, the first step is categorization of the data, i.e. separating the critical data that requires integrity protection from the data that may be read by users on the network. The critical data that needs AI services must be checked to ensure it is transmitted as sent; data produced by the cloud user should reach the service-providing platforms without being modified. To maintain such integrity, hashing can be used. In this paper we propose a hashing algorithm that is applied after data compression. The generated hash value is then used as an attribute in generating keys for encryption.
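The compress-then-hash-then-derive pipeline described above can be sketched as follows. This is a minimal illustration only: it substitutes zlib for the paper's optimized lossless compressor, SHA-256 for the proposed hashing algorithm, and PBKDF2 for the key-generation scheme, since the abstract does not specify those algorithms.

```python
import zlib
import hashlib

def compress_and_hash(data: bytes) -> tuple[bytes, bytes]:
    """Losslessly compress the payload, then hash the compressed bytes
    so the receiver can verify integrity (stand-in algorithms)."""
    compressed = zlib.compress(data, level=9)     # lossless compression
    digest = hashlib.sha256(compressed).digest()  # integrity hash
    return compressed, digest

def derive_key(digest: bytes, secret: bytes) -> bytes:
    """Use the integrity hash as an attribute in key generation:
    here, as the salt to PBKDF2 over a shared secret (an assumption,
    not the paper's actual key-generation method)."""
    return hashlib.pbkdf2_hmac("sha256", secret, digest, 100_000, dklen=32)

payload = b"critical sensor data" * 100
compressed, digest = compress_and_hash(payload)
key = derive_key(digest, b"shared secret")

# Receiver side: recompute the hash to detect any modification in transit,
# then decompress to recover the original data.
assert hashlib.sha256(compressed).digest() == digest
assert zlib.decompress(compressed) == payload
```

Because the derived key depends on the hash of the transmitted data, any tampering with the compressed payload changes the digest and breaks both the integrity check and the key agreement.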