Abstract

Data volumes keep growing, which makes big data one of the hot topics of the modern era of technology. Its biggest challenge, however, is security, and cryptography is one of the most reliable protection techniques. The proposed model uses cryptography to secure data via a new stream cipher technique that processes more than one block: the total block is divided into two parts, the parts are swapped and then recombined, the result is XORed with the key, and several further mathematical operations are applied. The cipher runs for fifteen rounds, which makes it very difficult for an attacker to guess the plaintext.
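The paper does not publish source code, so the following is only a minimal Java sketch of the split/swap/XOR round structure described above; the block length, the round-key schedule (roundKeys), and the unspecified extra mathematical operations are all assumptions:

    import java.util.Arrays;

    public class SwapXorCipherSketch {

        static final int ROUNDS = 15; // fifteen rounds, per the abstract

        // Encrypt one block: split it into two halves, swap them, recombine,
        // and XOR with a round key. The paper's additional mathematical
        // operations and its actual key schedule are not specified, so the
        // XOR step stands in for them here.
        static byte[] encryptBlock(byte[] block, byte[][] roundKeys) {
            byte[] state = Arrays.copyOf(block, block.length);
            int half = state.length / 2;
            for (int r = 0; r < ROUNDS; r++) {
                byte[] swapped = new byte[state.length];
                // Second half moves to the front, first half to the back.
                System.arraycopy(state, half, swapped, 0, state.length - half);
                System.arraycopy(state, 0, swapped, state.length - half, half);
                byte[] key = roundKeys[r % roundKeys.length];
                for (int i = 0; i < swapped.length; i++) {
                    swapped[i] ^= key[i % key.length];
                }
                state = swapped;
            }
            return state;
        }

        public static void main(String[] args) {
            byte[] block = "sixteen byte blk".getBytes();
            byte[][] keys = { "demo-round-key-0".getBytes() };
            System.out.println(Arrays.toString(encryptBlock(block, keys)));
        }
    }

In this sketch every round is invertible (XOR with the same round key, then swap the halves back), so decryption simply runs the rounds in reverse order.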

Highlights

  • Keywords: Big data, Encryption, Cryptography, Stream cipher, Security, Hadoop, MapReduce.

  • Hadoop is useful for efficient distributed processing of large data sets on a computing cluster; it reduces the cost of security by automatically splitting the input data into parts, running the program in parallel on those parts, and handling most of the related problems, such as consistency and fault tolerance, at once.

  • The proposed scheme is a blend of the AES encryption algorithm and the MapReduce parallel programming paradigm, a less demanding model that lets the Advanced Encryption Standard run in parallel and save time (see the AES sketch just below).
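The platform integration is not shown in this summary; as a rough illustration of the AES half of that blend, here is a single-block encryption using Java's standard javax.crypto API, independent of the paper's implementation, with a hard-coded demo key and ECB mode chosen only for brevity:

    import java.nio.charset.StandardCharsets;
    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    public class AesDemo {
        public static void main(String[] args) throws Exception {
            // 16-byte (128-bit) demo key; real use would derive or randomize it.
            SecretKeySpec key = new SecretKeySpec(
                    "0123456789abcdef".getBytes(StandardCharsets.UTF_8), "AES");
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding"); // demo only
            cipher.init(Cipher.ENCRYPT_MODE, key);
            byte[] ct = cipher.doFinal("one data record".getBytes(StandardCharsets.UTF_8));
            System.out.println(ct.length + " ciphertext bytes");
        }
    }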

Summary

INTRODUCTION

This is the era of digital information, in which data is incredibly crucial; the world depends on data. Hadoop, developed under an Apache License, is an open-source distributed processing framework for storing data and running applications on clusters of commodity hardware. It offers huge storage for all kinds of data, enormous processing power, and the ability to handle a virtually limitless number of tasks and jobs at the same time. It is a software framework well suited to huge, long-running jobs that cannot be handled within the reach of a single machine. It is useful for efficient distributed processing of large data sets on a computing cluster, and it reduces the cost of security by automatically splitting the input data into parts, running the program in parallel on those parts, and handling most of the related problems, such as consistency and fault tolerance, at once.
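As a sketch of how encryption rides on this model (this is not the paper's code), each Hadoop map task can encrypt the records of its own input split independently, so the framework parallelizes the work across the cluster; EncryptMapper and its encrypt helper below are hypothetical names, with a trivial XOR standing in for the real cipher:

    import java.io.IOException;
    import org.apache.hadoop.io.BytesWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Each mapper sees only its own input split, so encryption of the splits
    // runs in parallel while the framework handles scheduling and fault tolerance.
    public class EncryptMapper
            extends Mapper<LongWritable, Text, LongWritable, BytesWritable> {

        @Override
        protected void map(LongWritable offset, Text record, Context context)
                throws IOException, InterruptedException {
            context.write(offset, new BytesWritable(encrypt(record.copyBytes())));
        }

        // Hypothetical stand-in for the proposed stream cipher.
        private byte[] encrypt(byte[] plaintext) {
            byte[] out = new byte[plaintext.length];
            for (int i = 0; i < plaintext.length; i++) {
                out[i] = (byte) (plaintext[i] ^ 0x5A);
            }
            return out;
        }
    }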

RELATED WORK
THE PROPOSED METHOD
Algorithm
EXPERIMENTAL SETUP
PERFORMANCE ANALYSIS
Key space analysis
Statistical Analysis
EXECUTION TIME
CONCLUSION