Abstract

With the rapid growth of very-large-scale integration technology following Moore's law, the integration density of chips has reached billions of transistors. Transistor scaling has entered the deep-submicron regime, where classical device physics no longer holds. As a result, conventional computing technologies are approaching their physical limits, and Moore's law has slowed. At such small feature sizes, leakage current also becomes a major problem, heating up the chip. Consequently, Dennard scaling, which predicted constant power density as transistors shrink, has broken down as well. This drove the shift to multi-core architectures, but that approach too appears to be nearing its end because of the dark silicon problem. As a plausible alternative, researchers are exploring non-Complementary Metal-Oxide-Semiconductor (CMOS) technologies, e.g., quantum computing and bio-inspired computing based on deoxyribonucleic acid (DNA). Besides mitigating the concerns faced by conventional technology, DNA computing offers further benefits that suit the needs of future-generation computing, viz. massively parallel operation and far higher information density than silicon. In this chapter, after an introduction to the structure of DNA and how DNA computing works, several fields of DNA computing are explored, followed by a discussion of how DNA computing can be applied to problems otherwise known to be hard on conventional computers, with some comments on possible future research directions in this promising field.
