Abstract

Modern computer architectures suffer from a lack of architectural innovation, mainly due to the power wall and the memory wall. That is, architectural innovations become infeasible because they can prohibitively increase power consumption, and their performance impact is ultimately bounded by slow memory accesses. To address these challenges, running computer systems at ultra-low temperatures (i.e., cryogenic computer systems) has emerged as a highly promising solution, as both power consumption and wire resistivity are expected to decrease significantly at such temperatures. However, cryogenic computers have not yet been realized because computer architects do not fully understand the behavior of existing computer systems, or their cost effectiveness, at such ultra-low temperatures. In this paper, we first develop CryoRAM, a validated computer architecture simulation tool that models cryogenic memory devices. We focus on 77K (easily reached with low-cost liquid nitrogen), a temperature at which modern CMOS devices still operate reliably, and we cool only the memory devices as a pilot study prior to building a full cryogenic computer. Next, driven by the modeling tool, we propose temperature-aware memory device and architecture designs that improve DRAM access speed by 3.8 times or reduce its power consumption to 9.2% of the baseline. Finally, we present three promising case studies in which cryogenic memories significantly improve (1) server performance (by up to 2.5 times), (2) server power (down to 6% of the baseline on average), and (3) datacenter power cost (by 8.4%). We will release our modeling and simulation tools, which are deliberately built by combining only open-source simulators, even though some experiments were conducted in industry-confidential environments.
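To give a rough sense of why lower wire resistivity translates into faster memory access, the sketch below (not part of CryoRAM or the paper's model) compares the RC wire delay of DRAM interconnect at 300K and 77K under the simplifying assumption that delay scales linearly with metal resistivity while geometry and capacitance stay fixed. The resistivity figures are approximate literature values for bulk copper and aluminum, not numbers taken from the paper.

```python
# Illustrative sketch only: RC wire-delay scaling when cooling interconnect
# from room temperature (300 K) to liquid-nitrogen temperature (77 K).
# Resistivity values are approximate literature figures for bulk metals;
# the paper's CryoRAM model captures far more device-level detail.

RESISTIVITY_UOHM_CM = {
    # metal: (approx. resistivity at 300 K, approx. resistivity at 77 K), in uOhm*cm
    "Cu": (1.7, 0.2),
    "Al": (2.7, 0.3),
}

def rc_delay_ratio(rho_300k: float, rho_77k: float) -> float:
    """Assume wire RC delay scales linearly with resistivity (capacitance and
    wire geometry held fixed), so the delay ratio equals the resistivity ratio."""
    return rho_77k / rho_300k

if __name__ == "__main__":
    for metal, (rho_300k, rho_77k) in RESISTIVITY_UOHM_CM.items():
        ratio = rc_delay_ratio(rho_300k, rho_77k)
        print(f"{metal}: wire RC delay at 77 K is ~{ratio:.2f}x its 300 K value "
              f"(~{1 / ratio:.1f}x faster, ignoring driver and cell-access effects)")
```

Under these assumptions the wire-delay component alone shrinks by roughly an order of magnitude, which is consistent in spirit with, though not equal to, the end-to-end 3.8x DRAM access-speed improvement reported in the abstract, since cell access time and peripheral logic do not scale the same way.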
