2012 Poster Sessions: Towards Energy-Proportional Memory with Mobile DRAMs

Student Name: Krishna Teja
Advisor: Christos Kozyrakis
Research Areas: Computer Systems
As we pursue energy-efficient processing, we must also pursue energy-efficient memory to avoid bottlenecks and system imbalances. The pursuit of efficiency can benefit from a range of emerging memory technologies and their implications for application performance and system efficiency. This work examines the role of mobile memory in high-performance systems. We evaluate an emerging technology, LP-DDR2, and propose an architecture that provides high-capacity memory systems while addressing technology-specific limitations, such as non-terminated links. Relative to server-class DDR3, this architecture reduces memory power by 5.6× with a 3 percent application performance penalty. Moreover, LP-DDR2 offers highly effective low-power modes and fast transitions between them. We perform a vertically integrated analysis beginning with technology parameters and ending with datacenter operating costs. In this analysis, we introduce a new metric, average memory access energy (AMAE), which suggests that efficient memory motivates smaller processor caches due to static energy costs. We also introduce TCO-neutral pricing, which quantifies the system benefits of a technology and identifies the price one might be willing to pay for them. These concepts are broadly applicable and invaluable for a holistic analysis of emerging technologies.
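A rough sketch of how the two metrics might compose follows. The functional forms and all numbers below are illustrative assumptions by analogy with average memory access time (AMAT), not the paper's exact formulations or data:

```python
# Illustrative sketch of the two metrics named in the abstract.
# All formulas and numbers are hypothetical, chosen only to show
# the trade-offs the abstract describes.

def amae(hit_energy_nj, miss_rate, mem_access_energy_nj, static_energy_nj):
    """Average memory access energy, by analogy with AMAT: energy spent
    in the cache on every access, plus memory access energy paid on
    misses, plus static (leakage) energy amortized per access."""
    return hit_energy_nj + miss_rate * mem_access_energy_nj + static_energy_nj

def tco_neutral_price(baseline_tco, new_opex):
    """Capital price at which a new technology's total cost of ownership
    equals the baseline's: the most one might be willing to pay for it."""
    return baseline_tco - new_opex

# A large cache has a lower miss rate but more leakage; when memory
# access energy is cheap (as with mobile DRAM), the smaller cache can
# win on AMAE, which is the effect the abstract points to.
big_cache = amae(hit_energy_nj=0.5, miss_rate=0.02,
                 mem_access_energy_nj=20.0, static_energy_nj=2.0)
small_cache = amae(hit_energy_nj=0.3, miss_rate=0.05,
                   mem_access_energy_nj=20.0, static_energy_nj=0.5)
print(small_cache < big_cache)  # prints True with these numbers
```

With these hypothetical numbers, the smaller cache's extra misses cost less energy than the larger cache's leakage, so its AMAE is lower.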

Krishna Malladi graduated from IIT Kanpur, India in 2009 with a Dual Degree (Bachelors and Masters) in Electrical Engineering, earning the Academic Proficiency Medal. Since fall 2009, he has been a PhD student at Stanford University under the guidance of Prof. Mark Horowitz and is supported by the Benchmark Capital Stanford Graduate Fellowship. His research focuses on developing energy-efficient server architectures and memories for datacenters. He interned at Rambus, Google, and Qualcomm in 2011, 2010, and 2008, respectively.