Yale University
abhishek at cs.yale.edu
I am an Associate Professor at Yale CS and a member of the Wu Tsai Institute for the brain sciences. I study computer architectures, systems software, and their efficient layering. Before Yale CS, I was at Rutgers CS.
My students and I are experts on virtual memory. AMD uses our coalesced TLBs in its flagship Zen architecture. Between 2017 and the end of 2019, AMD shipped over 260 million Zen cores with coalesced TLBs; by the end of 2022, it had shipped an estimated one billion. Starting with the Ampere architecture in mid-2020, NVIDIA has shipped an estimated 50 million GPUs with TLB optimizations that support extreme translation contiguity. Starting with the 4.14 kernel in late 2017, an estimated two billion Linux OSes use our code to migrate 2MB pages. This work, and more, is summarized in my textbook and in my appendix to the classic Hennessy & Patterson textbook on computer architecture.
More recently, we have also been building computer systems that help treat neurological disorders, augment the healthy brain, and shed light on brain function. In our HALO project, we have architected, and are taping out, low-power and flexible chips for brain-computer interfaces.
Our research has been recognized with four IEEE Micro Top Picks awards and two honorable mentions, an NSF CAREER award, the Chancellor's Award for Faculty Excellence in Research at Rutgers, a visiting CV Starr Fellowship at Princeton's Neuroscience Institute, and more. My teaching and mentoring have been recognized with the Yale SEAS Ackerman Award.
Appendix L in "Computer Architecture: A Quantitative Approach" by Hennessy and Patterson