David Patterson: Computer Architecture and RISC-V
The Evolution of Computer Architecture
David Patterson discusses the fifty-year transformation of computing hardware, driven primarily by Moore's Law. He highlights that as transistors became smaller and cheaper, the industry moved from massive room-filling machines to the ubiquitous microprocessors found in every modern device.
The RISC vs. CISC Debate
Patterson explains his pioneering work in RISC (Reduced Instruction Set Computer) architecture, which challenged the prevailing CISC (Complex Instruction Set Computer) models of the 1980s.
• RISC focuses on simple, fast instructions that are easier for compilers to optimize.
• CISC relied on complex instructions that proved harder for hardware to implement efficiently and for compilers to exploit.
• The RISC approach is now the industry standard, as it aligns better with the need for energy efficiency and fast execution.
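The trade-off in the bullets above can be sketched with a toy register machine (a hypothetical illustration, not any real ISA): a single CISC-style memory-to-memory add does the same work as a RISC-style load/load/add/store sequence, but the RISC sequence exposes each simple step to the compiler and the pipeline.

```python
# Toy illustration (hypothetical, not a real ISA): one CISC-style
# memory-to-memory add vs. the equivalent RISC-style sequence of
# simple instructions. Memory and registers are modeled as dicts.

def cisc_add(mem, dst, src1, src2):
    """One complex instruction: mem[dst] = mem[src1] + mem[src2]."""
    mem[dst] = mem[src1] + mem[src2]

def risc_add(mem, regs, dst, src1, src2):
    """Four simple instructions: two loads, an add, a store."""
    regs[1] = mem[src1]          # lw  x1, src1
    regs[2] = mem[src2]          # lw  x2, src2
    regs[3] = regs[1] + regs[2]  # add x3, x1, x2
    mem[dst] = regs[3]           # sw  x3, dst

mem_a = {0: 7, 1: 5, 2: 0}
mem_b = dict(mem_a)
cisc_add(mem_a, 2, 0, 1)
risc_add(mem_b, {}, 2, 0, 1)
print(mem_a[2], mem_b[2])  # both compute 12
```

Both routes produce the same result; the difference is that each RISC step is simple enough to execute quickly and to be scheduled and optimized by a compiler.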
Modern Trends: RISC-V and Open Source Hardware
Patterson discusses the creation of RISC-V, an open-source instruction set architecture that democratizes hardware design.
"Instead of before you had to stop at the hardware, you can now start going, layer by layer below that and see what's inside there."
• Open access: RISC-V allows researchers and companies to build processors without restrictive, proprietary barriers.
• Educational value: Its simplicity allows students to understand computers from the core logic up to the software layer.
Future Challenges and Opportunities
With Moore's Law slowing down, Patterson argues that the field is entering a "new golden age" characterized by domain-specific acceleration.
• Machine Learning: Performance gains are now being realized through accelerators tailored for matrix multiplication, essential for neural networks and AI.
• MLPerf: He stresses the importance of standardized benchmarking (like MLPerf) to ensure fair competition and genuine innovation, moving away from marketing-based hype.
• Quantum Computing: Patterson cautions that despite the excitement, practical quantum computing remains a long-term goal for 2030 and beyond, rather than an immediate general-purpose solution.
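The matrix multiplication that ML accelerators specialize in is, at its core, the triple loop below (a minimal dependency-free Python sketch; real accelerators execute this in parallel fixed-function hardware):

```python
# The kernel that domain-specific ML accelerators target:
# C = A x B, a triple loop over rows, columns, and the shared dimension.

def matmul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):          # each row of A
        for j in range(m):      # each column of B
            for p in range(k):  # dot product over the shared dimension
                C[i][j] += A[i][p] * B[p][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19.0, 22.0], [43.0, 50.0]]
```

Because neural-network inference and training spend most of their time in exactly this kernel, hardware that performs many of these multiply-accumulate steps per cycle yields the performance gains Patterson describes.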