US Technological Dominance Is Not What It Used to Be
With everyone so mesmerized by silver-tongued AI chatbots, it’s easy to forget that most flashy breakthroughs in science and technology depend on much less glamorous advances in the fundamentals of computing—new algorithms, different computer architectures, and novel silicon chips.
The US has largely dominated these areas of innovation since the early days of computing. But a new report from academics who study advances in computer science finds that, by many measures, the US lead in advanced computing has declined significantly over the past five years—especially when measured against China.
It’s well established that America no longer manufactures many of the world’s most advanced computer chips—a process that involves carving insanely intricate patterns into silicon with devilishly difficult techniques. Apple and many other companies instead outsource that work to TSMC in Taiwan or Samsung in South Korea. This is why the US government created the CHIPS Act—a $52 billion package aimed at revitalizing domestic chip-making and related technologies.
The report—from MIT; the Council on Competitiveness, a think tank; and Silicon Catalyst, an investment firm—shows that America’s share of the world’s most powerful supercomputers has also fallen a lot over the past five years.
And while the US has traditionally dominated the development of new computer algorithms, some measures of algorithmic innovation—such as the Gordon Bell Prize, awarded for outstanding achievement in high-performance computing—indicate the US has lost its edge to China. The report sums up the overall trend in its pointed title: “America’s lead in advanced computing is almost gone.”
The findings are, in one sense, hardly surprising. China has made big economic advances in recent decades that have boosted its universities and tech industry while also making the country a linchpin of manufacturing for many US businesses.
But they also carry a message about the future that US policymakers would do well to heed: advances in computing will be crucial to progress in critical areas like energy, climate science, and medicine, because they make it possible to model incredibly complex phenomena.
Neil Thompson, an MIT researcher involved with the report, explains that modern AI systems such as ChatGPT and art-generating algorithms are built on advances in a particular type of computer chip—the graphics processing unit (GPU). GPUs were originally invented to perform the operations required to render video game graphics, but they proved well suited to the calculations used in an AI technique called deep learning.
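To make that connection concrete, here is a minimal, purely illustrative sketch (not taken from the report): rotating the vertices of a 3D scene and evaluating a neural-network layer both boil down to the same kind of large matrix multiplication, which is exactly the parallel arithmetic GPUs were built to accelerate. The array sizes and layer width below are arbitrary choices for illustration.

```python
# Illustrative only: the same dense linear algebra underlies both 3D graphics
# and deep learning, which is why chips designed for one accelerate the other.
import numpy as np

# Graphics: rotate a batch of 3D vertices with a rotation matrix.
theta = np.pi / 4
rotation = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
vertices = np.random.rand(10_000, 3)           # 10,000 points in a scene
rotated = vertices @ rotation.T                # one big matrix multiplication

# Deep learning: a fully connected layer is the same kind of operation.
inputs = np.random.rand(10_000, 3)             # a batch of 3-feature inputs
weights = np.random.rand(3, 64)                # learned weights, 64 units
activations = np.maximum(inputs @ weights, 0)  # matrix multiply + ReLU

print(rotated.shape, activations.shape)        # (10000, 3) (10000, 64)
```

In practice, deep-learning frameworks dispatch these same multiplications to GPU hardware, which is why progress in AI has tracked progress in that one class of chip so closely.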