AI Chip Slashes Data Center Power Use

In a remarkable advance, researchers at Oregon State University have created a new computer chip that uses artificial intelligence to cut in half the energy needed to move the data behind large language models, the systems that power chatbots and digital assistants. This breakthrough addresses a major problem: as AI grows, so does the electricity demand of the powerful data centers where it runs.

Understanding the Challenge

Large language models such as GPT-4 and Gemini operate in massive data centers. These centers rely on heavy streams of information passing along thousands of copper wires between servers. As these systems exchange more and more data at faster speeds, their need for electricity climbs sharply. Simply put, the faster the data moves, the more power is used, because reductions in the energy spent per bit have not kept pace with the growth in data rates. This imbalance makes AI a growing source of electrical demand worldwide.
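To see why, consider a rough back-of-the-envelope calculation. The Python sketch below illustrates the relationship (the link speeds and energy-per-bit figures are illustrative assumptions, not numbers from the research): a link's power draw is its data rate multiplied by the energy spent per bit, so if data rates quadruple while energy per bit only halves, each link still draws twice the power.

```python
# Back-of-the-envelope sketch of why link power climbs. The numbers are
# illustrative assumptions for demonstration, not measured figures.

def link_power_watts(data_rate_gbps: float, energy_per_bit_pj: float) -> float:
    """Power drawn by one link: bits per second times joules per bit."""
    bits_per_second = data_rate_gbps * 1e9
    joules_per_bit = energy_per_bit_pj * 1e-12
    return bits_per_second * joules_per_bit

# Suppose data rates quadruple while energy per bit only halves.
old_power = link_power_watts(data_rate_gbps=25, energy_per_bit_pj=4.0)
new_power = link_power_watts(data_rate_gbps=100, energy_per_bit_pj=2.0)

print(f"25 Gb/s at 4 pJ/bit:  {old_power:.2f} W per link")   # 0.10 W
print(f"100 Gb/s at 2 pJ/bit: {new_power:.2f} W per link")   # 0.20 W
# Per-link power doubles; multiplied across thousands of wires in a data
# center, the communication power budget grows sharply.
```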

A New Kind of Chip

The Oregon State team designed a chip that approaches the problem differently. At its core is a tiny AI-powered classifier that sits right on the chip itself. When data moves at high speed between servers, errors and noise can creep in. Traditional communication chips clean up this data using equalizers, circuits that consume a great deal of energy in the process. The new on-chip classifier instead recognizes and corrects these errors directly, doing the same job with much less energy.
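The article does not disclose the chip's actual algorithm, so the following Python sketch is only a conceptual illustration of the underlying idea: a receiver that classifies each noisy sample back to the nearest of four nominal signal levels. The PAM4 modulation, the noise model, and the nearest-level decision rule are all assumptions made for this toy example; a real high-speed receiver must also undo inter-symbol interference, which is the job the power-hungry equalizers mentioned above perform.

```python
# Conceptual sketch only: the chip's real classifier is not described in
# the article. PAM4 levels, noise model, and decision rule are assumptions.
import random

PAM4_LEVELS = [-3.0, -1.0, 1.0, 3.0]  # four nominal signal amplitudes

def transmit(symbols, noise=0.4):
    """Simulate a noisy wire: each level arrives smeared by Gaussian noise."""
    return [s + random.gauss(0, noise) for s in symbols]

def classify(sample):
    """Decide which level a noisy sample was sent as: pick the nearest one."""
    return min(PAM4_LEVELS, key=lambda level: abs(sample - level))

random.seed(0)
sent = [random.choice(PAM4_LEVELS) for _ in range(10_000)]
received = transmit(sent)
recovered = [classify(r) for r in received]

errors = sum(s != r for s, r in zip(sent, recovered))
print(f"symbol errors: {errors} / {len(sent)}")
```

The chip's reported advantage is that its classifier makes decisions like these at roughly half the power that conventional equalizer circuits require.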

Because it is both smart and efficient, the chip can maintain high-quality communication even as data speeds rise, while consuming only half the usual power. This innovation could transform the way information flows inside data centers, especially those serving the ever-growing needs of AI.

Recognition and Support

Work of this importance does not go unnoticed. The project has attracted support from some of the world’s top organizations, including the Defense Advanced Research Projects Agency (DARPA), the Semiconductor Research Corporation, and the Center for Ubiquitous Connectivity. When these organizations take interest, it signals the potential for meaningful change in the technology landscape.

The scientific community has also recognized the breakthrough. Ramin Javadi, a doctoral student and leading member of the research team, received the Best Student Paper Award for this project at the IEEE Custom Integrated Circuits Conference in Boston. This honor highlights how the chip not only solves a practical energy challenge but also advances academic understanding of energy-efficient communication.

Looking Ahead

The work is only just beginning. Led by Ramin Javadi and Professor Tejasvi Anand, the research group is already developing the next version of the chip. Their goal: even greater reductions in energy use. If successful, these future chips could bring new levels of efficiency and sustainability to AI and the broader field of high-speed computing.

Why This Matters

AI systems are changing the way we work, learn, and communicate, but their environmental toll is growing. Each step toward greater energy efficiency helps make these technologies more sustainable. This new chip from Oregon State University points the way to data centers that are not only powerful but also much gentler on our planet.

Cutting the electricity needed for AI does more than save money and lower emissions. It ensures that as AI’s capabilities grow, so too does our ability to use it responsibly. Smaller carbon footprints and smarter chips together create a pathway to a future where advanced computing serves everyone without costing the Earth.