
AI Forces 'Interconnects' Outside the Box

Synopsys says PCIe 7.0 will further compel data center designers to disaggregate heterogeneous resources.
(Image: Synopsys)


By Junko Yoshida

What’s at stake:
AI is disrupting the electronics industry. Increasingly diversified AI workloads are triggering seismic changes in the architecture of chips, boxes and data centers. Synopsys explains how AI is shortening PCIe spec cycles and discusses the role of next-generation interconnects in AI-driven data centers.

The tech industry understands AI’s voracious appetite for more data, computing power and memory, and is coping — sort of. What has not been discussed enough, however, is the “interconnects” that must now migrate from inside the box to outside it.

“In the world of interconnects, we are beginning to hit the laws of semiconductor physics,” said Manmeet Walia, executive director, mixed-signal PHY IP, Synopsys, in a recent interview with the Ojo-Yoshida Report. In contrast, “with compute, you can still go faster by leaning on Moore’s Law. Or, if you can’t go any faster, you can start parallelizing processing.”

Interconnect speed is now clearly trailing compute, according to Walia. Worse, doubling the bandwidth of an interconnect – from 64 to 128 gigabits per second, for example – does not just double complexity. It introduces “an exponential increase in complexity,” he noted.
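To put the quoted figures in context, here is a minimal Python sketch (not from the article) of how published per-lane transfer rates, and therefore raw x16 link bandwidth, have doubled with each recent PCIe generation; encoding and FLIT overhead are ignored for simplicity.

```python
# Illustrative sketch: per-lane PCIe data rates double each generation,
# so raw x16 link bandwidth roughly doubles as well.
# PCIe 6.0 and 7.0 use PAM4 signaling; earlier generations use NRZ.

PCIE_GT_PER_LANE = {
    "PCIe 4.0": 16,   # GT/s per lane, NRZ
    "PCIe 5.0": 32,   # GT/s per lane, NRZ
    "PCIe 6.0": 64,   # GT/s per lane, PAM4
    "PCIe 7.0": 128,  # GT/s per lane, PAM4
}

LANES = 16  # a typical x16 slot

for gen, gt_per_lane in PCIE_GT_PER_LANE.items():
    # Raw unidirectional bandwidth, ignoring encoding/FLIT overhead.
    raw_gbps = gt_per_lane * LANES   # Gb/s per direction
    raw_gBps = raw_gbps / 8          # GB/s per direction
    print(f"{gen}: {gt_per_lane} GT/s per lane -> "
          f"~{raw_gBps:.0f} GB/s per direction (x16, raw)")
```

The bandwidth itself scales linearly, doubling each generation; Walia’s point is that the signaling and design effort required to achieve each doubling grows far faster than that.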

