Nvidia has launched NVLink Fusion, a new version of its NVLink interconnect: silicon that enables industries to build semi-custom AI infrastructure.

NVLink is a high-speed GPU interconnect developed by Nvidia that allows for faster and more efficient data transfer between multiple GPUs. The company says it provides a direct, point-to-point connection with much higher bandwidth and lower latency than traditional PCIe-based solutions, enabling what it describes as better performance and scalability for high-performance computing (HPC) and AI workloads. 

On Monday, the chipmaker’s chief executive, Jensen Huang, unveiled NVLink Fusion and confirmed the firm’s plans to sell the technology to other chip designers. The aim is to help them build powerful custom AI systems in which multiple chips are connected to one another, accelerating the chip-to-chip communication needed to build and deploy AI tools.

Huang announced the news at the Taipei Music Centre, home to the Computex AI trade show, which runs from 20 to 23 May.

Nvidia introduced NVLink in 2014. The technology is currently used to exchange huge amounts of data between chips, such as in the company’s GB200, which combines two Blackwell graphics processing units with a Grace processor.

MediaTek, Marvell, Alchip Technologies, Astera Labs, Synopsys and Cadence are among the first companies confirmed to adopt NVLink Fusion, enabling custom chips to be scaled out to meet the demands of model training and agentic AI inference workloads.

“A tectonic shift is underway: for the first time in decades, data centres must be fundamentally rearchitected — AI is being fused into every computing platform,” said Huang.

He added: “NVLink Fusion opens Nvidia’s AI platform and rich ecosystem for partners to build specialised AI infrastructures.”

Matt Murphy, chairman and chief executive of Marvell, said the partnership aims to redefine what’s possible for AI factory integration.

Murphy said: “Marvell custom silicon with NVLink Fusion gives customers a flexible, high-performance foundation to build advanced AI infrastructure — delivering the bandwidth, reliability and agility required for the next generation of trillion-parameter AI models.”

During the event, the company also announced plans to build a headquarters in Taiwan, on the northern outskirts of Taipei.

At the event, Huang shared his vision of a technological revolution that will impact every country, every industry and every company in the world.

“AI is now infrastructure, and this infrastructure, just like the internet, just like electricity, needs factories,” Huang said. “These factories are essentially what we build today.”

Speaking about the future of the company’s data centres, he said they will not resemble those society is accustomed to.

“These AI data centres, if you will, are improperly described,” continued the chief executive. “They are, in fact, AI factories.

“You apply energy to it, and it produces something incredibly valuable, and these things are called tokens.”
