- Nvidia’s GPUs are in high-demand — and the company is using AI to accelerate chip production.
- The chip giant’s custom AI model, ChipNeMo, aims to speed up the chip design process.
Companies are vying for Nvidia’s limited supply of GPUs — used to train and build AI products — as the AI sector booms. Now, the chip giant is using its own AI to make its chips faster in what appears to be an effort to keep up with the demand.
Nvidia has developed an AI system known as ChipNeMo that aims to speed up the production of its GPUs.
Designing GPUs can be labor-intensive. A chip typically takes close to 1,000 people to build, and each person needs to understand how different parts of the design process work together, Bryan Catanzaro, Nvidia’s vice president of applied deep learning research, told The Wall Street Journal.
That’s where ChipNeMo can help. The AI system runs on a large language model, built on top of Meta’s Llama 2, that the company says it trained on its own data. As a result, ChipNeMo’s chatbot feature can respond to queries related to chip design, such as questions about GPU architecture and requests to generate chip design code, Catanzaro told the WSJ.
So far, the gains seem to be promising. Since ChipNeMo was unveiled last October, Nvidia has found that the AI system has been useful in training junior engineers to design chips and summarizing notes across 100 different teams, according to the Journal.
Nvidia didn’t immediately respond to Business Insider’s request for comment on whether ChipNeMo has led to speedier chip production.
Nvidia’s efforts to ramp up GPU production come as companies seek to get their hands on the company’s highly coveted chips to get ahead in the AI wars. Meta, which has rolled out AI products such as its large language model Llama 2 and its AI-powered Ray-Ban Smart Glasses in the last year, is on track to amass a total of 600,000 GPUs, including Nvidia’s A100s and other AI chips, by the end of 2024.
The quest to build the best AI products seems to bode well for Nvidia. The chip giant’s stock climbed 4% on Monday to a record high, and analysts from Goldman Sachs expect the gains to continue through the first half of 2025.
Nvidia isn’t the only organization trying to use AI to accelerate the design stage of semiconductors.
Last July, Google’s DeepMind built an AI system that the company said could speed up the design of the latest iteration of its custom chips, per the WSJ.
A few months later, Synopsys, a software giant, launched an AI tool designed to boost productivity among chip engineers.
Universities like New York University are also conducting research on how generative AI can be deployed to design chips more quickly.