
Nvidia unveils new GH200 Grace Hopper ‘Superchip’ for generative AI models like ChatGPT & Bard

Nvidia introduces the GH200 Grace Hopper platform for generative AI models, saying it could bring down the cost of running AI models.

New Delhi, UPDATED: Aug 9, 2023 13:38 IST

Highlights

  • Nvidia announced a new configuration to speed generative AI applications
  • The Grace Hopper microprocessor might lower the cost of using ChatGPT and other LLMs
  • The new iteration of the Nvidia GH200 Grace Hopper platform is based on the first HBM3e processor

Advanced and fast GPUs are needed to run generative AI models like ChatGPT and Google Bard, and Nvidia is considered to be at the forefront of developing chipsets for them. On Tuesday, the California-based chipmaker introduced the next generation of the Nvidia GH200 Grace Hopper platform to speed up generative AI applications.


The new Nvidia GH200 Grace Hopper platform is built around a new Grace Hopper Superchip, based on the first HBM3e processor designed specifically for generative AI.

Grace Hopper said to be more energy-efficient

Additionally, the company asserted that the Grace Hopper is more energy-efficient than earlier chips, which can mean lower operating costs for data centres that use it. This is crucial, as LLMs are gaining popularity while their high operating costs are keeping many potential users away.

Simply put, OpenAI's ChatGPT is a large language model that can produce human-quality text. Customer service, content development, and research are just a few of its uses. The Grace Hopper Superchip might lower the cost of using ChatGPT and other LLMs for companies, which could encourage greater adoption of these technologies. 

According to the company's press release, the dual configuration consists of a single server with 144 Arm Neoverse cores, eight petaflops of AI performance, and 282GB of the latest HBM3e memory technology. This configuration provides up to 3.5x more memory capacity and 3x more bandwidth than the current-generation offering.


New Nvidia chipset to be launched next year

Jensen Huang, founder and CEO of Nvidia, stated, "Data centers require accelerated computing platforms with specialised needs to meet surging demand for generative AI."

"The new GH200 Grace Hopper Superchip platform delivers this with exceptional memory technology and bandwidth to improve throughput, the ability to connect GPUs to aggregate performance without compromise, and a server design that can be easily deployed across the entire data center," he added.

According to Ian Buck, general manager and vice president of accelerated computing at Nvidia, the new configuration, known as the GH200, will be released in the second quarter of next year. Moreover, Nvidia intends to market two products: a version that includes two chips that customers can integrate into their own systems, and a complete server system that combines two Grace Hopper designs. 

Published on: Aug 9, 2023 13:38 IST
Posted by: Nidhi Bhardwaj, Aug 9, 2023 13:38 IST
