NVIDIA unveils Blackwell architecture to power next GenAI wave 

NVIDIA has announced its next-generation Blackwell GPU architecture, designed to usher in a new era of accelerated computing and enable organisations to build and run real-time generative AI on trillion-parameter large language models.

The Blackwell platform promises up to 25 times lower cost and energy consumption than its predecessor, the Hopper architecture. Named after pioneering mathematician and statistician David Harold Blackwell, the new GPU architecture introduces...

NVIDIA DGX Station A100 is an ‘AI data-centre-in-a-box’

NVIDIA has unveiled its DGX Station A100, an “AI data-centre-in-a-box” powered by four of the company’s record-setting A100 GPUs with up to 80GB of memory each.

The A100 Tensor Core GPU set new MLPerf benchmark records last month—outperforming CPUs by up to 237x in data centre inference. In November, Amazon Web Services made eight A100 GPUs available in each of its P4d instances.
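For readers curious what spinning up those cloud GPUs looks like in practice, here is a minimal sketch using the AWS boto3 SDK. The AMI ID, key pair, and region are placeholders rather than details from the article, and the account would need P4d capacity granted.

```python
# Minimal sketch: requesting an 8x A100 P4d instance from AWS with boto3.
# The AMI ID, key pair name and region below are placeholders (assumptions),
# not values taken from the article.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: substitute a Deep Learning AMI
    InstanceType="p4d.24xlarge",      # the P4d size with eight A100 GPUs
    KeyName="my-key-pair",            # placeholder key pair
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched P4d instance: {instance_id}")
```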

For those who prefer their hardware local, the DGX Station A100 is available in either four 80GB A100...

NVIDIA chucks its MLPerf-leading A100 GPU into Amazon’s cloud

NVIDIA’s A100 set a new record in the MLPerf benchmark last month and now it’s accessible through Amazon’s cloud.

Amazon Web Services (AWS) first launched a GPU instance 10 years ago with the NVIDIA M2050. It’s rather poetic that, a decade on, NVIDIA is now providing AWS with the hardware to power the next generation of groundbreaking innovations.

The A100 outperformed CPUs in this year’s MLPerf by up to 237x in data centre inference. A single NVIDIA DGX A100...

NVIDIA sets another AI inference record in MLPerf

NVIDIA has set yet another record for AI inference in MLPerf with its A100 Tensor Core GPUs.

MLPerf consists of five inference benchmarks, covering three of today’s main AI applications: image classification, object detection, and translation.
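To give a rough feel for the image-classification workload that MLPerf measures, the sketch below times a GPU forward pass through a pretrained ResNet-50 in PyTorch. It is an illustrative stand-in under assumed settings (model choice, batch size of 32, dummy input), not the official MLPerf harness.

```python
# Illustrative GPU image-classification inference, in the spirit of the MLPerf
# task; this is a toy timing sketch, not the official MLPerf harness.
import time

import torch
from torchvision.models import ResNet50_Weights, resnet50

device = "cuda" if torch.cuda.is_available() else "cpu"
model = resnet50(weights=ResNet50_Weights.DEFAULT).eval().to(device)

batch = torch.randn(32, 3, 224, 224, device=device)  # dummy batch of 32 "images"

with torch.no_grad():
    model(batch)  # warm-up pass so the timed run excludes one-off CUDA setup
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    logits = model(batch)
    if device == "cuda":
        torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

print(f"Forward pass over {batch.shape[0]} images took {elapsed * 1000:.1f} ms on {device}")
```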

“Industry-standard MLPerf benchmarks provide relevant performance data on widely used AI networks and help make informed AI platform buying decisions,” said Rangan Majumder, VP of Search and AI at Microsoft.

Last...

NVIDIA’s AI-focused Ampere GPUs are now available in Google Cloud

Google Cloud users can now harness the power of NVIDIA’s Ampere GPUs for their AI workloads.

The specific GPU added to Google Cloud is the A100 Tensor Core, which NVIDIA announced just last month. NVIDIA says the A100 “has come to the cloud faster than any NVIDIA GPU in history.”

NVIDIA claims the A100 boosts training and inference performance by up to 20x over its predecessors. Large AI models like BERT can be trained in just 37 minutes on a cluster of 1,024...
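As a loose illustration of how such training is spread across many GPUs, the sketch below wraps a toy model in PyTorch DistributedDataParallel, the same basic pattern used at much larger scale on clusters like the one described above. The model, data, and hyperparameters are stand-ins, not the actual BERT recipe behind those figures.

```python
# Minimal sketch of multi-GPU data-parallel training with PyTorch DDP.
# Launch with, for example:  torchrun --nproc_per_node=4 train_sketch.py
# The model, data and hyperparameters are placeholders, not the BERT recipe
# behind the results quoted above.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group(backend="nccl")      # one process per GPU
    local_rank = int(os.environ["LOCAL_RANK"])   # set by torchrun
    torch.cuda.set_device(local_rank)
    device = f"cuda:{local_rank}"

    model = torch.nn.Linear(1024, 2).to(device)  # toy stand-in for a real model
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss_fn = torch.nn.CrossEntropyLoss()

    for _ in range(10):                          # toy training loop
        inputs = torch.randn(32, 1024, device=device)
        labels = torch.randint(0, 2, (32,), device=device)
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()                          # gradients are all-reduced across GPUs
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```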

NVIDIA explains how ‘true adoption’ of AI is making an impact

NVIDIA Senior Director of Enterprise David Hogan spoke at this year’s AI Expo about how the company is seeing artificial intelligence adoption make an impact.

In the keynote session, titled ‘What is the true adoption of AI’, Hogan provided real-world examples of how the technology, enabled by NVIDIA’s GPUs, is being put to use. But first, he highlighted the momentum we’re seeing in AI.

“Many governments have announced investments in AI and how they're going...