Google expands partnership with Anthropic to enhance AI safety

Google has announced an expansion of its partnership with Anthropic aimed at achieving the highest standards of AI safety.

The collaboration between Google and Anthropic dates back to Anthropic's founding in 2021. Since then, the two companies have worked closely together, with Anthropic building one of the largest Google Kubernetes Engine (GKE) clusters in the industry.

"Our longstanding partnership with Google is founded on a shared commitment to develop AI responsibly...

Dave Barnett, Cloudflare: Delivering speed and security in the AI era

AI News sat down with Dave Barnett, Head of SASE at Cloudflare, during Cyber Security & Cloud Expo Europe to delve into how the firm uses its cloud-native architecture to deliver speed and security in the AI era.

According to Barnett, Cloudflare’s cloud-native approach allows the company to continually innovate in the digital space. Notably, a significant portion of its services is offered to consumers for free.

“We continuously reinvent, we’re very...

MLPerf Inference v3.1 introduces new LLM and recommendation benchmarks

The latest release of MLPerf Inference introduces new LLM and recommendation benchmarks, marking a leap forward in the realm of AI testing.

The v3.1 iteration of the benchmark suite has seen record participation, with over 13,500 performance results and performance gains of up to 40 percent.

What sets this achievement apart is the diverse pool of 26 submitters and over 2,000 power results, demonstrating the broad spectrum of...

NVIDIA sets another AI inference record in MLPerf

NVIDIA has set yet another record for AI inference in MLPerf with its A100 Tensor Core GPUs.

MLPerf consists of five inference benchmarks covering three of today's main AI applications: image classification, object detection, and translation.
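
For a rough sense of what an image-classification inference benchmark measures, the sketch below times a ResNet-50 model in PyTorch and reports per-batch latency and throughput. It is an illustrative stand-in rather than the MLPerf harness itself, and the model, batch size, and iteration counts are arbitrary choices for the example.

```python
# Illustrative sketch only -- not the official MLPerf harness. It times a
# ResNet-50 image-classification model, the kind of workload MLPerf's
# image-classification benchmark measures. Assumes PyTorch and torchvision
# are installed.
import time

import torch
import torchvision

model = torchvision.models.resnet50(weights=None).eval()  # random weights are fine for timing
batch = torch.randn(8, 3, 224, 224)  # a dummy batch of 8 RGB images

with torch.no_grad():
    # Warm-up iterations so one-off setup costs don't skew the measurement.
    for _ in range(5):
        model(batch)

    iterations = 20
    start = time.perf_counter()
    for _ in range(iterations):
        model(batch)
    elapsed = time.perf_counter() - start

print(f"Latency per batch: {elapsed / iterations * 1000:.1f} ms")
print(f"Throughput: {iterations * batch.shape[0] / elapsed:.1f} images/sec")
```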

“Industry-standard MLPerf benchmarks provide relevant performance data on widely used AI networks and help make informed AI platform buying decisions,” said Rangan Majumder, VP of Search and AI at Microsoft.

Last...

NVIDIA’s AI-focused Ampere GPUs are now available in Google Cloud

Google Cloud users can now harness the power of NVIDIA’s Ampere GPUs for their AI workloads.

The specific GPU added to Google Cloud is the NVIDIA A100 Tensor Core GPU, which was announced just last month. NVIDIA says the A100 “has come to the cloud faster than any NVIDIA GPU in history.”
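
For readers wanting to try the new GPUs, the snippet below is a minimal PyTorch sketch, assuming a Google Cloud VM with an A100 attached and a CUDA-enabled PyTorch install, that checks which GPU is visible and switches on the Ampere-specific TF32 and mixed-precision paths. It is not an official Google Cloud or NVIDIA recipe.

```python
# Hedged sketch: confirm an A100 is visible to PyTorch and enable the
# Ampere features (TF32 maths, FP16 mixed precision) that drive much of
# the A100's advertised speed-up. Assumes CUDA-enabled PyTorch.
import torch

assert torch.cuda.is_available(), "No CUDA device visible to PyTorch"
print("GPU:", torch.cuda.get_device_name(0))  # expect something like 'NVIDIA A100-SXM4-40GB'

# TF32 lets Ampere Tensor Cores accelerate ordinary float32 matmuls/convolutions.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

x = torch.randn(4096, 4096, device="cuda")
w = torch.randn(4096, 4096, device="cuda")

# FP16 autocast engages the Tensor Cores further for supported ops.
with torch.autocast(device_type="cuda", dtype=torch.float16):
    y = x @ w
print(y.dtype)  # torch.float16 inside the autocast region
```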

NVIDIA claims the A100 boosts training and inference performance by up to 20x over its predecessors. Large AI models like BERT can be trained in just 37 minutes on a cluster of 1,024...