AI workloads
Theregister
5 months ago
Artificial intelligence

Memory prices to rise next year, Gartner forecasts

The semiconductor market is expected to return to growth in 2024, driven by increasing demand for AI workloads and memory components.
Gartner estimates that global semiconductor revenues will rise 16.8% in 2024, following a contraction in sales in 2023. [ more ]
Theregister
5 months ago
DevOps

Making sense of Nvidia's SuperNIC

Nvidia has introduced a new networking accelerator called SuperNIC, designed to boost AI workloads in Ethernet-based networks.
SuperNIC offers features such as high-speed packet reordering, advanced congestion control, programmable I/O pathing, and integration with Nvidia's hardware and software portfolio.
SuperNIC is not a rebrand of Nvidia's previous DPU, but a separate product designed to work with Nvidia's Spectrum-X offering. [ more ]
TechRepublic
5 months ago
Artificial intelligence

Microsoft Announces New Maia 100 and Cobalt 100 Chips

Microsoft will release two custom chips next year: the Maia 100, designed for AI workloads, and the Cobalt 100 CPU, for general compute workloads on Microsoft's cloud.
The chips are built in-house by Microsoft, allowing for customization of the entire infrastructure stack to maximize performance.
Microsoft has developed custom server racks with liquid cooling to accommodate the Maia 100 AI Accelerator. [ more ]
Ars Technica
6 months ago
Tech industry

Microsoft launches custom chips to accelerate its plans for AI domination

Microsoft announced two custom chips for accelerating AI workloads in its Azure cloud computing service.
Maia is designed for large language models like GPT-3.5 Turbo and GPT-4, while Cobalt is a CPU for conventional tasks.
Microsoft plans to use these chips internally and not sell them. [ more ]
More AI workloads
ComputerWeekly.com
2 days ago
Business intelligence

NetApp upgrades AFF all-flash as it targets AI storage | Computer Weekly

NetApp refreshes AFF all-flash storage arrays targeting AI workloads and energy efficiency. [ more ]
ITPro
1 day ago
Artificial intelligence

Hosting 101: The do's and don'ts

Small businesses can benefit from dedicated hosting by making the right choices from the start and optimizing workloads for better performance. [ more ]
Theregister
4 weeks ago
Artificial intelligence

AI PCs are here but a killer application for biz users? Nope

Forrester Research says there is still no 'killer app' that makes AI PCs essential for business users. [ more ]
Theregister
1 month ago
Artificial intelligence

LANL powers up Nvidia's GH200-packed Venado super

The Venado supercomputer targets AI workloads, reaching exaFLOPS-class performance when using lower-precision calculations. [ more ]
ComputerWeekly.com
1 month ago
Artificial intelligence

Storage technology explained: AI and data storage | Computer Weekly

AI and ML have a wide range of applications, from simple chatbots to complex content generation.
Storage plays a crucial role in AI by providing data for training and storing large volumes of generated data. [ more ]
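The storage point above comes down to feeding training jobs from shared arrays fast enough. A minimal PyTorch sketch of streaming pre-serialized samples from a storage mount into a training loop; the mount path and file layout are illustrative assumptions, not anything from the article.

```python
# Minimal sketch: streaming training data from shared storage,
# illustrating why storage throughput matters for AI training.
from pathlib import Path

import torch
from torch.utils.data import DataLoader, Dataset


class TensorFileDataset(Dataset):
    """Loads one pre-serialized tensor per sample from a storage mount."""

    def __init__(self, root: str):
        self.files = sorted(Path(root).glob("*.pt"))

    def __len__(self) -> int:
        return len(self.files)

    def __getitem__(self, idx: int) -> torch.Tensor:
        # Each item read is a storage I/O; array and filesystem latency
        # directly limit how fast the accelerators can be fed.
        return torch.load(self.files[idx])


loader = DataLoader(
    TensorFileDataset("/mnt/aff/train"),  # hypothetical NFS/NVMe-oF mount
    batch_size=64,
    num_workers=8,     # parallel readers help hide storage latency
    pin_memory=True,   # faster host-to-GPU transfer
)

# Assumes all samples share a shape so the default collate can stack them.
for batch in loader:
    pass  # training step would go here
```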
Theregister
1 month ago
Artificial intelligence

Ethernet advances will end InfiniBand's lead in AI nets

Gartner predicts three imminent improvements that will make Ethernet a better alternative to InfiniBand for AI workloads.
By 2028, an estimated 45 percent of generative AI workloads will run on Ethernet, up from less than 20 percent today. [ more ]
Theregister
3 months ago
Artificial intelligence

Nvidia reportedly forms unit to peddle IP to cloud providers

Nvidia is creating a business unit to sell its intellectual property and design services to cloud providers.
Cloud providers like AWS, Microsoft, and Meta have been developing their own custom silicon alternatives to Nvidia's GPUs for AI workloads. [ more ]
Theregister
2 weeks ago
Cars

Tesla wants to monetize its cars to process AI workloads

Tesla is considering using its vehicles' onboard compute power to process AI workloads and generate revenue. [ more ]
ComputerWeekly.com
1 month ago
DevOps

Cern: Challenges of GPU datacentre management | Computer Weekly

Cern has been awarded the CNCF Top End User Award.
Cern is exploring GPUs for AI workloads.
The Kubernetes scheduler supports GPU sharing. [ more ]
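The GPU-sharing point rests on Kubernetes treating GPUs as schedulable resources. A minimal Python sketch, using the official kubernetes client, of how a pod requests an NVIDIA GPU so the scheduler can place it on a node with a free device; the pod name, image, and command are hypothetical, and sharing mechanisms such as MIG or time-slicing are configured on the node side rather than in this request.

```python
# Minimal sketch: submitting a pod that requests one GPU, so the Kubernetes
# scheduler places it on a node with an available device.
# Assumes the NVIDIA device plugin is installed on the cluster.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside the cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="training-job"),  # hypothetical name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="nvcr.io/nvidia/pytorch:24.01-py3",  # hypothetical tag
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    # The scheduler counts nvidia.com/gpu like any other
                    # resource; node-side config decides how GPUs are shared.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```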
ITPro
3 months ago
DevOps

AMD and Microsoft cement relationship with cloud collaborations

Azure customers can run high-intensity or AI workloads on powerful infrastructure.
Customers don't need to house or maintain the infrastructure themselves. [ more ]