#gpu-acceleration

Artificial intelligence
from Fast Company
2 weeks ago

This startup claims it just outran Nvidia on its own turf

DataPelago's Nucleus dramatically accelerates data processing across hardware, outperforming Nvidia's cuDF and showing that software, not hardware, often limits GPU performance.
Data science
from Talkpython
2 weeks ago

Accelerating Python Data Science at NVIDIA

RAPIDS enables zero-code GPU acceleration for pandas, scikit-learn, NetworkX, and other Python data libraries, delivering large speedups and scalable GPU-native workflows.
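The zero-code path described in the summary works by loading RAPIDS's `cudf.pandas` module in front of an ordinary pandas script (`python -m cudf.pandas script.py`, or `%load_ext cudf.pandas` in a notebook), which dispatches supported operations to the GPU and falls back to CPU pandas otherwise. A minimal sketch of such a script, using only plain pandas so it runs unmodified with or without RAPIDS installed:

```python
import pandas as pd

# Ordinary pandas code. With RAPIDS installed, running this file as
# `python -m cudf.pandas script.py` executes the same lines on the GPU
# via cuDF; unsupported operations transparently fall back to CPU pandas.
df = pd.DataFrame({
    "city": ["NYC", "SF", "NYC", "LA", "SF"],
    "sales": [100, 250, 175, 300, 225],
})

# A typical group-by aggregation, one of the operations cuDF accelerates.
totals = df.groupby("city")["sales"].sum().sort_index()
print(totals.to_dict())  # {'LA': 300, 'NYC': 275, 'SF': 475}
```

The point of the design is that no cuDF-specific imports appear in user code, which is why the RAPIDS team describes it as zero-code-change acceleration.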
from InfoQ
2 months ago

GPULlama3.java Brings GPU-Accelerated LLM Inference to Pure Java

The TornadoVM programming guide shows how developers can use hardware-agnostic APIs so the same Java source code runs unchanged across different hardware accelerators.
Java
Data science
from HackerNoon
3 months ago

Supercharge ML: Your Guide to GPU-Accelerated cuML and XGBoost

GPU acceleration can greatly speed up traditional machine learning workflows.
The guide focuses on cuML, XGBoost, and dimensionality reduction techniques for efficient data processing.
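cuML's appeal is that it mirrors the scikit-learn estimator API, so moving a workflow to the GPU is typically just an import swap. A hedged sketch of that pattern (scikit-learn is assumed as the CPU fallback; the toy dataset and parameters are illustrative, not from the article):

```python
import numpy as np

# cuML mirrors the scikit-learn API, so GPU acceleration is usually an
# import change; fall back to scikit-learn when RAPIDS is not installed.
try:
    from cuml.cluster import KMeans   # GPU path (RAPIDS installed)
except ImportError:
    from sklearn.cluster import KMeans  # CPU fallback

# Two well-separated blobs of points; illustrative toy data.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])

model = KMeans(n_clusters=2, random_state=0, n_init=10)
labels = model.fit_predict(X)

# The two blobs land in different clusters.
print(len(set(labels)))  # 2
```

For XGBoost the analogous switch is configuring the booster to train on the GPU (recent releases accept `device="cuda"` on the estimator); the surrounding pipeline code stays the same.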
Python
from HackerNoon
3 months ago

Achieve 100x Speedups in Graph Analytics Using Nx-cugraph

Leveraging GPU acceleration can significantly improve graph-analytics performance.
nx-cugraph is a drop-in NetworkX backend that accelerates existing workflows on larger datasets.
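nx-cugraph plugs into NetworkX's backend-dispatch mechanism, so existing code is accelerated by installing the package and either setting the `NX_CUGRAPH_AUTOCONFIG=True` environment variable or passing `backend="cugraph"` to individual calls. A minimal sketch using plain NetworkX (the unchanged code below is exactly what the backend would accelerate):

```python
import networkx as nx

# Plain NetworkX code. With nx-cugraph installed, setting
# NX_CUGRAPH_AUTOCONFIG=True (or passing backend="cugraph" to the call)
# runs the same algorithm on the GPU instead.
G = nx.karate_club_graph()

# PageRank is one of the algorithms nx-cugraph accelerates.
pr = nx.pagerank(G)

print(len(pr))  # 34 nodes in the karate club graph
```

Because dispatch happens inside NetworkX, the speedup applies per algorithm call; graphs too large for single-CPU NetworkX are the cases where the article's 100x claims are aimed.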
Artificial intelligence
from InfoQ
3 months ago

Google Enhances LiteRT for Faster On-Device Inference

LiteRT simplifies on-device ML inference with enhanced GPU and NPU support for faster performance and lower power consumption.