Cloudian launches object storage AI platform aimed at corporate LLM | Computer Weekly
Briefly

"Cloudian has launched its Hyperscale AI Data Platform, an on-premise S3-based storage platform plus artificial intelligence (AI) infrastructure bundle aimed at enterprises that want quick answers from corporate information. The offer utilises Cloudian object storage plus Nvidia RTX Pro 6000 Blackwell graphics processing units (GPUs) in a retrieval augmented generation (RAG) architecture to power large language model (LLM) functionality that is trained on the mass of corporate data that often goes untapped."
"It comprises three nodes of S3 object storage, in this case on-premise, and connected using S3 over remote direct memory access (RDMA), developed with Nvidia. This allows for rapid connectivity between storage nodes, using RDMA, which was originally developed to allow data to move from the memory of one server to another for high-throughput, low-latency operations while not hitting central processing unit (CPU) resources."
Cloudian's Hyperscale AI Data Platform is an on-premises S3-based object storage and AI infrastructure bundle that enables rapid natural-language querying of corporate information. It pairs Cloudian object storage with Nvidia RTX Pro 6000 Blackwell GPUs in a retrieval-augmented generation (RAG) architecture to power LLM-based search over large corporate datasets. The appliance can be air-gapped for on-site security and comprises three S3 nodes connected via S3 over RDMA to reduce latency and CPU overhead. A billion-scale vector database indexes ingested data as numeric vectors, so that similarity and context calculations can deliver accurate, scalable retrieval for enterprise use cases.
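The retrieval step that the vector database performs can be sketched in miniature. This is an illustrative sketch only, not Cloudian's implementation: a toy bag-of-words embedding stands in for a real billion-scale vector index, and all document snippets and function names are hypothetical.

```python
# Sketch of RAG-style retrieval by vector similarity (illustrative only).
# A toy bag-of-words "embedding" stands in for a real vector database;
# all names and data below are hypothetical.
import math

def embed(text: str, vocab: list[str]) -> list[float]:
    """Toy 'embedding': count vocabulary terms, then L2-normalise."""
    words = text.lower().split()
    vec = [float(words.count(term)) for term in vocab]
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec] if norm else vec

def retrieve(query: str, documents: list[str],
             vocab: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query, vocab)
    scored = sorted(
        documents,
        key=lambda doc: sum(a * b for a, b in zip(q, embed(doc, vocab))),
        reverse=True,
    )
    return scored[:top_k]

# Hypothetical corporate snippets; a real deployment would index billions
# of vectors derived from data held in S3 object storage.
docs = [
    "quarterly sales report for the emea region",
    "office coffee machine maintenance schedule",
    "annual revenue figures and sales forecasts",
]
vocab = sorted({word for doc in docs for word in doc.split()})
context = retrieve("sales revenue forecasts", docs, vocab)
# The retrieved context is then passed to the LLM alongside the query.
```

In a production system the hand-rolled embedding would be replaced by a learned embedding model and an approximate-nearest-neighbour index, but the ranking-by-similarity step is the same in shape.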
Read at ComputerWeekly.com