
"The expansion of SUSE's cloud-native platform thus focuses on the management of AI workloads. The exact environment does not matter: on-premises, cloud, or hybrid. According to SUSE CPTO Thomas di Giacomo, the container will also dominate AI, just as it has conquered the cloud world. The same insights into costs and usage that are available for cloud usage need a counterpart for AI workloads."
"The new release includes an integrated Model Context Protocol proxy, currently still in tech preview. This proxy simplifies connections with central management of MCP endpoints. It streamlines costs associated with AI models and improves data access management. SUSE has previously added MCP components to SUSE Linux Enterprise Server 16 and is therefore busy implementing this standard in its own portfolio."
SUSE expands its cloud-native platform with built-in observability and an AI inference engine to manage AI workloads across on-premises, cloud, and hybrid environments. The platform aims to provide cost and usage insights for AI workloads to prevent failed AI investments and meet ROI expectations. The release includes a Model Context Protocol (MCP) proxy in tech preview to centralize MCP endpoint management, streamline model costs, and improve data access. The platform supports inference engines like vLLM and offers observability for Ollama, Open WebUI, and Milvus. Rancher Prime gains a context-aware AI agent named Liz to simplify Kubernetes management.
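The article does not detail how the MCP proxy works internally, but the idea it describes, one central registry that resolves model endpoints and aggregates cost and usage data, can be sketched in a few lines. The class, endpoint names, and per-call prices below are hypothetical illustrations, not SUSE's implementation or the MCP specification.

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    url: str
    cost_per_call: float  # illustrative flat price per request

@dataclass
class ProxyRegistry:
    """Hypothetical sketch of centralized MCP-style endpoint management:
    clients ask the proxy for a model by name; the proxy resolves the
    backend URL and records the call for cost/usage reporting."""
    endpoints: dict = field(default_factory=dict)
    usage: dict = field(default_factory=dict)

    def register(self, name: str, url: str, cost_per_call: float) -> None:
        # Endpoints are registered once, in one place, instead of being
        # configured separately in every client.
        self.endpoints[name] = Endpoint(url, cost_per_call)

    def route(self, name: str) -> str:
        # Resolve a logical model name to its backend and count the call.
        self.usage[name] = self.usage.get(name, 0) + 1
        return self.endpoints[name].url

    def cost_report(self) -> dict:
        # Aggregate view the article alludes to: usage and cost per model.
        return {n: self.usage.get(n, 0) * ep.cost_per_call
                for n, ep in self.endpoints.items()}

# Example: two assumed backends (names and URLs invented for illustration).
proxy = ProxyRegistry()
proxy.register("chat-model", "http://vllm.internal/v1", 0.002)
proxy.register("embed-model", "http://ollama.internal/api", 0.001)
proxy.route("chat-model")
proxy.route("chat-model")
print(proxy.cost_report())  # per-model spend derived from routed calls
```

The point of the sketch is the single choke point: because every request is resolved through one registry, the same place that simplifies connections is also where cost and data-access policies can be observed and enforced.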
Read at Techzine Global