Open source platform Essedum 1.0 brings AI to networking
Briefly

Essedum Release 1.0 is a modular open source platform that accelerates AI integration into network applications. It supports secure data connections; dataset import from storage buckets, MySQL databases, and REST APIs; and pipeline construction for training and inferencing, with model optimization and deployment. Models can be accessed across configured connections and platforms, including AWS SageMaker, Azure ML, and GCP Vertex AI. A centralized Endpoints view lists connected endpoints, and Adapters simplify integration without host configuration. A Remote Executor offloads compute-intensive pipelines to external servers. Planned enhancements include Docker and Helm deployment automation, PDF and Excel ingestion, and secrets management.
LF Networking has announced Essedum Release 1.0, a modular open source platform designed to accelerate the integration of AI into network applications. The platform supports data connections, pipeline management, and model implementation for both on-premises and cloud environments. Essedum 1.0 includes several core features. Connections provide communication links between software systems to enable data exchange. Datasets support the import and management of data from various sources, including storage buckets, MySQL databases, and REST APIs.
Pipelines enable the building and management of both training and inferencing workflows for AI/ML workloads. This includes model optimization and deployment. The Models feature provides access to AI models from configured connections across different platforms. The Essedum project was introduced this year by LFN member Infosys. The platform offers a comprehensive framework that covers the entire chain: from data ingestion to pipeline orchestration and model deployment. This provides developers and operators with tools to build AI-driven network solutions efficiently.
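To make the connection → dataset → pipeline chain described above concrete, here is a minimal sketch in Python. All names (`Connection`, `Pipeline`, `step`) are illustrative assumptions for this article, not Essedum's actual API, and the data source is an in-memory stub standing in for a real storage bucket, MySQL database, or REST endpoint.

```python
# Hypothetical sketch of the connection -> dataset -> pipeline chain.
# Names are illustrative only; they do not reflect Essedum's real API.

class Connection:
    """A configured link to a data source (stubbed with in-memory records)."""
    def __init__(self, name, records):
        self.name = name
        self._records = records

    def read(self):
        """Return the dataset behind this connection."""
        return list(self._records)

class Pipeline:
    """Ordered transformation steps applied to each record of a dataset."""
    def __init__(self):
        self._steps = []

    def step(self, fn):
        """Append a per-record transformation; returns self for chaining."""
        self._steps.append(fn)
        return self

    def run(self, dataset):
        """Apply every step, in order, to every record."""
        for fn in self._steps:
            dataset = [fn(record) for record in dataset]
        return dataset

# Build a dataset from a (stubbed) connection, then run a two-step pipeline:
# unit conversion followed by a simple threshold "inference".
conn = Connection("metrics-api", [{"latency_ms": 120}, {"latency_ms": 45}])
pipe = (Pipeline()
        .step(lambda r: {**r, "latency_s": r["latency_ms"] / 1000})
        .step(lambda r: {**r, "slow": r["latency_ms"] > 100}))
result = pipe.run(conn.read())
# result[0]["slow"] -> True, result[1]["slow"] -> False
```

In a platform like the one described, the last step would typically invoke a deployed model (for example, on SageMaker, Azure ML, or Vertex AI) rather than a lambda, and a remote executor could run the whole pipeline on an external server.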
Read at Techzine Global