DevOps
From Business Matters, 2 days ago
The Role of Dedicated Servers in Scaling Modern Businesses
Infrastructure investment is crucial for SMEs to ensure reliability, performance, and user experience in a competitive digital landscape.
The model's other capabilities, including support for multimodal inputs, multiple reasoning modes, and parallel sub-agents for complex queries, could help enterprises build faster, task-focused AI for customer support, automation, and internal copilots without relying on heavier models.
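The parallel sub-agent pattern mentioned above can be sketched as a simple fan-out/fan-in: a complex query is dispatched to several task-focused agents concurrently and their partial answers are gathered for a final merge. This is a minimal illustration with hypothetical agent names, not the vendor's actual API.

```python
import asyncio

async def sub_agent(name: str, query: str) -> str:
    # Hypothetical sub-agent: in practice this would call a small,
    # task-focused model rather than sleep.
    await asyncio.sleep(0.01)  # simulate model latency
    return f"{name} answer for: {query}"

async def answer(query: str) -> list[str]:
    # Fan the query out to specialized sub-agents in parallel, then
    # gather their partial answers for a downstream merge step.
    agents = ["retrieval", "reasoning", "summarizer"]
    return list(await asyncio.gather(*(sub_agent(a, query) for a in agents)))

results = asyncio.run(answer("summarize last week's support tickets"))
print(len(results))  # 3
```

Because the sub-agents run concurrently, total latency is close to the slowest agent rather than the sum of all of them.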
Edge computing is a type of IT infrastructure in which data is collected, stored, and processed near the "edge" of the network, or on the device itself, instead of being transmitted to a centralized processor. Edge computing systems usually involve a network of devices, sensors, or machinery capable of data processing and interconnection. A main benefit of edge computing is its low latency: because each endpoint processes information close to its source, the system can respond to requests faster and produce detailed analytics without a round-trip to a central data center.
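The pattern above can be sketched in a few lines: each device processes raw readings locally, reacting to anomalies immediately, and ships only a compact summary upstream instead of streaming every sample to a central server. Function and threshold names here are hypothetical.

```python
from statistics import mean

def process_on_device(readings: list[float], threshold: float = 75.0) -> dict:
    # Local processing: detect anomalies at the source (low latency)...
    alerts = [r for r in readings if r > threshold]
    # ...and reduce the raw stream to a small summary before any
    # transmission, cutting bandwidth to the central processor.
    return {"avg": mean(readings), "max": max(readings), "alerts": len(alerts)}

summary = process_on_device([70.2, 71.0, 79.5, 68.3])
print(summary["alerts"])  # 1
```

Only the summary dict would leave the device; the raw readings never transit the network.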
When I manage infrastructure for major events (whether it is the Olympics, a Premier League match or a season finale) I am dealing with a "thundering herd" problem that few systems ever face. Millions of users log in, browse and hit "play" within the same three-minute window. But this challenge isn't unique to media. It is the same nightmare that keeps e-commerce CTOs awake before Black Friday or financial systems architects up during a market crash. The fundamental problem is always the same: How do you survive when demand exceeds capacity by an order of magnitude?
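One standard mitigation for the thundering-herd retry storm, not specific to the author's systems, is exponential backoff with random jitter: when a request fails, each client waits a randomized, growing delay before retrying, so millions of retries spread out over time instead of re-arriving in the same instant.

```python
import random

def backoff_delay(attempt: int, base: float = 0.5, cap: float = 30.0) -> float:
    # "Full jitter": pick a delay uniformly in [0, min(cap, base * 2**attempt)].
    # The cap keeps late retries bounded; the jitter desynchronizes clients.
    return random.uniform(0.0, min(cap, base * (2 ** attempt)))

# Five consecutive retry delays for one hypothetical client:
delays = [backoff_delay(a) for a in range(5)]
print(all(0.0 <= d <= 30.0 for d in delays))  # True
```

Without the jitter, every client that failed at the same moment would also retry at the same moment, recreating the original spike on each retry wave.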
The company, which is based in San Francisco and has an office in Pune, India, is targeting up to $35 million this year as it builds a royalty-driven on-device AI business. That growth has buoyed the company, which now has a post-money valuation of between $270 million and $300 million, up from around $100 million in its 2022 Series B, Kheterpal said.
A North American manufacturer spent most of 2024 and early 2025 doing what many innovative enterprises did: aggressively standardizing on the public cloud by using data lakes, analytics, CI/CD, and even a good chunk of ERP integration. The board liked the narrative because it sounded like simplification, and simplification sounded like savings. Then generative AI arrived, not as a lab toy but as a mandate. "Put copilots everywhere," leadership said. "Start with maintenance, then procurement, then the call center, then engineering change orders."