DevOps
From The Register: AWS put a file system on S3; I stress-tested it
AWS S3 Files allows mounting S3 buckets as NFS shares, providing solid conflict resolution and cost-effective storage options.
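To make that concrete: once a bucket is exposed as an NFS share, the S3 API and ordinary file I/O address the same objects. Below is a minimal sketch in Python, assuming a hypothetical bucket example-bucket mounted at /mnt/s3-files; both names are illustrative, not taken from the article.

    # Sketch only: assumes AWS credentials are configured and the bucket
    # has (hypothetically) been mounted as an NFS share at /mnt/s3-files.
    import boto3

    s3 = boto3.client("s3")

    # Object-store path: write through the classic S3 API.
    s3.put_object(
        Bucket="example-bucket",
        Key="reports/summary.txt",
        Body=b"quarterly numbers",
    )

    # File-system path: read the same object back with plain POSIX I/O.
    with open("/mnt/s3-files/reports/summary.txt", "rb") as f:
        print(f.read())  # b"quarterly numbers", once the mount has caught up

The appeal, at least on paper, is that tools which only speak file paths get S3 durability and pricing without any code changes.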
When civilian banks, logistics platforms, and payment processors share physical data center infrastructure with military AI systems, those facilities become legitimate military targets under international humanitarian law - and the civilian services housed inside lose their legal protection.
Companies are struggling with explosive data growth, increasing cyber threats, and limited budgets. Dell Technologies is responding with PowerStore 4.3, a platform that addresses these storage challenges without compromising performance or security. The new version combines lower costs with improved cybersecurity, offers up to 2 petabytes of storage in a 2U rack space, and brings innovations that double storage density and reduce energy costs.
As businesses contend with ever-increasing data volumes and performance-intensive applications such as AI model training, AI inferencing and high-performance computing, they need infrastructure that delivers speed, scalability and efficiency without added complexity.
Developers have spent the past decade trying to forget databases exist. Not literally, of course. We still store petabytes. But for the average developer, the database became an implementation detail: an essential but staid utility layer we worked hard not to think about. We abstracted it behind object-relational mappers (ORMs). We wrapped it in APIs. We stuffed semi-structured objects into columns and told ourselves it was flexible.
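That habit is easy to picture in code. Here is a minimal sketch using SQLAlchemy; the table and fields are hypothetical, invented only to illustrate the "semi-structured objects in a column" pattern described above.

    # Illustrative only: a semi-structured payload hidden in one JSON column.
    from sqlalchemy import JSON, Column, Integer, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Event(Base):
        __tablename__ = "events"
        id = Column(Integer, primary_key=True)
        # "Flexible": anything fits, and the schema tells you nothing.
        payload = Column(JSON)

    engine = create_engine("sqlite://")  # throwaway in-memory database
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        session.add(Event(payload={"user": "alice", "action": "login"}))
        session.commit()

The database stores the blob faithfully, the ORM hides the SQL, and nobody has to think about the schema, which is precisely the comfortable forgetting described above.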
A future-proof IT infrastructure is often positioned as a universal solution that can withstand any change. However, such a solution does not exist. Nevertheless, future-proofing is an important concept for IT leaders navigating continuous technological developments and security risks, all while ensuring that daily business operations continue. The challenge is finding a balance between reactive problem-solving and proactive planning, because overlooking a single change can cost your organization dearly. So how do you successfully prepare for the future without that one-size-fits-all solution?
When ChatGPT launched in late 2022, I watched something remarkable happen. Within two months, it hit 100 million users, a growth rate that sent shockwaves through Silicon Valley. Today, it has over 800 million weekly active users. That launch sparked an explosion in AI development that has fundamentally changed how we build and operate the infrastructure powering our digital world.
A North American manufacturer spent most of 2024 and early 2025 doing what many innovative enterprises did: aggressively standardizing on the public cloud by using data lakes, analytics, CI/CD, and even a good chunk of ERP integration. The board liked the narrative because it sounded like simplification, and simplification sounded like savings. Then generative AI arrived, not as a lab toy but as a mandate. "Put copilots everywhere," leadership said. "Start with maintenance, then procurement, then the call center, then engineering change orders."
This new reality is forcing organizations to undertake careful assessments before making platform decisions for AI. The days when IT leaders could simply sign off on wholesale cloud migrations, confident it was always the most strategic choice, are over. In the age of AI, the optimal approach is usually hybrid. Having openly championed this hybrid path even when it was unpopular, I welcome the growing acceptance of these ideas among decision-makers and industry analysts.
The main advantage of going multi-cloud is that organizations can "put their eggs in different baskets" and stay versatile in how they operate. For example, they can run their database on one provider's Platform-as-a-Service (PaaS) offering while adopting Software-as-a-Service (SaaS) applications from another.
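In practice, that mix-and-match often reduces to configuration: each layer is reachable through a connection string or an API URL, so it can live with whichever provider suits it best. A hypothetical sketch, with all hosts and variable names invented for illustration:

    # Hypothetical multi-cloud wiring: the database is a PaaS offering on
    # provider A, the application is consumed as SaaS from provider B.
    import os

    # PaaS database: swapping providers means changing one connection string.
    database_url = os.environ.get(
        "DATABASE_URL",
        "postgresql://app:secret@db.provider-a.example:5432/appdb",
    )

    # SaaS application: consumed over its API, never deployed in-house.
    crm_api_url = os.environ.get(
        "CRM_API_URL",
        "https://crm.provider-b.example/api/v1",
    )

    def describe_stack() -> dict:
        """Report where each layer of the stack lives."""
        return {
            "database (PaaS, provider A)": database_url.rsplit("@", 1)[-1],
            "application (SaaS, provider B)": crm_api_url,
        }

    for layer, location in describe_stack().items():
        print(f"{layer}: {location}")

The point of the sketch is the seam: because each layer is addressed by configuration rather than hard-wired to one vendor, the baskets really do stay separate.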