According to Tamas Cser, Founder and CEO of Functionize, the industry is on the verge of a structural shift. By 2026, development teams will transition from AI copilots to agentic fleets: coordinated groups of specialized AI agents operating semi-autonomously across the entire software lifecycle. In this new paradigm, engineering excellence is measured less by syntactic mastery and more by the ability to orchestrate intelligent systems: delegating, validating, and refining work continuously at machine speed.
Over the past few years, I've reviewed thousands of APIs across startups, enterprises and global platforms. Almost all shipped OpenAPI documents. On paper, they should be well-defined and interoperable. In practice, most cannot be consumed predictably by AI systems. They were designed for human readers, not machines that need to reason, plan and safely execute actions. When APIs are ambiguous, inconsistent or structurally unreliable, AI systems struggle or fail outright.
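To make that concrete, here is a hedged sketch (a hypothetical cancelOrder operation written as a Python dict, not taken from any spec I reviewed) of the kind of fully constrained operation an AI agent can actually plan against: a stable operationId, required parameters, enumerated inputs, and a typed response schema.

```python
# Hypothetical OpenAPI operation, written as a Python dict for brevity.
# Every detail an agent needs is machine-readable rather than buried in prose:
# a stable operation ID, a required path parameter, an enumerated request
# field, and a typed response schema.
cancel_order = {
    "post": {
        "operationId": "cancelOrder",
        "parameters": [{
            "name": "orderId", "in": "path", "required": True,
            "schema": {"type": "string", "format": "uuid"},
        }],
        "requestBody": {
            "required": True,
            "content": {"application/json": {"schema": {
                "type": "object",
                "required": ["reason"],
                "properties": {"reason": {
                    "type": "string",
                    "enum": ["customer_request", "fraud", "out_of_stock"],
                }},
            }}},
        },
        "responses": {"200": {
            "description": "Order cancelled",
            "content": {"application/json": {"schema": {
                "type": "object",
                "required": ["status"],
                "properties": {"status": {"type": "string"}},
            }}},
        }},
    },
}
```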
Hold on. There's yet another method, one that comes from macOS. That method is Homebrew. What is Homebrew? Homebrew is a free, open-source package manager for Linux and macOS that simplifies the installation and management of software. Think of Homebrew as a command-line version of the App Store that allows you to install command-line tools such as Python, Node.js, and more with ease.
A secure software development life cycle means baking security into planning, design, building, testing, and maintenance, rather than sprinkling it on at the end, Sara Martinez said in her talk "Ensuring Software Security" at Online TestConf. Testers aren't bug finders but early defenders, building security and quality in from the first sprint. Culture first, automation second, continuous testing and monitoring all the way; that's how you make security a habit instead of a fire drill, she argued.
You know that feeling? You're developing a new email feature, you run your test script, and boom, you realize 3 seconds too late that you used the production database. Your CEO just received an email with the subject "TEST - DO NOT READ - LOREM IPSUM". Or worse: you configured a cloud SMTP server for testing, forgot to disable actual sending, and now your Mailgun account is suspended for suspicious activity because you sent 847 emails to test@example.com in 5 minutes.
If you're trying to make sure your software is fast, or at least doesn't get slower, automated tests for performance would also be useful. But where should you start? My suggestion: start by testing big-O scaling. It's a critical aspect of your software's speed, and it doesn't require a complex benchmarking setup. In this article I'll cover a reminder of what big-O scaling means for algorithms, and why this is such a critical performance property.
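As a rough illustration of what such a test can look like (my own sketch, not code from the article; the function and parameter names are made up), you can time a function at two input sizes and assert that the runtime grows roughly in proportion:

```python
import time

def run_time(func, n):
    """Time one call of func on a list of size n."""
    data = list(range(n))
    start = time.perf_counter()
    func(data)
    return time.perf_counter() - start

def assert_roughly_linear(func, n=100_000, factor=4, slack=2.0):
    """Fail if growing the input by `factor` grows the runtime much faster
    than `factor` itself - i.e. if the function doesn't look roughly O(n)."""
    ratio = run_time(func, n * factor) / run_time(func, n)
    assert ratio < factor * slack, (
        f"{factor}x input took {ratio:.1f}x the time; expected ~{factor}x")

# sorted() is O(n log n), close enough to linear to pass this crude check
assert_roughly_linear(sorted)
```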
Ok, you aren't here (I assume) to peruse my books and see how few books I consume (teenage Ray would be embarrassed by the number). The biggest reason I switched to Hardcover was their API, which I wanted to use to display my reading on my Now page. Again, I don't honestly think anyone cares what I'm reading/listening to/watching, but I think it's cool and that's all that matters on my little piece of the Internet.
While building apps I learned that writing code is only half the journey - getting it deployed, updated, and running reliably is just as important, if not more so. When I started deploying my apps to the cloud, I realized how many manual steps it took to get the app running. That's when I discovered CI/CD and GitOps tools that automate everything from testing to deployment, so developers can focus on writing code instead of manually deploying every change.
Meta has applied large language models to mutation testing to improve compliance coverage across its software systems. The approach integrates LLM-generated mutants and tests into Meta's Automated Compliance Hardening system (ACH), addressing scalability and accuracy limits of traditional mutation testing. The system is intended to keep products and services safe while meeting compliance obligations at scale, helping teams satisfy global regulatory requirements more efficiently.
The gist of the idea is to run the whole user environment, desktop and all, inside WINE. So it's something like a bare-metal WINE sitting on top of the Linux kernel, with just enough plumbing to connect them up. This is significantly different from the current way, which is to run a completely Linux-based stack - the kernel, an init, a userland, a Linux display system, and a Linux desktop, and then run Windows programs inside that.
Only the engineers who work on a large software system can meaningfully participate in the design process. That's because you cannot do good software design without an intimate understanding of the concrete details of the system. So what is generic software design? It's "designing to the problem": the kind of advice you give when you have a reasonable understanding of the domain, but very little knowledge of the existing codebase.
In the past, programmers worked with platform-specific, fast, lightweight native code editors, but beautiful, cross-platform, hybrid code editors changed everyone's minds - programmers started using heavyweight hybrid editors on their powerful hardware. They kept upgrading that hardware continuously to run these heavyweight editors, solely because of their modern look and feel, trendiness, and productivity-focused features. That's how VSCode became the software industry's default code editor.
HP-UX had a good run, after a long and very varied history. The first version ran on the HP 9000 Series 500 range of 32-bit machines based on the HP FOCUS multi-chip CISC processor. In 1984, the Hewlett-Packard Journal ran a detailed article on how that port was created, based on a kernel called SUN - not related to Sun Microsystems' SunOS.
Reference counting is the primary memory management technique used in CPython. In short, every Python object (the actual value behind a variable) has a reference counter field that tracks how many references point to it. When an object's reference count drops to zero, the memory occupied by that object is immediately deallocated.
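A quick way to watch the counter move is sys.getrefcount, which reports one extra reference because the argument passed to the call temporarily adds one (a minimal sketch; the variable names are just for illustration):

```python
import sys

value = object()                 # one reference: the name `value`
print(sys.getrefcount(value))    # typically 2: `value` plus the temporary
                                 # reference created by the function call

alias = value                    # a second name bound to the same object
print(sys.getrefcount(value))    # typically 3

del alias                        # each del drops one reference; when the
del value                        # count hits zero, CPython frees the object
```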
I moved to the US from India in August 2021 to pursue a master's degree in computer science at the University of Southern California. Soon after, I began looking for a summer internship - but things didn't go as planned. Like many aspiring software engineers, I was excited about the idea of working at a Big Tech company like Google, Apple, Microsoft, Meta, or Netflix.
For those unfamiliar with Sixel, it's a bitmap graphics format designed for terminals and printers that encodes bitmap data into terminal escape sequences, with each printable character representing a 6-pixel-high, 1-pixel-wide column. Tile enough of them together and you've got full-color images, and even animation. In brow6el's case, it uses the libsixel package to generate graphics. This minimalist in-terminal browser isn't just a toy: thanks to the Chromium Embedded Framework, it can display fully rendered web pages, as demonstrated in a video included in the Codeberg repository.
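For a feel of what those escape sequences look like on the wire, here's a hand-rolled sketch that emits a small solid-red block as raw Sixel (it bypasses libsixel entirely and only renders in a Sixel-capable terminal):

```python
import sys

ESC = "\x1b"
WIDTH, SIXEL_ROWS = 24, 2          # 24 x 12 pixels (each row is 6 pixels tall)

parts = [ESC + "Pq"]               # DCS ... q: enter Sixel mode
parts.append("#0;2;100;0;0")       # define palette entry 0 as RGB percentages (red)
for _ in range(SIXEL_ROWS):
    parts.append("#0")             # select color 0
    parts.append(f"!{WIDTH}~")     # '~' sets all 6 pixels in a column;
                                   # '!' repeats that column WIDTH times
    parts.append("-")              # advance to the next 6-pixel row
parts.append(ESC + "\\")           # string terminator: end of image

sys.stdout.write("".join(parts) + "\n")
```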
Developing software is like taking a journey on which a team is continually making decisions about which way to go, both about the functionality of what they are building (the MVP) and about the architecture they need to support the MVP (the MVA). The main challenge in using this approach is building something quickly enough to release so that the team can get important feedback as soon as possible.
The colonial Indian government decided the cobra population should be reduced, and decided the solution was to pay a generous bounty for every cobra carcass it received. What the government hadn't anticipated was that some enterprising individuals would start breeding cobras, because cobras were now lucrative. This was bad, but what happened next was worse - the government cancelled the program, and everyone who'd been breeding cobras suddenly had no incentive to keep the snakes, so all the captive cobras were released into the wild.
Leave it to the Linux community to come up with something that no other operating system can do. It happens all the time, and the creativity and ingenuity of these developers never cease to amaze me. Such is the case with a new desktop environment called Orbitiny. The goal of this new desktop, which has been built from scratch using Qt and C++, is to be both familiar and unique.
For almost ten years, the DORA research initiative has explored the effectiveness and metrics of top-performing, technology-driven organizations. This effort has gathered insights from over 36,000 professionals across organizations of various sizes and a wide range of industries. DORA aims to unravel the link between operational practices (capabilities) and their outcomes, focusing on significant achievements both for the organization as a whole and for its members.
Prior to this release, SPFx projects leveraged a Gulp-based build toolchain to orchestrate tasks such as compilation, bundling, and packaging. While familiar, this model has long been considered dated relative to modern JavaScript and TypeScript workflows. At the same time, older SPFx templates and generator outputs triggered npm audit vulnerabilities and lagged behind current dependency expectations. Version 1.22 addresses both concerns by transitioning to a new toolchain and cleaning up vulnerabilities in the scaffolded solutions.
As modern technologies such as artificial intelligence grab today's headlines, it's worth remembering that their foundations were being laid more than half a century ago by computer scientists, philosophers, psychologists, developers, entrepreneurs, and more. These pioneers and those who followed tackled issues and solved problems that future generations may never know existed - but without their seminal work, we wouldn't be where we are today.
My office is two kilometers away, and on foot the trip takes about twenty-five minutes, sometimes a bit more, sometimes less. Like most locals, I eventually switched to a bike, and the commute dropped to around 6.5 minutes. Later, I found a shorter route without bridges or traffic lights, and my commute dropped to about 4.8 minutes. If you plot these trips over time, you get a picture like this: