April 2025 – The Software Engineering Identity Crisis

This month I want to highlight an article, The Software Engineering Identity Crisis, by Annie Vella. This article explores the value we derive from software engineering, how AI is fundamentally changing that value proposition, and how we can adapt to continue meeting our core needs as engineers.

This article challenged me to deeply consider how I feel about the ongoing AI-driven coding transformation. I find myself simultaneously terrified by the potential loss of software engineering identity and knowledge, and excited about the new opportunities it may unlock.

Recently I was discussing with a good friend my fear that AI usage will erode deep software engineering knowledge. My friend compared my sentiment to the invention of the calculator and challenged me to consider whether I would have rejected calculators in favor of abaci. This was a good challenge: calculators have clearly enhanced humanity, and the comparison made me question my position.

The linked article is also extremely thought-provoking, and it inspired me to share a few fears I have surrounding the knock-on effects of AI code generation. None of these fears are substantiated, and they may eventually prove unwarranted; however, I cannot yet reason myself out of these positions:

  1. Software engineers won’t understand how computers work. I worry that our ever-deepening abstractions over computers’ inner workings harm our ability to debug complicated problems. Taken to an extreme, this argument implies we should only code in assembly, which is clearly absurd. However, debugging matters most when computers don’t exhibit expected behaviors due to hardware or systems nuances, and debugging those situations relies on the engineer understanding complexity that our useful abstractions continue to hide. AI appears to be accelerating this problem, not solving it.
  2. Software products will become hyper-fragile and then disposable. Any software engineer knows that software fragility is already rampant, so that’s not a hot take. That said, today we work to reduce fragility over time and create durable systems. At an industry level, as AI-generated code erodes our ability to debug complex problems in our systems (#1 above), and when AI itself cannot help us debug, our software products will become disposable — not because the company went out of business, but because we don’t understand the product sufficiently. An AI rewrite may be our only option, and disposable products will likely be of consistently low quality.
  3. Data security has already peaked and is on a downhill trajectory. Data and software security is a complex area. We continually mess it up, and our data is constantly under attack. The loss of software engineering fundamentals (#1 above), combined with disposable software (#2 above), leads me to conclude that our data security only gets worse from here. I worry that we have too much momentum behind the creative aspects of AI-enabled software, and will suffer in the non-creative dimensions, such as security.

“What’s clear is that the definition of ‘software engineer’ is expanding, not contracting. The skills that make someone valuable are diversifying. And this creates both challenges and opportunities.”
