the mechanics of coding got faster. the decisions about what to build — and why — didn’t get easier.
the longer you work in software, the clearer one thing becomes: the parts that feel difficult rarely stay difficult for very long.
tools change. platforms evolve. constraints shift. what stays constant is the responsibility that comes with shaping systems other people have to live with — often long after the initial excitement has faded.
early in my career, building real software didn’t require much ceremony. an ide, a terminal, and enough focus to follow a problem all the way through. the work felt simpler not because the stakes were lower, but because the path from intent to implementation was shorter.
that sense of directness matters, because it’s easy to forget what actually changed — and what didn’t.
today, the work feels different in a quieter way.
i spend less time wrestling with syntax and more time framing problems, testing assumptions, and asking better questions. some of that thinking still happens in code. some of it happens through ai tools that help me explore options, explain unfamiliar paths, or sanity-check decisions before they harden into systems.
this isn’t about novelty or automation. it’s about leverage. using these tools changes where attention goes. opting out doesn’t preserve craftsmanship — it just means choosing a different set of constraints, and accepting the trade-offs that come with them.
ai can reason about code, generate it, reshape it, and explain it. in practice, that doesn’t replace experience. it compresses the feedback loop and surfaces trade-offs earlier than they used to appear.
what’s shifted isn’t the delivery timeline but where effort actually goes. less energy is spent translating intent into code, and more is spent deciding which intent is worth translating in the first place.
the work hasn’t gotten smaller. it has become more front-loaded. decisions lock in sooner, and the cost of being careless shows up faster and more visibly.
the cost hasn’t disappeared — it just moved upstream.
and that brings us back to responsibility.
ai can produce convincing output at scale. it won’t tell you when an assumption is fragile, when a shortcut is architectural, or when something will quietly become a long-term liability.
review still matters.
security still matters.
testing still matters.
possibly more than before.
velocity didn’t remove responsibility — it reshaped where it lives.
after years of building and maintaining systems, the lesson hasn’t changed much: the hardest part was never typing the code. it was knowing which code deserved to exist, and which conveniences you’d eventually have to pay for.
ai doesn’t remove that burden.
it concentrates it.
