While introducing bugs is certainly a risky side effect of AI coding, the history of software development includes other controversial transitions: the move from assembly language to higher-level languages faced resistance from programmers who worried about loss of control and efficiency, and the adoption of object-oriented programming in the 1990s sparked criticism about code complexity and performance overhead. The shift to AI augmentation in coding may simply be the latest transition to meet resistance from the old guard.
Stepping away from assembly did have that effect, though. The tradeoff was that code was easier to write and easier to optimize, but it's undeniable that it led to a loss of control and efficiency.
Similarly, the shift to object-oriented programming also increased performance overhead, but the tradeoff was that you could seamlessly reuse code, which made larger projects more manageable.
The article is right that AI coding is probably here to stay, but all the disadvantages people are highlighting are real concerns that won't go away; they'll just be accepted as the new normal.
I don't know that moving away from assembly made things easier to optimize, but easier to read and maintain, absolutely.
Thanks for the info. I was wondering how the hell we still have laggy performance given how much better our hardware is than what we used to have.