Humanities graduates were long advised to learn coding because writing careers looked insecure, but advances in AI since 2022 have affected developers too. Many developers now incorporate AI tools to boost productivity, yet trust issues and ethical concerns persist about the reliability and safety of AI-generated code. The demand for experienced human oversight remains significant, and as reliance on AI grows while novice roles diminish, questions arise about the future workforce and where the next generation of human verifiers will come from.
84% of developers now use AI daily or plan to. But 46% of those same respondents said they basically don't trust the accuracy of AI code, and 61.7% said they have ethical or security concerns about it. So forget the saturation. Even the people using it are saying, "Yeah, you know, I use it, but that doesn't mean I trust it."
Okay, but what if we need people who have experience? What then? How do people get that experience? If junior jobs have gone, what happens when there's nobody available to double check these codebase translations?
Like you say, it's very murky to say that 25% of internal code, or 20 to 30% of internal code, is AI generated. Okay, well, as you say, that's not production. And also, what is this internal code doing? Because if that 20% just does stuff like checking that the coffee machine isn't about to run out of beans, then, yeah, fine.