Humans have already figured out how to automate repetitive physical and digital labor, and we’ve been doing it for centuries with machines and for decades with computers. Simply put: if it’s repetitive, you don’t need AI to automate it.
In fact, the kinds of tasks we want AI to automate are precisely those that AREN’T repetitive. That was the whole goddamn point of AI.
How did we go from the original purpose of AI to claiming that it will do what we’ve already been doing for decades? Where do these narratives come from, and why do people fall for them?
Unless your job is cutting-edge research where you are making genuinely new scientific discoveries or methods, you're just combining other people's ideas into a new, unique package and selling it.
The truly valuable work is noticing that there is an underserved market and figuring out how to meet its needs.
I write a bunch of widgets for my website. They're little calculators that use common components and apply simple logic. Think unit conversion or date arithmetic.
These currently take a few hours to write, and most of the work is just wiring things together in a predictable way: template, tests, common form controls.
I think that this would be a very good case for AI.
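For concreteness, here's a minimal sketch of the shape of one of these widgets. All the names, ids, and units here are illustrative, not code from my site, but the structure is the point: a small lookup table, one line of "simple logic", and predictable form wiring.

```typescript
// A minimal sketch of a unit-conversion widget. Names and element ids
// are hypothetical illustrations of the pattern, not real site code.

type LengthUnit = "mm" | "cm" | "m" | "km";

// Factors to convert each unit into metres (the common base).
const TO_METRES: Record<LengthUnit, number> = {
  mm: 0.001,
  cm: 0.01,
  m: 1,
  km: 1000,
};

// The "simple logic" part: normalise to the base unit, then scale.
export function convertLength(
  value: number,
  from: LengthUnit,
  to: LengthUnit,
): number {
  return (value * TO_METRES[from]) / TO_METRES[to];
}

// The "wiring" part: read the form controls, run the conversion,
// and write the result back into the page.
export function wireUpWidget(root: HTMLElement): void {
  const input = root.querySelector<HTMLInputElement>("#value")!;
  const from = root.querySelector<HTMLSelectElement>("#from")!;
  const to = root.querySelector<HTMLSelectElement>("#to")!;
  const output = root.querySelector<HTMLElement>("#result")!;

  const update = () => {
    const result = convertLength(
      Number(input.value),
      from.value as LengthUnit,
      to.value as LengthUnit,
    );
    output.textContent = Number.isFinite(result) ? String(result) : "";
  };

  for (const el of [input, from, to]) {
    el.addEventListener("input", update);
  }
}
```

Nothing in there is novel: it's the same table shape, the same event wiring, the same template every time, which is exactly why it reads as a good target for generation.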
I suppose generative AI was seen as such a boon for writing boilerplate because it could do so without you having to program anything specifically; it was trained on enough sufficiently close examples that it could pull the task off without a thorough description.