The endless era of deciding what we want to keep human
What do we do when technology lets us offload human work?
Since the dawn of technology (I'm talking the wheel here, not computers), humanity has been attempting to automate and streamline everything we can. Automation and optimization let us do more than we could before, so as those ramp up, we suddenly have time to do more things. Multiply that by like a billion things and we get a species that moved from caves to a real, worldwide society. The industrial revolution really ramped up how much we could not only speed things up, but hand off work completely to machines. Entire categories of work (and play) could be done by machines with far greater accuracy and speed, so we happily let them take over.
Obviously this post is about AI, but let's not talk about AI itself just yet.
Sport has long been a battle of skill where people push their bodies to become the best in the world, but eventually we developed certain drugs that accelerated things to a point where, as a society, we agreed, "um, that's actually not okay," and today basically every organized sport has strict rules around what sorts of drugs athletes are allowed to put in their bodies. We want our athletes to be good, just not good because of specific types of drugs.
Similarly, computer graphics have been able to create (mostly) lifelike visuals for a couple of decades now, but the general public tends to be drawn to movies that do things "for real". We're much more engaged seeing the real Tom Cruise hanging onto a real airplane than we are seeing the identical image fully rendered in CG.
Or how about clothing or furniture? We can mass produce this stuff, but there's a market for "hand-made" products that people will pay extra to get.
Or even blogs like this? You can get the news on Apple and tech without following a specific person online; there are tons of content aggregators that will summarize the news for you. But you're here because you choose to get something different from a raw news feed.
The list goes on, but my point is that there are plenty of things we can already automate or completely offload to machines and computers, and yet we choose to retain a human element in a bunch of cases.

This finally brings us to the AI part of this conversation. I could feed a sentence into ChatGPT and have it spit out a 500-word blog post in a fraction of the time it takes me to write this very post myself. Hell, if I did it right, you might not even be able to tell the difference. I could also use image generators or music generators or even video generators to crank out "content" way faster than I could were I to do it myself. I don't do that, and you're probably glad I don't, but like the above examples, this is a social choice, not a practical one. If my goal was to publish the most "content" to this blog, then I should be typing short prompts into an LLM, getting something back, and blindly copy/pasting that into my Ghost admin. I'm also much more likely to be invested in and enjoy music created by someone using traditional methods than a song someone generated with an LLM.
And yet, there are plenty of times I use LLMs on a daily basis. I have been doing a lot of development work recently, and that's all been done with a considerable amount of help from LLMs. I also use LLMs at work to help me research topics and to help with messaging for audiences where I need help finding the right way to say something (I usually reword the raw LLM result, but it's helpful nonetheless). I also used ChatGPT image generation last week to create a fake product image for a storefront mockup I was making.
My point here is that just because humans can do something with technology doesn't mean we will use it for that. At the same time, there are places where automation is perfectly acceptable. As with all of these technological advancements, society will decide how much we want to automate and how much we want to value human creation.