Is AI eating our culture?

Is AI taking our jobs?

I was recently writing a blog post about my first impressions of Snowpark. While writing, I thought it would be nice to have some fitting illustrations to enrich the text. I might have been able to create suitable artwork myself, but I could not justify spending several hours of my working day on it. So, what to do?

Luckily, there has been rapid development in the field of generative AI, and services like DALL-E and Midjourney have become increasingly popular. I decided to give DALL-E a try and, in a short amount of time, was able to create nice illustrations for my purpose.

At first, I felt good about what I had done. However, after discussing it with my partner, who happens to be a professional illustrator, I realised that I should have thought more carefully. By creating illustrations with AI, I was contributing to the loss of livelihood of people, in this case professional illustrators.

But wait, if I had neither the time to do the job myself nor the budget to have it done by a professional, that doesn't make me a crook, right?! This logic is flawed, however. If one cannot pay a professional for good-quality work, it is not ethically acceptable to have a child do it in a sweatshop on the other side of the world either. A rather extreme analogy, I admit, but the point is to think about the consequences of our choices.

Not that long ago, it was generally thought that while AI would replace some jobs, it would not do so for most of us. This conservative opinion was largely based on the limited capabilities of AI as well as the price of automation being too high for most use cases. The picture is, however, changing rapidly, and there is even research being done on the subject. For example, the newly released GPT-4 is basically passing as a lawyer and a (junior?) developer. Also, Google has just introduced PaLM-E, making it much easier to interact with robots.

The end of culture?

I predict that it is just a matter of time before most tasks requiring, e.g., legal expertise will be automated. The same goes for many code development tasks (yes, I'm starting to feel a bit nervous…) and the list does not end here. In principle, any human activity that can be generalised from the content of the internet using ML is at risk. But what about tasks requiring creativity?

There has been a lot of discussion lately about the creativity of generative AI software (based on large language or computer vision models, such as ChatGPT or DALL-E, respectively) and whether what they produce can be considered art (e.g., Kulttuuriykkönen). Last year, an AI-generated image won a prize, which has prompted heated debate about what can be considered art in the first place.

Forbes magazine writes the following:

“Today art is still generally defined as a human-performed activity, created by those who have been educated to capture the essence of things and their feelings in an appealing form that involves the imagination. We commonly think of art as those forms of expression that come from someone’s emotions and that we relate to on a human level.”

This captures, in my opinion, the essence of art quite well, and it should, also in my opinion, remain this way. While AI-generated images can be stunning (generally they are not), they are seldom, if ever, truly genuine or imaginative. This stems from the fact that generative models replicate patterns in the training data rather than coming up with new ideas.

I'm not saying that AI can't be creative. DeepMind's AlphaZero has reached superhuman skill in various games without any human interference. This has been made possible by reinforcement learning, where an AI agent tries to come up with an optimal strategy, given an environment with rules and rewards.
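To make the idea concrete, here is a minimal sketch of tabular Q-learning, one of the simplest forms of reinforcement learning. The toy "corridor" environment, the reward scheme, and all parameters are illustrative assumptions of mine, not AlphaZero's actual setup (which combines deep neural networks with tree search), but the basic loop of acting, receiving rewards, and updating a value estimate is the same mechanism.

```python
import random

# Illustrative sketch: tabular Q-learning on a toy "corridor" environment.
# The agent starts at position 0 and is rewarded only for reaching the
# rightmost cell. Environment and parameters are assumptions for this example.

N_STATES = 5          # positions 0..4, with 4 being the goal
ACTIONS = [-1, +1]    # step left or right
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2   # learning rate, discount, exploration

# Q-table: estimated future reward for each (state, action) pair
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # epsilon-greedy: mostly exploit the best known action, sometimes explore
        if random.random() < EPS:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])

        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0

        # Q-learning update: nudge the estimate toward reward + discounted best future value
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state

# After training, the learned policy should always step right (+1)
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)])
```

The point of the sketch is that the agent improves purely by trial, error, and reward; no human examples are involved, which is exactly why this kind of creativity works in games with clear rules and scores.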

Art, on the other hand, is not a game. An artist is typically not aiming to please an audience but rather to express inner feelings or to convey ideas or opinions. The same goes for basic research; to find new things, one must look where no one has looked before, without the fear of failing. This is how new discoveries and innovations are made.

Unfortunately, artists are not the only ones who need to fight against AI-generated material. Journalists are also at risk of losing their jobs to AI-generated content, something that is already happening. For example, the Daily Mail has started using AI for writing articles and sacked 200 employees. If this is what language models can do today (is there still someone who has not tried ChatGPT?), it is not a long way to AI writing novels. What happens next is that publishers start using AI to mass-produce books, probably mimicking previous bestsellers.

As a result, it will become very difficult to make a living as a writer and probably nearly impossible for new talents to emerge. Without any realistic prospects of succeeding as a professional writer, will we eventually lose genuine literature? Physical pieces of art still require an artist, but books are printed on paper by machines, so there is not really a need for a human in the loop.

Is this really the future we want? Algorithms being fed more and more material produced by other algorithms, potentially leading to the loss of all creativity and imagination? Even the mere thought of such a possibility is daunting. The use of AI-generated text, sound, or images in place of human-made material is a signal that human creativity and intellectual talent are not worth the investment, or even an ethical statement that human creativity is not important.

It can already be seen that the rabbit hole of generative AI goes deep. However, we should think carefully about whether to follow that path, as it might lead to the loss and degeneration of human culture. This might sound extreme and unrealistic, but we need to bear in mind that AI development is accelerating at an exponential rate; yesterday's sci-fi is tomorrow's reality.

It may be that in the future we will have mass entertainment generated by algorithms and artisan culture made by people, for those willing to pay for the authenticity. Even today, many professionals refuse to use AI tools and instead rely on their own skills out of passion. And passion is something algorithms cannot possess. Even if I'm not a writer, I enjoy writing myself, as much as I enjoy writing code myself.

The time for discussion was yesterday

At present, it is up to everyone to decide whether they wish to support the process of outsourcing creativity. However, the imminent culture crisis calls for discussion at the societal level to decide whether protective or regulatory actions are needed.

In a broader scope, we must realise that the need for such discussion goes far beyond culture. Throughout history, jobs lost to automation have tended to be lost for good. While this process has been slow in the past, it is now evident that it will be slow no more. Somehow this brings to mind a famous quote from 15 years ago by Blockbuster's CEO Jim Keyes: "Yes, it's melting, but it's a slow melt." Instead of melting, I would rather use the word vaporisation, to express the speed of development.

This does not mean that all automation is evil. Automation can relieve people from tasks that are either physically or mentally demanding, or otherwise unpleasant to execute. Computers can also be much better at certain tasks and make fewer errors than humans. In such cases it is generally accepted that machines replace people or assist them in their work.

But as a society, we are not prepared for a situation where people become redundant en masse. To prevent such a scenario, or to prepare for the paradigm shift that may be ahead of us regarding the nature of work, legislators, civil society organizations, and other stakeholders need to wake up. We cannot really stop the process of AI development, but at least we can try to make it as ethical as possible.

The above speculations probably sound quite wild. And yes, my main motivation here has been to provide food for thought, as I sincerely think we need to consider seriously where this development is heading. However, the automation of culture is already here. A critical question will be whether the public will generally find such material worth paying for.

According to ChatGPT, the increasing use of generative AI is likely to have both positive and negative effects on human culture, and it will be up to individuals and societies to determine how to navigate these changes in a way that promotes creativity, diversity, and equity. Reassuringly, again according to ChatGPT, there will always be a need and a place for human creativity in the world, as AI models do not have the same level of intuition, emotional intelligence, and abstract thinking as humans, and they cannot fully replace the unique perspectives and experiences that humans bring to the creative process.

I hope you're right, ChatGPT.

Manual culture at risk. Photo by Debashis RC Biswas on Unsplash