As I follow my deep curiosity about different forms of social organization – creative ecosystems – looking into the past for clues about the future has always proved an interesting path. As Antonio Gramsci once said, “History teaches, but it has no pupils.” That’s why it is good to look back now and then.
The Industrial Revolution was a huge influence on how we work and learn today. Modern school systems were born in the 19th century to fuel factories, and not long afterward the office work model followed suit, shaped by Peter Drucker, widely considered the father of modern management.
But it wasn’t only work and school that borrowed their structure. “The regularity and efficiency of the factory was the model for the penitentiaries, insane asylums, orphanages, and reformatories,” explains John Zerzan, “embodying uniformity and regularity, the factory had become the model for the whole society.” Zerzan explored the development of the Industrial era in his book, A People’s History of Civilization, giving us a glimpse of our social organization back then.
It looked like this:
With Gutenberg’s printing press, technology made its first promise of modern progress, making change the new norm of social life. Metal type pressed against paper in an orderly way made mass production possible and, consequently, the assembly line. In other words, the printing press transformed words and ideas into commodities.
However, independent creativity obviously posed an obstacle to manufacturing efficiency. For industrialism to work and be more efficient, it needed more control and predictability. No one wanted “creative” factory workers. It was the presence of work skills that challenged the new technology, not their absence.
In the task-oriented labors of artisans and farmers, for example, work and play were freely mixed. A constant pace of unceasing labor was the ideal not of the mechanic but of the machine – more specifically, of the clock. The independent creativity of workshops gave way, along with working at one’s own pace, to the unremitting technological time of the factory whistle, centralized power, and unvarying routine.
The trend toward mechanization came more from cultural and managerial bias than from carefully calculated marginal costs. And in this new, harsh, and inflexible world, people looked for an escape from reality, mainly through the abuse of alcohol and opium. One of the consequences of the modern dedication to productivity was sure to be the exhaustion of the natural human gift for the enjoyment of life.
Until finally, the machine life was the only one that could be imagined on all sides. Back then, the only idea for making labor tolerable was to decrease the amount of it by means of ever-fresh developments in machinery. However, it was easy to see that work not only increased, but was also steadily more alienated.
You probably weren’t alive during the 19th century, but I bet it sounds familiar. Let’s put it in today’s context:
With Artificial Intelligence, technology made a renewed promise of modern progress, making disruption the new norm of social life. Algorithms trained on vast datasets, generating outputs in an orderly way, made mass personalization possible and, consequently, the full automation of cognitive tasks. In other words, AI transformed thought and expression into data commodities.
However, human judgment obviously poses an obstacle to algorithmic efficiency. For AI to work and be more efficient, it needs more control and predictability. No one wants “creative” human input. It is the presence of human nuance that challenges the new technology, not its absence.
In the meaning-oriented work of educators and designers, for example, reflection and improvisation are freely mixed. A constant stream of optimized output is the ideal not of the human but of the Artificial Intelligence – more specifically, of the algorithm. The independent thinking of creative professionals gives way, along with working at one’s own pace, to the unremitting technological optimization of the algorithm, centralized platforms, and automated consistency.
The trend toward cognitive automation came more from technocratic and Big Tech bias than from deliberative ethical reasoning or necessity. And in this new, disembodied, and artificial world, people look for an escape from reality, mainly through the abuse of social media and numbing digital entertainment. One of the consequences of the modern obsession with efficiency was sure to be the exhaustion of the natural human gift for the enjoyment of life.
Until finally, the algorithmic life was the only one that could be imagined on all sides. Today, the only idea for making mental labor bearable is to decrease the amount of it by means of ever-fresh developments in AI. However, it is easy to see that work not only increases, but is steadily more artificial.
You get the idea.
“Are we not more ‘over-civilized’ than ever, in greater denial?” John Zerzan questioned, “There is more of the artificial than before, and an even greater indifference to history.”
He is right to worry. So, in my work and research, I’ve been trying to propose a different path ahead: understanding the ecosystems we are part of and how they interact presents a more meaningful plan of action than simply delegating everything to technology, as I wrote before.
I won’t go as far as to say that technology is the greatest evil of our times. I don’t think that AI will destroy us, but it won’t save us either. What will eventually destroy/save us is how people use it – technology in general, not just AI. If you think that technology will solve all your problems, you don’t understand technology, and you don’t understand your problems.
So, why is thinking systemically important? Why will this matter more than investing in technology?
The short answer is that people will always be a good bet. Everything we do is made by and for people, or at least has a human input somewhere in the process. However, “there are now fewer places that provide communities and individuals with opportunities to engage in low-stakes hangs and chance encounters with people of different ages, backgrounds, and life experiences,” explained Adam Chandler in his TIME article. Therefore, what we should really aim for is to understand how strong communities work: how they build on, interact with, and share knowledge with one another. Ideas and actions deeply influence each other.
But understanding these ecosystems isn’t only about how people connect; it’s also about how ideas flow through them via distributed cognition. In her 1989 book, The Real World of Technology, Ursula Franklin wrote that “the assault of noise and unsolicited messages on people’s souls seems to me to create an environment of violence quite akin to how aggression and war hurt innocent bystanders, those poor non-combatants caught in fights not of their own making.” For her, “silence is a space for something to happen.”
Personally, what troubles me most is that the silence Franklin talked about is exactly what we need for deep thinking. And because of the “noise” of technology, we barely have quiet time to think. Without that thinking, it becomes harder for ideas to cross-pollinate. And when we must (eventually) do some thinking for work, education, or whatever, and we don’t have the time for it, we start to delegate it to AI. As Audrey Watters wrote:
I’d argue the interest in using “AI” for brainstorming is surely connected to the decline in reading – reading long-form materials, that is, not text messages and status updates (…) As we spend less time undertaking the challenging cognitive labor of reading, we become less adept at both deciphering complex language and thought and constructing complex language and thought in turn. We have nothing that interesting to say (to write) because we have nothing interesting to think about, because we have read nothing substantive.
Audrey’s piece focused on writing, but I think it easily applies to thinking in general. As history tells us, the promise of technology is control, as it always was, and its inability to deliver efficiency so perfectly only uncovered the real “flaw”: human innovation systems. Because human cognition is distributed (socially, materially, and temporally), our complex systems of thought can’t accommodate such technologies unless we adapt them to new metrics of “success.” Put simply, innovation and efficiency can’t go together; to have one, you have to give up the other – what Blair Enns called the “Innoficiency Problem.”
Basically, these old structures for social organization, however powerful they may have been in the past, no longer address today’s needs. We should expand our options to include various formats, giving people more flexibility over how they engage with their ecosystems (e.g., work, education, personal life), and especially over what it means to be “successful” in those systems. We can’t claim the future while still living by the standards of the past. So, I hope we have learned by now that one size does not fit all.