Technology is making us boring

As the words come to mind, I write this article on my computer. Sometimes I run my thoughts through an AI tool to find better words and enrich the text. Then I publish it and share it with people all over the world via social media. Throughout the brief life of this small piece I just composed, technology is everywhere, helping me make it happen.

When it comes to new technology, I believe it is only natural to ask: "how can these technological tools help me think and improve the quality of what I do?" We have been doing this for a long time, especially in science, but could we have gone too far?

Recently, I came across this post from Patrick Ryan, where he argued:

[In basketball], when you run the numbers, the only places it makes sense to take a shot from are outside the D (3 points) or right next to the basket (2 points). As a result, nowadays every team in the NBA plays basketball this way. But it is not just basketball (or, famously, baseball) that has adopted “moneyball”. Like an incredibly contagious virus, the philosophy of moneyball has rapidly spread, infecting the entirety of human culture. In 2022, the 10 most popular films in the world included eight franchise sequels (Top Gun, Jurassic Park, Avatar, Minions, a Shrek spinout, three Marvel movies), and a rehash of Batman. It’s the same story in music. The music industry has been getting steadily more data-driven (read: boring) since the early ‘90s. Studies indicate pop hits are getting more similar over time. Country is fast becoming one of the most popular music genres in the world, and it all sounds the same.

In Formula One, teams' engineers collect 30 megabytes of live telemetry data per lap while the car is running, and two or three times more is downloaded from the car once it hits the pits, which can easily add up to a terabyte of data per race. However, teams got so good at building cars that F1 became boring due to its predictability and lack of competitiveness (Lewis Hamilton's words, not mine). In short, the best tech wins.

In our eagerness to get it right, to win, to be better, we have delegated much of our lives to technology. In a data-driven society, the unpredictability of human thought and the imperfections of human action have become undesirable. In other words, you can make a thing so perfect that it's ruined. But the real problem with this data-driven approach is not what we can do with the data, but what we miss because of it.

During World War II, fighter planes returned from battle with multiple bullet holes. Upon landing at the base, these aircraft were analyzed to understand which parts were most commonly hit by enemy fire to strengthen armor and to reduce the number of planes shot down. Interestingly, while some parts were riddled with holes, others were largely intact.

That's when mathematician Abraham Wald pointed out another way to see it: the reason certain areas weren't covered in holes could be that the planes shot in those areas didn't come back. This realization led to reinforcing the armor on the parts of the plane where there were no holes and, consequently, reduced the number of planes shot down.

So, could we have been adding armor to the wrong places when it comes to technology?

In 2020, a mysterious account reached an impressive 800K followers on TikTok by posting videos of sink reviews. Dean Peterson, a filmmaker from New York and the face behind the account, said that what started as a way of coping with the pandemic quickly became an obsession on social media. "What was really weird was when it started to get media attention", he recalls, "there were articles in Curbed, Apartment Therapy, Time Out, The New York Times. Even Drew Barrymore's show emailed me." But commenting on his relationship with the algorithm, he said:

I had this fun, creative outlet where I could review sinks using the same criteria and vocabulary that you might use to review a piece of art. But as time went on, I started getting a little burnt out. Nobody was forcing me to do them every day; that was something I imposed upon myself. But it slowly started to dawn on me that if you don’t feed the TikTok beast consistently, that the algorithm begins to turn its back on you. I knew that it was really over for me when doing them started to feel like work.

Sadly, this is not an isolated case. Out of personal curiosity, I've always followed up with people who became famous doing what technology told them would work (a.k.a. "trends"), mostly to see what happened next. It turns out, most if not all met the same end. When technology becomes our boss, it is easy to be stripped of all the pleasure and personal fun for the sake of better numbers. As Patrick Ryan argued:

Thanks to short form video on TikTok, Youtube and Instagram, you are now fed a constant, frustratingly addictive stream of the-same-but-slightly-different video clips whenever you open your phone. Algorithms optimize [what] they know you’ll like just enough to keep watching. As AI becomes more ingrained in our culture, the media we create and the very paper we write on gets embedded with universal, data-driven, computer generated “intelligence”. This can almost wholly replace human curation, creation and even relationships.

Put differently, creating and curating work used to be human roles; technology slowly took over those responsibilities, and now we are its manual labor, the tools doing its bidding.

I have argued elsewhere about the importance of doing meaningful work, and one thing that became clear to me after all the fuss around AI is that machines think differently from us, with different goals and values, and putting our work in their hands will eventually end badly for us. Cases of depression and burnout have skyrocketed in recent years, and I can't help wondering whether behaving like machines is at the heart of the problem.

Maybe it is time to rethink what we really want to accomplish with the help of technology. As an educator myself, I believe that if we ever want to change this scenario, the best way to do it is to incorporate the conscious use of technology into education. As Professor Boris Steipe from the University of Toronto put it:

It’s up to us as professors to provide an education that remains relevant as technology around us evolves at an alarming rate. If we outsource all our knowledge and thinking to algorithms, that might lead to an unfortunate poverty in our curiosity and creativity. We have to be wary of that.

Also, in his book Technopoly: The Surrender of Culture to Technology, Neil Postman proposed seven questions we should ask when developing new technology:

1. What is the problem to which technology claims to be a solution?
2. Whose problem is it?
3. What new problems will be created because of solving an old one?
4. Which people and institutions will be most harmed?
5. What changes in language are being promoted?
6. What shifts in economic and political power are likely to result?
7. What alternative technologies might emerge from this?

So, when evaluating a new technology, we should ask whether it is really needed or useful at all, and whether, by eliminating the human element, it replaces or disrupts something good that already exists, including family and community relationships.

When it comes to using tech, do not lose sight of one thing: YOU are the keeper of the ideas, historical memory, compassion, context, and hope that the algorithm cannot understand and technology cannot automate. So do not forget to act like it.
