
The tool we use the most but trust the least: algorithms

When was the last time you picked up your phone? Less than 30 seconds ago? An hour? Our phone always has something new to offer. In a way, it has started to compete with reality for our attention. This isn’t a coincidence. Algorithms and Artificial Intelligence have made it incredibly easy to keep our attention exactly where they want it, without us even noticing. This is taking a toll on our perception of truth and the way we trust.


ROSA JOLIEN LAAN / TECHNOLOGY

Let’s start with an important disclaimer: I am not a tech professional. I am a 21-year-old student. I am part of the generation that has been experiencing technology and social media since its early teens. Understanding its ins and outs has always felt somewhat out of reach. Having lost count of how many times I got distracted by my phone whilst writing this article, I can tell I’ve grown careless and comfortable with a tool I can’t fully understand. Which is exactly why this subject is so important.

You get online for free, so how do tech companies make money?

Algorithms and Artificial Intelligence provide us with the content they ‘think’ we want, when we want it. We feed them information: our clicks, swipes and pauses (yes, even how long we do nothing is noticed) and in return receive new results based on that information.
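To make that mechanism a little more concrete, here is a minimal, purely illustrative sketch in Python. The signals, weights and feed items are invented for this example; no real platform works exactly like this, but it shows how clicks, swipes and even pause time could be folded into a single “engagement” score that decides what you see first.

```python
# Toy illustration: ranking feed items by predicted engagement.
# The signals, weights and items are invented for this example;
# real recommendation systems are far more complex.

from dataclasses import dataclass

@dataclass
class Signals:
    clicks: int           # how often you tapped similar content
    pause_seconds: float  # how long you lingered without doing anything
    shares: int           # how often you passed similar content on

def engagement_score(s: Signals) -> float:
    # Arbitrary weights: lingering and sharing count as "interest"
    # just as much as actively clicking.
    return 1.0 * s.clicks + 0.5 * s.pause_seconds + 2.0 * s.shares

feed = {
    "cat video": Signals(clicks=12, pause_seconds=40, shares=1),
    "news item": Signals(clicks=3, pause_seconds=5, shares=0),
    "conspiracy clip": Signals(clicks=2, pause_seconds=90, shares=4),
}

# Whatever scores highest is shown first, including content
# you never clicked on but merely paused over.
for item, s in sorted(feed.items(), key=lambda kv: -engagement_score(kv[1])):
    print(f"{item}: {engagement_score(s):.1f}")
```

Note that in this toy version, doing nothing at all (the long pause on the conspiracy clip) is enough to push an item to the top of the feed.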

The question that should raise your eyebrows is: how exactly do these tech companies make money? The best explanation comes from the documentary The Social Dilemma: “If you are not paying for the product, you are the product.”

Or, even better, from Jaron Lanier, the writer of Ten Arguments for Deleting Your Social Media Accounts Right Now: “It’s the gradual, slight, imperceptible change in your own behavior that is the product.” Tech companies such as Google and Facebook have three goals: engagement, growth and advertising. They make their money by keeping us online, getting others online and selling the ads we’ll see online.



“They have three goals: engagement, growth and advertising”

 



  Image: Pablo Rochat via Instagram. 


We dread being disconnected

So, truth is, it is happening right under our noses. Take the way we swipe down to refresh our homepage: it has the same addictive motion as a Vegas slot machine, and we repeat it because we know it will constantly offer new, juicy content.


But is it really that invisible? We know we pick up our phone, and we are aware our social channels use algorithms. So why don’t we care? I hear you thinking: “I can put my phone down and close my laptop any time I want”, but this isn’t really true, is it? We are obliged to use these technological services in one way or another: to follow classes during the pandemic or to access our bank accounts, for example. We’ve learned to trust that these tools work in our favor.



“Our fear of being disconnected is greater than our fear of being misinformed”




Trust and network have become our ‘currency’, as Rachel Botsman, author, speaker and trust expert, calls it. We have endless material to compare ourselves to, loads of businesses rely on reviews, and everybody you’d want to reach out to is at the tip of your thumb. The fact that we don’t care about the risks while being aware of them means our fear of being disconnected is greater than our fear of being misinformed.


Misinformed, outraged and polarized

Conspiracy theories and fake news easily find an audience online. People will believe what they want to believe. As human beings we are programmed to find explanations for unexplainable things in order to feel secure. Fake news and conspiracy theories spread six times faster than true news, according to The Center for Humane Technology’s Ledger of Harms. This is because fake news grabs our attention with highly emotional and unexpected content. In the TIME article ‘Down the rabbit hole’, Charlotte Alter explains that distrust in the national news is driving people to do their own (online) research and sometimes even unknowingly repeat elements of the QAnon or Pizzagate theories, which claim that Democratic politicians run a global sex-trafficking ring. Sounds bizarre, right?


This may seem far from home, but in a recent survey conducted among Dutch students with multiple social media accounts, nearly a quarter admitted they do not trust all of the accounts they follow online. Besides that, nearly all participants stated they had come across multiple conspiracy theories online, yet they still use these channels to connect with people and stay up to date with the news.


You can think of it like a Rubik’s Cube: AI twists and turns the colors of our content, with the goal of matching what you share or have viewed before as closely as possible. In the end, if it succeeds, our homepages show only yellow or only green. And while we stay on these single-colored platforms, we can no longer see the other colors at all, because they sit on the opposite side of the algorithm.
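To spell the Rubik’s Cube analogy out in the simplest possible terms, the toy sketch below (Python again; the “colors”, posts and matching rule are all made up for this illustration) shows how a feed that only recommends content resembling what you last engaged with quickly collapses to a single color.

```python
# Toy illustration of a filter bubble: each round, the feed keeps only
# the items that match the color you engaged with most recently.
# Everything here (colors, posts, the matching rule) is invented.

import random

catalog = [("yellow", f"yellow post {i}") for i in range(5)] + \
          [("green", f"green post {i}") for i in range(5)]

viewed_color = random.choice(["yellow", "green"])  # your first click

for step in range(3):
    # "Personalization": only show posts of the color you engaged with.
    recommendations = [post for color, post in catalog if color == viewed_color]
    print(f"step {step}: {recommendations}")
    # You engage with what you are shown, so the loop reinforces itself;
    # the other color never appears on your homepage again.
```

The point of the sketch is the feedback loop: after the very first choice, the other half of the catalog simply stops existing for you.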

“We are worth more addicted, outraged, polarized and misinformed,” said Tristan Harris, co-founder of The Center for Humane Technology, in an interview with 52-insights. For that same reason, the ones who are really able to change the social network software and the ways we use it, the big tech company owners, though showing support for Tristan’s work, are seldom involved in the conversation. “We don’t live in a culture of forgiveness. It’s very hard for them [tech companies] to make public choices in which they won’t be trashed for doing so”. So until they do, it’s up to us to handle tech more responsibly.



“It’s possible to be pro technological innovation and still demand safer design and programming principles”




Innovation, regulation or both?

Facing the future, we might want to reconsider our current naïve attitude towards the ease of networking online and the loads of information we so willingly digest. It’s possible to be pro technological innovation and still demand safer design and programming principles. We need to start listening, conversing and looking further than our homepage’s suggestions. Technology is embedded in our systems and offers promising possibilities for the future. Let’s make sure it’s designed ethically, so that instead of manipulating our thinking, it enriches it.

︎





