1) Privacy is a right, not a crime.
2) Privacy matters because it gives us psychological safety and keeps our behaviour from becoming predictable and standardized. It also helps prevent automated assumptions being made about you.
3) If you don't ask for privacy while you still can, you allow its absence to become the "new normal", especially as AI is deployed at scale.
Last week, I accidentally became a privacy activist.
In 2021, I was invited to speak at Mobile World Congress, the largest tech conference in the world, and I accepted. It soon turned out that, in order to register to attend the event live, I had to upload a copy of my passport and use facial recognition to check in.
I didn’t feel comfortable with that, as I saw no necessity for a private company to hold a copy of my passport. I also knew that the GDPR allows you to opt out of such a system (and the organizer’s website said as much). However, citing the pandemic and police requirements, the organizers said there would be no alternative way to check in, so I had to upload the passport copy or give up attending in person.
In a long email thread with GSMA and their data protection officer, I kept questioning how exactly having my passport would help protect anyone from COVID (at an event where thousands of people would be sitting next to each other), but I received no answers. Nor did I get clear answers about how exactly my data would be stored and used. My questions were clearly treated as excessive and annoying, and half of them were ignored. So I ended up joining the virtual event instead.
However, something didn’t feel right, and the episode left a bitter, Orwellian aftertaste. So I consulted a lawyer friend and filed a claim with the Spanish data protection agency (AEPD), without much hope, simply because I thought it was the right thing to do for my own conscience.
Two years later, the claim was upheld. Following my privacy complaint, GSMA, the organizer of Mobile World Congress, was fined 200,000 euros for its use of facial recognition software at MWC 2021.
I am not receiving any of this money, and as an added “bonus,” I found myself in the middle of a media storm. I suddenly had to explain to dozens of journalists why I would risk souring my relationship with conference organizers and my reputation as a speaker, potentially positioning myself as someone difficult to work with. Why bother speaking up if my pictures are already all over the internet, my face is probably captured by thousands of street cameras, and my phone already knows where I am and what I do? Isn’t that inconsistent?
I’m glad you asked.
Asking for privacy is not a crime.
How much do you care about your digital privacy rights?
Chances are, you think one of these two things:
1) "I know it’s important, but my data is out there anyway, why waste time thinking about it?" or
2) "I have nothing to hide, I follow the law, so I am not worried about privacy" (echoing a remark from Eric Schmidt, the former head of Google and Alphabet, who famously said, "If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place").
We have somehow allowed the digital world to erode our sense of privacy, and we feel either bored or powerless when facing the reality that more and more of our data is outside our control. This feeling is perpetuated by large tech companies: asking them about your data is increasingly portrayed as shameful and time-wasting, while these very companies make their employees and contractors sign multi-page non-disclosure agreements.
But it doesn't have to be this way.
Indeed, a lot of our data is already online (as a side note, all tracking on my phone, as well as all Google history, is disabled).
However, data sharing is first of all a question of agency. Legislation (at least in Europe) says that it’s up to me (and you) to decide when, what, and with whom to share online, rather than someone deciding it for us. It’s important that you and I can withdraw our consent to share this data whenever we choose. We also have the right to know how this data will be used. This is a right guaranteed by law (the GDPR in Europe is the strictest such law; similar mechanisms exist in some other countries, such as Canada).
Privacy matters and always will, no matter how many Eric Schmidts of the world try to convince you otherwise. If I asked you for your email password, or for photos of your kids in the bath, would you hand them over? Probably not, because these things are private and should remain private.
Asking for privacy shouldn’t be seen as a potential threat to society, nor as a sign of criminal intent. When you ask someone how they use, or plan to use, your data, you are being an active citizen and exercising a lawful right. A person who speaks up about their lawful rights should not be shamed or ostracized; they should be applauded and encouraged.
The Overton Window
Asserting your rights helps prevent such incidents from becoming the new normal. In sociology, the concept of the Overton Window describes the range of ideas and perspectives on political and social issues that are considered acceptable by the general public at a given time.
When the Overton Window shifts, things that were once considered abnormal become the norm. For example, think about the restriction on carrying shampoo bottles larger than 100ml through airport security. In the past, this would have seemed unusual, but now it's widely accepted (ever tried to pour shampoo from a larger bottle into a tiny one to take to the airport, losing half of it along the way?).
What we choose to ignore and silently accept today can easily become the new normal. If we don't start questioning and asserting our digital rights, the boundaries will keep shifting, and not necessarily in our favor. Imagine that today a private company asks you to upload your passport and you comply without objection. Tomorrow, you might be asked for your fingerprints, and the day after, even your DNA.
I'm exaggerating this scenario, but the message is clear: unless we stand up and exercise our legal rights, we're essentially agreeing to an Orwellian future.
Predictability and the Loss of Free Will
Another major reason why privacy matters is that if a company or state possesses a significant amount of your digital information, they can effectively "hack" your humanity. In other words, algorithms can predict your actions and manipulate your behavior to make it more predictable and standardized. This erodes free will, and it is already happening.
Did you know that a staggering 70% of the content people watch on YouTube comes from the platform's recommendation algorithm? Yet those of us who watch these recommended videos often believe we are making independent choices. In reality, these algorithms optimize for what will keep you engaged and online, not necessarily for what you genuinely want.
Privacy and Mental Well-being
Privacy also holds immense value for our mental well-being. Research indicates that humans have a fundamental psychological need to be alone and free from judgment by others.
When we are under constant surveillance and aware of it, our behavior changes drastically. We limit the options we consider and become more compliant and self-censoring.
Free thinking is crucial as it fosters creativity and exploration. When we stifle our ability to think freely, we become more predictable. Furthermore, free thinking contributes to our sense of belonging to a community and our ability to trust others.
The Future of Privacy
The privacy debate has never been more relevant than it is now, especially with the increasing use of facial recognition software and the implementation of AI by governments, companies, and educational institutions.
During the pandemic, approximately 80% of large companies implemented surveillance software to monitor remote employees. The market for 'emotion AI' that claims to recognize your feelings is projected to nearly triple in six years, from $19.9 billion in 2020 to $52.8 billion by 2026.
While these tools are often marketed as promoting well-being and mental health, they are essentially surveillance tools that monitor and predict people's attitudes and behaviors from signals like tone of voice, typing speed, and typing quality. This enables (often unlawful) assumptions about people that affect their lives, whether through facial recognition or emotion guessing, even though such software is often not grounded in any scientifically proven theory.
In the US, and increasingly in Europe, more schools and universities are implementing facial recognition and other forms of data collection about students under the pretext of security, convenience, or both. Yet much of the data being collected falls, at best, into a legal gray area, and at worst is sold to data brokers and marketers who can eventually use it to target kids with online ads. And that is before we even consider the risk of data leaks.
What can you do?
You don't need a legal education to assert your privacy, nor do you need a lawyer, although it certainly helps to familiarize yourself with the basic privacy law principles in your country. If you are in Europe or the UK, that's the GDPR, which gives you the right to ask how your data is collected and used, to opt out, to inquire how an automated decision about you was made, and to ask for human intervention, among other things.
However, it's much more important that you simply start asking questions – of your employer, your children's school, your hospital, a conference you are attending, a hotel where you are staying, or even the installer of a new smart meter (believe it or not, these collect LOTS of data beyond your electricity and water usage). The questions are, in principle, the same in every situation:
What information exactly do you collect about me/my child digitally?
Why do you need to collect this information? (For example, in the case of a hotel: do they need a copy of your passport, or is the number enough? It's actually the latter.)
How is this information used to make decisions about me or my child?
How accurate is the algorithm? What's the probability of it making a mistake? (For example, your employer might use emotion AI that 'predicts' your mood or agreeableness, and you might be denied a promotion, or not hired, because of its output; these are all real cases.)
What if I disagree with this decision? What is your procedure for appeal?
How do you securely store the data? How do you transmit this data to the servers, and where are the servers located? Who has access to this information?
What is your procedure in case of a data breach?
What is the mechanism that allows me to opt out of your data collection?
Do NOT settle for generic answers and links to privacy statements. Those are usually written in such a confusing way that they discourage anyone from trying to understand what really happens.
If you are not a lawyer, it's okay to ask for clarification and to request an explanation in simple words, especially when the matter affects your life. What is not okay is for them to brush you off for asking these questions.
Privacy is not a crime; it's a right. Exercise it while you can.