E35 The Future of Privacy
60% of Americans believe it’s impossible to go through their daily life without companies and the government collecting their data. And they’re right. But according to our guest Dr. Carissa Véliz, that’s not how it should be.
In this interview, Don MacPherson is joined by Dr. Carissa Véliz, an expert in digital ethics with an emphasis on privacy and AI ethics. They discuss the state of privacy today, the countless ways in which organizations track, store, and analyze our data, and the importance of maintaining our privacy both online and in the real world. Carissa offers advice for how to protect yourself online, and insights on how you can stand up and demand better data protection from organizations and the government.
Season Three of the podcast is dedicated to exploring the future and how life is sure to change over the next decade. This episode provides insight into how invasions of privacy and the fight to maintain it will disrupt the way we live and work.
Dr. Carissa Véliz, an Associate Professor at the Faculty of Philosophy and the Institute for Ethics in AI at the University of Oxford, is an expert in digital ethics. Dr. Véliz specializes in privacy and AI ethics along with political philosophy and public policy. Dr. Véliz is the author of Privacy is Power and the editor of the Oxford Handbook of Digital Ethics.
Don MacPherson:
Hello, this is Don MacPherson, your host of 12 Geniuses. For 25 years, I've been helping organizations and the leaders who run them improve performance. Now I travel the world to interview geniuses about the trends shaping the way we live and work. Today's topic is the Future of Privacy. For nearly 20 years, organizations have been sucking up data about our daily habits. The result is an incredibly valuable profile of what we are doing, what we've done, and shockingly accurate predictions about what we will do. In short, our privacy has been compromised in ways that once seemed unimaginable. And what is worse, we have given permission to these organizations through user agreements that very few of us ever read.
To give us a much better understanding of the state of privacy and its future, our guest is Dr. Carissa Véliz. Dr. Véliz is a professor at Oxford University. She works on digital ethics with an emphasis on privacy and AI ethics. She is also the author of the book, Privacy is Power. This episode of 12 Geniuses is brought to you by the Think2Perform Research Institute, an organization committed to advancing moral, purposeful, and emotionally intelligent leadership. You can learn more and access the institute's latest research at T, the number 2, pri.org.
Carissa, welcome to 12 Geniuses. I'd like to start the conversation by asking you what the state of privacy is today.
Dr. Carissa Véliz:
Thank you so much, Don, for inviting me. The state of privacy right now is not great. It could definitely be a lot better. Since the beginning of the 21st century, there has been a move towards collecting more and more personal data from people. At the beginning, it was done by corporations like Google, and the vast majority of people didn't know about it. So, if you've been around for a while, the first time you opened an email account, it probably never occurred to you that you might be giving away your personal data in exchange for that service. Governments were informed that this was going on, but tragically, after 9/11, governments were very worried about keeping the population safe, and they thought they could use that data for such purposes, so they didn't have a big incentive to regulate it.
So, essentially for 20 years, give or take, the data economy has proliferated, these methods of mass data collection have become more refined and more prevalent, and our privacy has been eroded. At the moment, just to give you a few examples: when you wake up, usually the first thing you do is look at your phone, and that alerts a whole host of corporations that you've just woken up and at what time. They've also been monitoring how well you sleep, because when you have insomnia, you typically pick up your phone and check your email or social media. The phone also knows who you're sleeping with, if that person also has their phone next to them. And throughout your day, thousands and thousands of data points are collected, on anything from where you work and who your contacts are, to how much you weigh according to the seat of your smart car.
Don:
I'm already scared. We're only three minutes in.
Dr. Véliz:
You should be.
Don:
Let me ask you this: you said we pick up our phones first thing in the morning, which is very common; I do it myself. What is your phone doing, and how is it alerting these different companies? Is it through the apps on your phone? And what data would be sent to those companies?
Dr. Véliz:
It varies a lot, and much of it is opaque; you really need very tech-savvy people to investigate in order to give you a precise answer. But yes, much of it is through your apps. It also depends on the brand of smartphone you have. As a rough way of thinking about it: whenever you buy a smartphone from a company that earns its keep through data collection, you can assume it will collect more data than a company that earns its keep by selling hardware and software. For instance, you wake up in the morning and you check your weather app, and that already gives away two very important data points. One is where you are, because weather apps usually ask for location data, and that means where you live and where you wake up, which is usually where you slept.
The other is at what time you wake up. The tricky thing is that many data points don't seem that sensitive to us, right? What do I care if somebody knows where I live? But in fact, those data points can be aggregated and analyzed in very complex ways that can tell a lot about a person. You could tell roughly how much someone earns on the basis of where they live and where they work. And that's already much more sensitive than you would think from just a couple of data points.
Don:
I started out by asking you what the state of privacy is. You said not very good, and you brought us back to the year 2000 and 9/11 as a key event for data privacy, or the lack of data privacy regulation. How does our privacy now compare to 1995, pre-internet? How much data is available about individuals now compared to '95, which is 25 years ago?
Dr. Véliz:
Vastly more data, almost unfathomable amounts. Now, on the one hand, you could say that we have a different kind of privacy now. In 1995, if you wanted to buy something, typically you had to go to the store in person and face whoever was selling it to you. Whenever you wanted to buy something sensitive, say a book you might be ashamed to read or a particular medicine, and you lived in a small village, that meant exposing yourself to whoever sold those items. Today you can do it online, so it gives you the impression that you have more privacy than you used to. But in the past, corporations and governments had very little data on people, and whatever data they had was very scattered.
If you had a supermarket, you might already have loyalty cards and know what people like to buy, and give them coupons in return. But the kind of data analysis they could perform on that was literally minimal. They could know, for instance, that you have a taste for sweet things, and then give you coupons for that. But now there's a whole network of data analysts and sellers, data brokers, who analyze that data and try to infer things from it. For instance, they can try to infer whether you might be pregnant from the kinds of things you're buying at the supermarket, or whether you might be at risk for diabetes from how much sugar you eat. And these data brokers want to collect as much data as possible from as many people as possible.
So, their goal is to have a profile on each and every one of us. And that profile can contain many different kinds of data: purchasing records, browsing history, things like medical records and credit records, your educational background, your income. It's really, really broad. And then they sell these profiles to essentially whoever wants to buy them. Many times other corporations want to buy them. Insurance companies, typically, want to know as much as possible about the people they are insuring, but also prospective employers and banks. It could be governments as well. And in some cases, data brokers have even been known to sell data to fraudsters.
Don:
And are those data brokers collecting individual data, so that it actually has your name and all of the information about you, or are they collecting it in an aggregated, demographically analyzable way?
Dr. Véliz:
Both. Much of the data is aggregated and presumably anonymous, but even anonymous data is very questionable, because it turns out that it's quite easy to de-anonymize data. There have been a few studies on this, and you typically need between two and four data points to identify someone as one and the same person. For instance, you're probably the only person who lives and works in the places that you do, so whoever has those two data points already knows who you are. If you have zip code, gender, and age, it's very easy to identify someone. But data brokers also collect data that is individual. For instance, there was a scandal a few years ago about how data brokers sold lists of very sensitive categories: lists of people who have been victims of rape, people who are HIV positive, people who suffer from impotence. And these are literally lists of people.
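To make the de-anonymization point concrete, here is a minimal, hypothetical sketch in Python; the records, names, and fields are all invented for illustration. Joining a supposedly anonymous dataset against any public dataset that shares the same quasi-identifiers, here zip code, gender, and birth year, re-identifies the records even though no name was ever present in the "anonymous" data.

```python
# Minimal sketch of re-identification via quasi-identifiers.
# All records below are invented; real attacks join, e.g., public
# voter rolls against "anonymized" medical data the same way.

# An "anonymous" dataset: names removed, quasi-identifiers kept.
anonymous_records = [
    {"zip": "02139", "gender": "F", "birth_year": 1971, "diagnosis": "diabetes"},
    {"zip": "90210", "gender": "M", "birth_year": 1985, "diagnosis": "asthma"},
]

# A public dataset that lists names alongside the same fields.
public_records = [
    {"name": "Jane Roe", "zip": "02139", "gender": "F", "birth_year": 1971},
    {"name": "John Doe", "zip": "90210", "gender": "M", "birth_year": 1985},
]

QUASI_IDENTIFIERS = ("zip", "gender", "birth_year")

def key(record):
    """Project a record onto its quasi-identifiers."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the public data by quasi-identifiers, then join.
names_by_key = {key(r): r["name"] for r in public_records}
for record in anonymous_records:
    name = names_by_key.get(key(record))
    if name is not None:
        print(f"{name} -> {record['diagnosis']}")
```

In real datasets the join is fuzzier, but the principle is the same: the more quasi-identifiers two datasets share, the fewer candidate matches remain, and two to four fields are often enough to leave exactly one.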
Don:
And those data points that they had — HIV positive or rape victim or something like that — they're verifiable or are they inferred?
Dr. Véliz:
In some cases, it's data that is good quality and comes from very reliable sources. In many cases, it's data that is inferred or comes from very unreliable sources. I forget exactly which book it was, but I remember reading about a researcher who looked up her own file with one of these data brokers and found that they thought she hadn't even finished high school, when in fact she had a PhD. This is problematic because she might be treated by the corporations who buy that file as somebody who didn't finish high school. And because these systems are very opaque, there is no way for us to easily know what information corporations have on us, much less contest it and correct it if there's been some kind of error and we're being misjudged.
Don:
What about some of the other technologies, like facial recognition or voice recognition, that may be collecting data about us without our even being aware of it?
Dr. Véliz:
Yes, those kinds of identifying technologies are very concerning. The whole environment is very opaque, so there are a lot of practices that we may not know about. One of them is called audio beacons. Companies want to know who you are; they want to know that your phone is yours, that your laptop is yours, and that your TV is yours. And they want to know that if your smart TV shows you an ad in the morning, it's you who looks that product up on your phone or laptop and then goes to the shop the next day and buys it. This is interesting to companies because they can measure how much of an effect their ads are having. So, in order to triangulate those data points and identify you as one and the same person, when you go into a shop or listen to an ad at home on your smart TV, there may be noises in the background.
The ads, or the music in the shop you go to, might be broadcasting audio beacons, which are a kind of audio cookie that we can't hear as human beings, but that your phone recognizes and responds to with a signal. That way, they can tell that it's you, with your phone, in that shop. Another thing companies use is data about your Wi-Fi networks. Your mobile phone is always trying to connect to the closest available Wi-Fi if you have it on, and while it's doing that, it's also broadcasting the names of the networks it has connected to in the past. That gives off personal information. And the same goes for Bluetooth. If you want to be as safe as possible, whenever you leave your home, you should turn off your Wi-Fi and your Bluetooth.
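To make the Wi-Fi point concrete, here is a small, purely simulated sketch in Python; no real radio traffic is involved, and every device name and network name is invented. It illustrates why broadcasting the names of previously joined networks is identifying: the set of remembered networks acts as a fingerprint that re-links sightings of the same phone in different places.

```python
# Purely simulated sketch of Wi-Fi probe-request leakage. No packets
# are sent or sniffed; all devices and network names are invented.
# Phones searching for known networks have historically broadcast
# the names (SSIDs) of networks they joined before.

remembered_networks = {
    "alice_phone": frozenset({"HomeNet-42", "CoffeeHouse", "Airport_Free_WiFi"}),
    "bob_phone": frozenset({"HomeNet-42", "OfficeGuest"}),
}

def probes_seen(device):
    """What a passive listener captures: SSIDs only, no owner name."""
    return remembered_networks[device]

# Monday: a listener in a cafe logs one fingerprint.
monday_log = [probes_seen("alice_phone")]

# Friday: a listener in a shop across town logs fingerprints again.
friday_log = [probes_seen("bob_phone"), probes_seen("alice_phone")]

# Matching SSID sets re-link the sightings without any identifier.
for fingerprint in friday_log:
    if fingerprint in monday_log:
        print("Same device seen in both places:", sorted(fingerprint))

# The SSID names themselves also leak places the owner frequents:
# "HomeNet-42" is probably home, "OfficeGuest" a workplace.
```

The same logic underlies the audio-beacon triangulation described above: any stable, passively observable signal that few devices share can serve as an identifier.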
Facial recognition is also a big concern. It is being used not only by police on some occasions, but also by corporations. And sometimes they don't notify people or the government that they're doing this, so it's not easy to know when facial recognition is being used on you. There can be many occasions when it is used on you. It can be used through a camera that is part of the CCTV in a shop, or wherever else you encounter a camera. But it can also be that companies like Clearview are collecting all the photographs of you out there on the internet, on Facebook, on Twitter, or on old cloud services you haven't looked at in a while, like Flickr, and using those photographs to train their facial recognition algorithms. And you might never learn of it.
There are other, more concerning technologies being developed. To the best of my knowledge, they're not being used widely yet, but they're something we should really be vigilant about. These include gait recognition, since how you walk is very particular to you, and heartbeat recognition: there's a technology that uses lasers to recognize your heartbeat, which is unique to you, and that would be incredibly invasive.
Don:
I'm now mortified. I'm absolutely… I don't know what to say, Carissa.
Dr. Véliz:
It's very scary. Another technology in widespread use is the StingRay. StingRays are like vacuum cleaners that suck up the data around them. The way they work: your mobile phone is always trying to connect to the nearest cell tower in order to function, and StingRays act as if they were a cell tower. Your phone connects to one thinking it's a cell tower, and once it's connected, the person operating the StingRay can look at your messages and your browsing, and can intercept calls. In the past, these devices were very expensive and mostly used only by governments. But it turns out they can be built at home if you're tech-savvy enough, and they're not even that expensive. Also, it's increasingly the case that companies are selling spyware, and many of these companies claim they only sell these devices to legitimate governments, but a lot of experts question that.
Don:
These technologies, the StingRay, gait recognition, or this heartbeat technology, don't seem to have any benefits other than surveillance. There is no benefit to me as an individual out in society; I only see a benefit in terms of monitoring or surveilling people. Is that an accurate statement, or am I missing the benefit?
Dr. Véliz:
Maybe somebody from an intelligence agency who is very well intentioned and genuinely wants to keep the population safe would argue that these tools are necessary for the protection of the population, and that they should just be kept in the hands of the right authorities. The comeback is that, in reality, surveillance today is very much a cooperation between corporations and governments; in fact, it's more and more the case that governments depend on corporations for these kinds of surveillance. It's also concerning that the kind of technology you need to collect data is the same kind of technology that makes the internet insecure.
If the internet were more secure, if everything were encrypted, it would be much, much harder to collect data. So, when intelligence agencies lobby for less security in order to collect data, they're making the whole internet insecure. I would agree with you that the downsides far outweigh the possible benefits.
Don:
How do you see TikTok as being a tool for gathering data and possibly compromising privacy?
Dr. Véliz:
TikTok is a concern because of the number of people using it and because of the kind and amount of data it collects. It collects much more data than Twitter, and much more personal data, and any Chinese company can be subject to very intrusive measures by the Chinese government. So, I do think it's a concern. I don't think it's impossible for the United States to get enough reassurance and enough separation between, say, a U.S. TikTok and a Chinese TikTok for there to be some kind of workable solution. But as things stand, I think it's a definite concern.
Don:
One last technology question I want to ask you about, related to data collection, would be wearables. You've talked about how health information can be compromised by all of the data that will be collected, and wearables seem like one of those tools that can hugely compromise an individual's health information. What are your thoughts on that? And how much data is collected on an individual who's wearing something like a Fitbit or an Apple Watch?
Dr. Véliz:
I think we should be very aware that big tech is getting interested in medicine and is getting more and more involved with governments as a way of getting closer to our medical systems, and we should be very careful about that. In the case of wearables, something to look out for is that Google wants to buy Fitbit. Of course, Google already has a huge amount of data on its users; it comes from Search, from YouTube, from Google Maps, from Android. It has so many sources. And if they now add one more very sensitive source that is closely related to health, that should concern us.
Don:
You've said 2018 was a landmark year for privacy. Why or what events happened that made it a landmark year?
Dr. Véliz:
Two things were very, very memorable. One was the Cambridge Analytica scandal. And that's a landmark because it showed, in a kind of textbook way, how data can be used to erode privacy. And Cambridge Analytica showed how privacy is actually political. It's a political concern and it's a collective concern. And if you expose your privacy, you put us all at risk. Because when you expose your privacy, you expose data about other people. So, when you expose where you live, you expose things about your neighbors, about your partner, about anyone who lives near you. When you expose things about your genes, you expose your siblings, your parents, your kids, and even very distant kin that you might never know about. When you expose things about your psychology, you expose other people who share those psychological traits, and all of that can be weaponized and used against us as a citizenry to erode our democracy.
So, that was one landmark case. The other was the implementation of the GDPR in Europe, the General Data Protection Regulation. When I started working on privacy about six years ago, most people I talked with had a very cynical view of privacy: "Privacy is dead, it's just history, get over it. Why don't you do something more relevant?" It was essentially seen as something having to do more with history than with the present or the future. And it's incredible that only five or six years later the GDPR could happen. Nobody thought it was possible to pass a law that would protect personal data in the context of the data economy. Now, the GDPR is not perfect. It has many flaws, we need to do more, and we deserve better. But it's a huge success, one that nobody thought was possible and that has put pressure on other countries to review their privacy laws and privacy practices.
And it has given citizens tools to defend ourselves and to better inform ourselves. There's still a lot to do, and it could be much better enforced, but it was the beginning, I think, of the end of the internet as a wild west in which companies can do whatever they want without any accountability, without even giving an explanation, much less asking for permission.
Don:
So, what are some examples of really concrete things, or rights, that have been restored to citizens of the EU through the GDPR?
Dr. Véliz:
One very important right is the right to access your own data. One advantage this gave us is that big companies like Google and Facebook and Twitter have had to make it much easier for us to download our data. Another very important right is the right to ask for your data to be deleted, which wasn't there before. And the right to be forgotten has also been very important in the context of Europe: if there's something online about you that is outdated or incorrect and puts you in a bad light, or that is true but no longer relevant to the public interest, you have a right to have Google delist it from its search results.
Don:
How likely is it that something like GDPR would be extended to other parts of the world?
Dr. Véliz:
I think it's very likely. It's already happened in many ways. For instance, Japan updated its privacy laws and made them much more similar to the GDPR. California passed a very important privacy law. And at the moment, many privacy bills are being discussed in the United States. I think it's just a matter of time: within the next few years, we will see the United States update its privacy laws federally, and not only state by state. That's my prediction.
Don:
One of the things that I read in researching this topic was that you have recommended people not take DNA tests. And I hadn't even really thought about that in terms of privacy. Could you talk a little bit about the risk of people volunteering that information?
Dr. Véliz:
First of all, you should know that commercial tests are incredibly inaccurate. A study came out in Nature showing that the interpretations of these tests were inaccurate around 40% of the time. So, 40% of the time you'll be told something that's not actually true, which means you might be giving away your privacy for bogus stories in return. That's one thing. But also, just in terms of privacy, when you take one of these tests, the terms and conditions are usually very unfavorable to users. You usually give away all kinds of rights to your genetic data. You have no idea how your genetic data will be used, and you have no way of changing that once you give it away. It can be used for something good or for something bad, and you have no control over it.
One questionable way genetic data has been used: Canada has actually deported someone on the basis of genetic data. This was a case in which a refugee seeking asylum had committed a crime, so Canada did not want to grant this person asylum, and Canada wanted to prove that the person wasn't from the country they claimed to be from. It was a very controversial case, and it's an example of how genetic data could be used to infer someone's nationality, with consequences you might disagree with. You could think of refugee and asylum centers in countries that are much more problematic than Canada. Another concern is how this genetic data might be used against you or against your family in the future.
For example, at the moment in the United States, you won't be denied health insurance on the basis of a genetic test. But one concern is that laws can change. And you can be denied other kinds of insurance, like life insurance. The premium of your health insurance could also change on the basis of the results of genetic tests. And bear in mind that these results might be false, so even if they're wrong, you could still be, say, charged more or denied some kind of insurance because of the test. And once you take the test, you have to reveal it to insurance companies; if you don't, your insurance can be taken away.
Don:
One of the things that you've said is that you like to think more about how things should be as opposed to how they actually are. So, how should things be?
Dr. Véliz:
Ethics is about thinking: how can we make the world a better place? Not just about how things are today, but how they should be, and what kinds of rules we should come up with to make the world a better place, or at least to walk in the right direction. One of the controversial things I argue in my book, Privacy is Power, is that personal data shouldn't be the kind of thing you can buy or sell. In any country, in any society, no matter how capitalist, there are certain things that are outside of the market, that should be outside of the market, and that people agree should be outside of the market. One example is votes: you shouldn't be able to buy votes, because then you undermine the whole concept of democracy. Another example is people: you shouldn't be able to buy or sell persons.
In the same way, I think personal data isn't the kind of thing that should be bought and sold. We shouldn't allow companies to profit from knowing and selling the information that, for instance, somebody has been the victim of a terrible crime or a terrible disease. That's not the kind of thing that should be up for sale or that should make people rich; people shouldn't be able to line their pockets with this kind of information. And that implies a number of other rules we should put in place to prevent data from being misused. One of those tools is fiduciary duties. Fiduciary duties are duties imposed on one party in a very asymmetrical relationship between two parties. Lawyers and clients, for example, have a very asymmetrical relationship, because lawyers know more than clients, and because clients usually have a lot at stake when they seek a lawyer.
And so, a fiduciary duty makes lawyers put the interests of their clients first. In the same way, doctors have a fiduciary duty to their patients. Fiduciary duties are also appropriate when something very valuable is being entrusted to the professional in question: in the case of the lawyer, you entrust your legal matters; in the case of the doctor, your body and your medical data. And all of these professions can have conflicts of interest. A lawyer could want to take your case in order to become famous, or they might have two clients whose interests are not compatible and choose the wealthier client because they'll make more money with them. None of these professions are allowed to do that; they have to put the interests of the client or the patient first. I argue that we should do exactly the same with data controllers and data managers. Whoever wants to manage personal data needs to have fiduciary duties towards data subjects. Our data shouldn't be used against us.
Don:
What needs to happen from a legislation perspective in order for us to get there? And I'm assuming this has to be done on a country-by-country basis.
Dr. Véliz:
Yes, regulation is inevitable. This is the wild west at the moment, and we need rules in place. We need to ban the sharing, selling, and buying of personal data. We need to implement fiduciary duties. We need to ban, for instance, sensitive inferences, because you could have data that is not personal and doesn't seem sensitive, and then infer very sensitive things from it. That effort is going to take some time. It has to happen country by country, but it will also happen in conversation with other countries, because countries want to share data amongst themselves, and companies want to be able to reach users in other countries to widen their markets. That means we need to agree on minimum standards, which is what's happening between the U.S. and Europe. So far there have been two such deals, and both have been deemed unlawful, so we need something better. For that to happen, however, we need people to protect their privacy and to demand privacy from companies and from their representatives, because we won't have regulation unless people ask for it.
Don:
A company like Facebook is worth, I think I looked recently, $750 billion; Google is worth over a trillion dollars; Apple is worth $1.8 trillion. How hurt would these companies be without this wild west of data collection and data sharing? Would they be compromised? Because what I'm wondering is how much they would resist this type of regulation.
Dr. Véliz:
It depends. I think big tech varies a lot in that regard. Apple, for instance, would be the least hurt of the big tech companies, because it relies the least on personal data for profit; it mostly makes its money by selling us things like laptops and smartphones. Google, at the moment, would be very hurt, because about 80% or 83% of its income comes from ads, and ads are typically targeted. But Google is different because we rely a lot on it, we know it very well, and it could have a different business model. One of the things I argue in my book is that by 2013, Google was already hugely successful, and according to a column, I think in Forbes, if I remember correctly, if users had paid Google about $10 per year, Google would have made the same amount of money.
So, that's what a user was worth to Google. And that's not a lot of money if you compare it to what we pay for Netflix, which is about $10 per month, not per year. Google, I think, could more easily change its business model, and I think it's conceivable that people would be willing to pay, because it's so useful to have Search, Google Maps, the cloud, and so on. Facebook, I think, would be the most hurt. They are also the company that is worst about privacy and least trusted by users. And they would have to think hard about changing their business model, because I think it's a business model that is inherently dangerous and destructive for democracies and citizens.
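For a sense of the arithmetic behind that comparison, here is a back-of-the-envelope sketch; the user count is a hypothetical, round placeholder rather than Google's actual 2013 figure, and only the roughly $10-per-year price comes from the column cited above.

```python
# Back-of-the-envelope arithmetic for "users paying ~$10/year would
# match ad revenue." Inputs are hypothetical placeholders chosen to
# illustrate the calculation, not Google's audited 2013 numbers.
implied_price_per_user = 10.0  # USD/year, the figure quoted above
assumed_users = 1.0e9          # hypothetical: ~1 billion users

implied_ad_revenue = implied_price_per_user * assumed_users
print(f"Implied ad revenue: ${implied_ad_revenue / 1e9:.0f}B/year")

# For contrast, a $10/month subscription (the Netflix comparison):
print(f"Subscription equivalent: ${10 * 12:.0f}/user/year, "
      f"12x the quoted per-user figure")
```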
Don:
Could you give us a few tips? You've mentioned a couple along the way, but a few tips for people, individuals to protect their privacy.
Dr. Véliz:
Definitely. And I think this is very relevant with the elections coming up, because there will be actors trying to interfere with them, and if you give away your privacy, you are contributing to that. One thing is to be careful about the kinds of devices you buy: try to buy devices from companies that do not make their money from personal data. When you leave home, turn off your Wi-Fi and your Bluetooth. Read newspapers and read good journalism, but read it at the source: go to the web pages themselves. Don't read things from social media, because there it's more likely to be targeted to you and more likely to be fake news. So, if you like a newspaper, go to its website and get your news there. And instead of using companies and services that are questionable from the point of view of privacy, try to use privacy-friendly options.
For instance, instead of using Gmail, use something like ProtonMail, a very good email service based in Switzerland that uses encryption. Instead of using WhatsApp, which is owned by Facebook, you could use Signal. So, whenever you can, try to find the privacy-friendly alternative. Instead of Google search, I recommend DuckDuckGo. In general, try not to overshare. Be aware that your data is something coveted, and something that is probably going to be misused at some point. So, don't give it away if you can help it.
Don:
You've mentioned your book a couple times, and the title is Privacy is Power, is that correct?
Dr. Véliz:
Yes. Privacy is Power.
Don:
And when does it come out?
Dr. Véliz:
It comes out on 24th September in the UK, and it's still unclear in the U.S. exactly when…
Don:
And what can people learn in the book?
Dr. Véliz:
They can learn about the relationship between power and privacy, because it's something that I think has surprised us. We were used to certain kinds of power, like political power and economic power, but this new kind of power that stems from the analysis of personal data is very surprising in many ways. It has to do with the ability to predict and influence behavior. And I think we should think about power as a kind of energy: it can transform from one kind into another, and we have to stop it from transforming. For instance, we have to stop economic power from transforming into political power; no matter how much money you have, you shouldn't be able to buy votes. And in the same way, we have to stop companies that have a lot of data from transforming that power into political power as well.
So, they will learn about that. They will learn about why and how data is toxic. I argue that personal data should be treated as a toxic substance, much like asbestos, and that it can poison your individual life, your company if you have one, and your community and your society if you live in a democracy. They can learn what kinds of things to demand from our political representatives and what kinds of regulations we need to fix this environment. And they can learn all kinds of practical advice, like the tips I just gave and many more, about how to better protect their privacy and what kinds of things to look out for.
Don:
You can opt out of this question if you have not been following it, but in the United States we obviously have a very important presidential election coming up in November. How confident are you that the voting population will not be manipulated by external parties, whether a foreign government or somebody else who would like to influence the results?
Dr. Véliz:
Not hugely confident. I don't think we have the necessary rules in place to be absolutely sure it won't happen. In particular, I think Facebook is very concerning (Twitter as well, but especially Facebook), because Facebook has a very clear interest in remaining unregulated. The fact that social media corporations have a stake in the matter makes it tempting for them not to do enough to prevent interference, and it could even be tempting for them to interfere themselves. One concerning aspect of Facebook is that they have been using these kinds of encouragements for people to go vote, which sounds like a really good thing; people should vote. But these "I vote," "I have voted," or "go vote" buttons are not shown to everybody, and we do not know the criteria for who gets to see them and who doesn't. And we are not allowed to audit it.
It's insane that a corporation with that much power is getting involved in elections without any kind of oversight. It's really, absolutely insane. Now, I hope that Facebook and Mark Zuckerberg will be responsible and will not interfere with elections, but the fact that they could, and that we would not know about it, is already insane, because the rule of law cannot depend on simply trusting people's goodwill. There has to be some legal regime and some auditing power to make absolutely sure that interference cannot happen, and we don't have that at the moment.
Don:
One last question for you. How has COVID-19 either threatened privacy or enhanced privacy?
Dr. Véliz:
I think it has been a definite challenge and a danger. One of the ways in which we lose privacy is when we face crises like 9/11, in which we are asked for our data to fix the problem. There is a temptation to hand it over uncritically, to say, "Okay, yeah, whatever it takes," without even verifying that our personal data will in fact fix whatever needs fixing, because sometimes it doesn't. In this case, our personal data was asked for in, for instance, contact-tracing apps. I think it's still unclear how that will pan out. In the United States, to the best of my knowledge, there isn't widespread use of them yet, so we'll see. But it's a challenge, and it's something we should be vigilant about.
But I'm quite positively surprised that there has been a lot of talk about privacy concerns. I think that's already something good, and it shows that people are much more mindful of privacy than they were 10 or 20 years ago. Even though we're in a very bad place with respect to privacy at the moment, we have many more tools than we had 10 years ago to deal with it. And people are increasingly becoming aware of why it's a problem, why they should care, and how they can help protect privacy.
Don:
Where can people find out more about you and where can they order your book — Privacy is Power?
Dr. Véliz:
People can find out more about me on Twitter, my name is Carissa Véliz, or on my website, carissaveliz.com. And they can find my book in their favorite bookshop or they can order it online on Amazon.
Don:
Carissa, thank you so much for spending time with us today and sharing your expertise on the topic of privacy, and thank you for being a genius.
Dr. Véliz:
Thank you so much, Don. This is a lot of fun.
Don:
Thank you for listening to 12 Geniuses, and thank you to our sponsor, the Think2Perform Research Institute. For our next episode, I had the great honor of going to Peterson Air Force Base in Colorado Springs to interview the Command Senior Enlisted Leader of U.S. Space Command. Master Gunnery Sergeant Scott Stalker and I discussed the Future of War, which will be released November 10th. Devon McGrath is our production assistant; Brian Bierbaum is our research and historical consultant; Toby, Tony, Jay, and the rest of the team at GL Productions in London make sure the sound and editing are phenomenal. To subscribe to 12 Geniuses, please go to 12geniuses.com. Thanks for listening, and thank you for being a genius.