The Human Pulse Podcast - Ep. #12
Back to list of episodes
LINKS AND SHOW NOTES:
Living Well with Technology.
In this episode, Fabrice Neuman and Anne Trager explore the relationship between privacy, secrets, technology, and trust. They discuss how technology platforms manage personal data to personalize services, the growing efforts by web browsers to protect user privacy, and the nuanced differences between privacy and secrets. The conversation emphasizes the importance of trust and awareness in navigating privacy choices, ultimately encouraging listeners to reflect on their own relationship with privacy in a technology-driven world.
Reach out:
Anne on Bluesky
Fabrice on Bluesky
Anne on LinkedIn
Fabrice on LinkedIn
We also appreciate a 5-star rating and review in Apple Podcasts and Spotify.
Chapters
(00:00) Introduction
(01:40) Privacy: The Trade-offs We Make
(05:32) From Privacy to Targeted Ads
(08:32) Privacy, Secrets, and the Evolution of Web Browsers
(09:29) Secrets and the Brain: Why Keeping Secrets is Hard
(15:45) Protecting Privacy: The Role of Encryption
(17:27) Case Study: Apple's Privacy Conflict with the UK Government
(21:33) Balancing Trust, Privacy, and Technology
(29:25) Conclusion: Privacy, Awareness, and Choice
See transcription below
Resources and Links:
WITI – Workforce Influence Technology and Innovation
David Eagleman's podcast Inner Cosmos: "Why Is It So Hard to Keep a Secret?"
Apple Advanced Data Protection
Tech Connect Europe
Browsers focusing on privacy:
Brave
Mozilla Firefox
Apple Safari
And also:
Anne’s Free Sleep Guide: Potentialize.me/sleep.
Anne's website
https://potentializer-academy.com
Fabrice's blog (in French)
https://fabriceneuman.fr
Fabrice's podcast (in French)
https://lesvoixdelatech.com
Brought to you by:
www.potentializer-academy.com & www.pro-fusion-conseils.fr
(Be aware this transcription was done by AI and might contain some mistakes)
Fabrice Neuman (00:00)
Hi everyone and welcome to the Human Pulse Podcast where we talk about living well with technology. I'm Fabrice Neuman
Anne Trager (00:07)
And I'm Anne Trager.
Fabrice Neuman (00:09)
We are recording this on February 23rd, 2025.
Anne Trager (00:13)
Human Pulse is never longer than 30 minutes, so let's get started.
Fabrice Neuman (00:18)
So, this time we wanted to talk about privacy, and even more than that, privacy and secrets. And we'll talk about the difference between those two words, if there is any; we'll see. We wanted to talk about that because privacy is a main concern for basically everyone on the Internet, and it came up in lots of different conversations we had with people, and in different meetings. For example, we are part of a group called WITI, which means Workforce Influence, Technology and Innovation. It was formerly known as Women in Technology International. It's a global networking organization that we are part of, and we meet a lot of people through it.
We actually host a group called Tech Connect Europe every Thursday morning at 9 a.m. Central European Time. And we'll put a couple of links in the show notes so you can join if you want. In one of the last ones, the topic of privacy was raised. And it's really important these days, I think. And so we wanted to dive a little deeper into that subject.
Anne Trager (01:30)
Yeah, it's such a great subject. I think privacy is one of those ideas that is very important to a lot of us and yet remains this kind of abstract word. What does it actually mean? Because I don't know about you, but I am like a lot of people I know: totally willing to give up any idea of privacy if Netflix will tell me what show I want to watch, or...
You know, if Amazon will bring up the stuff I want to buy. And the only way these services, whatever they are, are able to do that and customize their services for me is to know a bunch of stuff about me. Which means that my privacy is, well, you know, I don't know where my privacy is in all of that. Now, it's all written up in privacy statements on websites.
And I've never actually read one, except for the one I have on my own website. I did actually read it before I signed it, so I know what's in those. But it shows that we are a little bit contradictory: we want to have everything without giving up any information. Or do we? That's the question.
Fabrice Neuman (02:29)
Hahaha. Yeah, that is the question. The thing is, there are different levels of privacy. You're talking about Netflix, for example. So basically, what Netflix does is watch you watching stuff on Netflix. And... yeah, let's go a little deeper into that.
Anne Trager (03:05)
That's really scary said like that.
Fabrice Neuman (03:16)
As far as I know, Netflix doesn't use any kind of webcam to watch you, obviously, but it keeps tabs on what you're watching so it can produce statistics on what you're watching, and then deduce what you might want to watch next, what you might like. It's the famous recommendation engine, which is very efficient, based on all the information it can gather about you.
It's not really a problem if it stays in that silo, if you will. Netflix knows things about you, about what you watch, but it doesn't know anything about what you like to eat, for example. It's just a slight dent in your privacy, so that you can get some service out of it.
You give information to get back information that concerns and interests you, because that's what they sell, in addition to the streaming TV shows and movies. So basically, it's a question of trust: you trust Netflix with your secrets, but only some of your secrets, a very tiny portion of them, which is what you watch on their service.
And it's even tinier than we think, because it's only what you watch on their service, not what you watch on other services. They don't have access to what you watch on Amazon Prime or Hulu or Disney Plus or whatever.
The question then remains: we give up a little privacy in exchange for something else. We pay for Netflix to get those shows and movies, and we also pay to get those recommendations. But we pay in different ways: we either pay with money or we pay with data. And this is where privacy comes in.
Anne Trager (05:22)
Well, I think we've all had the experience, or heard the story, of doing a search for a pair of red shoes, and then all of a sudden on some social media platform you start seeing a bunch of red shoes.
So this is happening: information about what I'm doing is getting out there. And I have in some way agreed to it, but not necessarily knowingly or consciously.
I think it's something we maybe need to start thinking a little bit more about.
Fabrice Neuman (05:58)
Well, I think this is happening: a lot of people are talking more and more about privacy. And what you're describing is actually less and less true. If we go back a little bit in history, when we started to use the internet, our web browsers were not very good at keeping information from leaking from one site to the next. So you had cross-site linking and cross-site information leaking.

Fast forward to today, and the main web browsers, whether you're talking about Edge, Brave, Firefox, or Safari on the Apple platforms, actually lie about you, on your behalf. What you were referring to is all the information a website can gather to build a profile of who you are, so that you can be targeted by ads that might interest you (and sometimes it's not even very efficient, on top of that). Nowadays, you connect to a website on day one and the browser will give away your location, maybe the number of colors displayed on your screen, the websites you visited beforehand, and stuff like that. But then on day two you go back to the same website with the same browser, and the browser will say: this person is now in another location, and is not a man but a woman, or something like that. So companies getting a detailed profile of you is becoming a little harder, because lots of companies are trying to defend your privacy.

You might have noticed I did not mention Google Chrome, even though it's the most used browser in the world by a large margin. That's because Google is not, quote unquote, a tech company; it's an advertising company. And so they need information about you to sell ads.
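The behavior Fabrice describes, a privacy browser handing trackers a different made-up profile on each visit, can be sketched as a toy. The fields and value pools here are purely illustrative; they are not any real browser's anti-fingerprinting logic:

```python
import random

# Hypothetical pools of values a privacy-focused browser might rotate through.
LOCATIONS = ["Paris", "Lyon", "Berlin", "Madrid"]
COLOR_DEPTHS = [24, 30, 32]
GENDERS = ["man", "woman", "unspecified"]

def randomized_fingerprint() -> dict:
    # Each visit reports a freshly scrambled profile, so the day-two data
    # contradicts the day-one data and no stable profile can be built.
    return {
        "location": random.choice(LOCATIONS),
        "color_depth": random.choice(COLOR_DEPTHS),
        "gender": random.choice(GENDERS),
    }

day_one = randomized_fingerprint()
day_two = randomized_fingerprint()
# A tracker comparing the two visits gets inconsistent answers,
# which is exactly what makes profile-building harder.
```

The point of the sketch is only the design idea: a tracker correlating visits needs stable answers, and deliberately inconsistent ones break the correlation.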
Anne Trager (08:22)
And it raises a very good question, which is: what's the business model behind this? If it's free, then what are we giving up in order to have that free service? There's always a trade-off of something. So the question is: are we trading off our privacy or are we not? Are we trading off some secrets about ourselves? And I think this raises another question, about the difference between privacy and secrets. Thank you, by the way, for bringing up this evolution in the browsers, driven by the demand people have for privacy. This is something that's happened in the past few years, where these browsers are really trying to protect our secrecy... well, not our secrecy, our privacy.

But I really want to talk about secrets. Okay, I'll just out myself on this: I want to talk about secrets because I listened to a totally fascinating podcast last week by David Eagleman. His podcast is called Inner Cosmos, and the episode I listened to was called "Why Is It So Hard to Keep a Secret?" That's why I want to talk about it. Inner Cosmos talks about the brain and how our brain works, and in this particular episode he really dives into what a secret is in terms of brain science.

He describes the brain as a team of rivals: our neural networks are constantly competing for, I guess, a place in the sun. Okay? Each neural network is going to have a different goal and a different perspective, and our neural networks are fighting each other all the time for energy. And what a secret is, according to this description of the brain (and I hope I understood this correctly), is that one part of the brain wants to reveal something that another part of the brain wants to withhold. So you have this rivalry. It's not really a fight, it's a rivalry.
And if there weren't this rivalry, it wouldn't be a secret.
So what this means is that having a secret is not a passive thing. It takes energy. There's an internal conflict, a cognitive load, from having this rivalry going on inside: I want to say it, I don't want to say it, I want to say it, I don't want to say it. I just think that's really, really fascinating. And the result is that we as human beings, in general, have a really hard time keeping secrets. There's a ton of evidence for this all around us. Every time human beings have a secret, there's some way to let it out. Either we'll tell a friend or we'll tell...
Fabrice Neuman (11:16)
Mm-hmm.
Anne Trager (11:30)
...a priest. The whole notion of confession was baked into one of the major religions for a really long time. It's about telling our secrets because of this cognitive load. And as much as secrets can also provide a certain evolutionary advantage, you know, "I know where the bananas are," because primates keep secrets too, okay? The great apes. I think that's really interesting.
You know, I know where the bananas are and you don't, and whatever advantage that's going to confer. So that's why we keep secrets. And we know what's going on in the brain; you can measure this. You can see in fMRI studies what's actually happening, including what happens when we lie. And the other thing that's really interesting is that...
In kids, keeping secrets or telling lies (because if you're going to keep a secret, then you're going to lie about something) is a really important developmental stage. When it happens in young children, it's considered a sign of early intelligence, as they learn to understand that other people have other perspectives and can't actually read their minds.
Fabrice Neuman (12:30)
Mm-hmm.
Anne Trager (12:48)
So anyway, which leads me to think that if I want to tell lies to keep my secrets, then maybe I'm just being a three-year-old and I haven't grown up. Okay, anyway.
Fabrice Neuman (12:58)
Hahaha.
Yeah
Anne Trager (13:09)
But we all know that secrets play a crucial role in group dynamics: in fostering trust (what will I tell you or not), in creating alliances, and, as we said earlier, in maintaining competitive advantages. So we're really contradictory here. If we go back to what we were saying earlier about how we are with our privacy: we want our privacy.
And we also want all the services. So we want our secrets and we want to tell them.
Fabrice Neuman (13:45)
Well, I think what you're saying is that there are different layers, and that's the thing. I think we as a whole can still say that we think about privacy as a right: we have the right to protect it. And in order to protect our privacy, we need to protect the secrets that we are giving away to the companies whose services we use.
Anne Trager (14:09)
Hmm. Yeah.
Fabrice Neuman (14:12)
And this is how we get back to the privacy angle. But then you have different levels. Take the Netflix example: those are low-level secrets. Knowing what I watch is fine, especially if only the different streaming services I use know what I watch, instead of it being given to the world.

Then you have the example of searching the web for some green boots, and then everybody knows (everybody meaning all the other websites and potential advertisers), and so I'm bombarded with ads for green boots afterwards. That's more annoying.

And then there are secrets like my bank account access codes and such, which I want to keep to myself. This is where the question becomes: how do I make sure this stays within my reach, and my reach only, so I can keep my money safe, basically? That's another level of secrets, and this is where tech and the whole privacy thing come together, because we basically use the same ways of accessing all...
Anne Trager (15:17)
Yeah.
Fabrice Neuman (15:35)
...those pieces of information through the web. So how do we protect that? The base technique to protect our privacy online is encryption. And this is also why we wanted to talk about it: the bedrock of our privacy is encryption.
When we go to a website, you know, we have this HTTPS thing, because it's encrypted, so nobody can eavesdrop on what I'm looking at online. "Nobody" in quotes, obviously. When we do a bank transfer online, it's all encrypted. And more and more, we use encryption to ensure that our exchanges are protected from eavesdropping. We do that when we use different messaging apps, and even in what we're doing right now: as we record this podcast, we use what is basically video conferencing software, which is encrypted, so only we can see it, up until we publish it on YouTube and in podcast form so everybody can listen and watch, obviously.
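The core idea Fabrice describes, that an eavesdropper sees only scrambled bytes while key holders can recover the message, can be illustrated with a toy one-time-pad-style cipher. This is for illustration only; real HTTPS and messaging apps use vetted protocols such as TLS, not anything like this:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte.
    # Applying the same operation twice restores the original.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"our podcast recording"
# Shared secret known only to the two parties: one random byte per message byte.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(message, key)   # what an eavesdropper would see
plaintext = xor_cipher(ciphertext, key) # only key holders can undo it

assert plaintext == message
```

Without the key, the ciphertext is just noise; with it, decryption is trivial. That asymmetry is the whole point of the "bedrock" role encryption plays here.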
Anne Trager (16:40)
I know, right?

So this encryption allows us to have trust in the tools that we're using, trust being a very important aspect of whether or not we share secrets or our privacy. If we make this connection, two things come up: either we share because we trust somebody (we'll share with a friend, or with a trusted technology), or we'll share a secret...
Fabrice Neuman (16:55)
Yes.
Anne Trager (17:17)
...when nobody knows it's us or who we are. So anonymity and trust are two concepts that allow us as individuals to feel comfortable sharing our private secrets.
And it comes down to this: right now, if we simplify, we can trust this technology because of the encryption that's in place.
So there's been some news about Apple in the UK, related to the UK government using the Investigatory Powers Act. Can you tell us a little bit more about this?
Fabrice Neuman (17:52)
In this particular context, the law was used against Apple, to require Apple to provide access to encrypted iCloud backups through a backdoor. What it means is that when you use encryption, only you and the person you're talking to (whichever means of communication you use; I'm trying to simplify here) can read, listen to, or watch the exchange, right? It's called end-to-end encryption. And so the authorities cannot eavesdrop in any way, because even Apple doesn't have access to that information. That's a problem for the authorities if they want to tap into a conversation, for example to make sure there are no problems or, you know...
Anne Trager (18:19)
Mm-hmm.
Fabrice Neuman (18:39)
...actions taking place. So in order to explain that, let's go back to the beginning of the telephone. When we had phone conversations, it was very easy, technically, to tap into them. There was no encryption. So, within the law, the authorities could ask a judge to give them permission to listen to a conversation. Fast forward to today, and you can still...
Fabrice Neuman (19:06)
...ask a judge for authorization, but if you can't decrypt the conversations, you cannot tap in and listen. And so this is what the UK government is now requiring from Apple, and probably from other companies too, but we don't know yet, because when companies are asked to do this, they are not allowed to talk about it. That's one of the main provisions of the law.
Anne Trager (19:30)
Mm-hmm.
Fabrice Neuman (19:34)
So they required a backdoor, which means giving the authorities a way to listen to any encrypted conversation. It's concerning because when you create a backdoor of that sort, you might think it will only be used by the authorities, because the key is given to the authorities only. But when you create something like this (and technology has taught us this many, many times), if there is compromised security in a system, it will be found by somebody else and then used. Something like that is happening in the US as we speak, basically. It's called the Salt Typhoon hack.
Salt Typhoon is the name given by Microsoft researchers to a Chinese state-sponsored threat group. They have been conducting cyber espionage against U.S. telecommunication networks, AT&T, Verizon, T-Mobile, because they were able to find a backdoor. And so basically, the U.S. government is...
Anne Trager (20:33)
Mm-hmm.
Fabrice Neuman (20:53)
...telling people that if they want to keep their privacy, they should use other, encrypted services rather than the telephone networks, because those networks are now compromised. And that's a big problem. And so...
Anne Trager (21:04)
Hmm.
Fabrice Neuman (21:07)
...when you create a backdoor, you know for sure that other people will use it against your will. It will not be used only by the authorities. And that's the real problem, and what we wanted to talk about.
Anne Trager (21:23)
So I just want to be clear. The way I understood it, in the UK, what Apple chose to do is not to open up a backdoor, but to pull one of its advanced features, called Advanced Data Protection, which is the one I gather was targeted by this...
Fabrice Neuman (21:45)
Yeah.
Anne Trager (21:46)
And that doesn't apply to everybody. So you can still use your phone to make a call and not worry about your privacy, about somebody tapping in. Or can you? What's the deal?
Fabrice Neuman (21:58)
Let's back up a little bit. Nowadays, not everything you store within your Apple account (and this also applies to other types of accounts) is encrypted end-to-end. Because if it were, then if you as a user lose your password, you can't access your data anymore. That's not a very user-friendly environment. So in some ways, Apple still retains the keys to your data.
They can help you get back to your data when you need it, when you forget your password. Advanced Data Protection is a feature Apple introduced about two years ago, which you can enable on your device. When you do, Apple tells you several times: if you enable this feature, we will not be able to help you if you forget your password. So it's really totally encrypted. Which also means (and this is what the UK government, and actually other governments around the world, don't like) that governments are saying:
if that's true, then when we ask a judge for permission to demand information about a person from you, Apple, or any other company, you cannot give it to us, because you don't have it, right? And the answer is yes. And so the law the UK is using is a way to ask Apple: okay, so you have this Advanced Data Protection, but we need...
a way to get in. And the answer from Apple was to say: we don't want to provide that, so we will remove that feature from UK phones and UK Apple devices, or rather from devices used in the UK.
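The trade-off Fabrice lays out, standard iCloud encryption versus Advanced Data Protection, comes down to who holds a copy of the key. Here is a toy model of that distinction (purely illustrative; this is nothing like Apple's real key management):

```python
# Toy model: who can decrypt your data depends on who holds a copy of the key.
class CloudAccount:
    def __init__(self, end_to_end: bool):
        self.user_key = "user-secret-key"
        # Standard mode: the provider escrows a copy of the key,
        # so it can restore your access after a forgotten password.
        # End-to-end mode (ADP-style): only the user ever holds the key.
        self.provider_key = None if end_to_end else self.user_key

    def provider_can_recover(self) -> bool:
        # Also the measure of whether the provider could comply
        # with a demand to hand over readable data.
        return self.provider_key is not None

standard = CloudAccount(end_to_end=False)
adp = CloudAccount(end_to_end=True)

assert standard.provider_can_recover() is True   # provider can help you (or comply)
assert adp.provider_can_recover() is False       # nobody can, not even the provider
```

The same single property drives both consequences discussed in the episode: password recovery is possible exactly when a third party holds the key, and that is also exactly when a government demand can be satisfied.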
Anne Trager (23:49)
Okay, okay. Well, this brings me back to the idea of what we're exchanging our privacy for. In this case, I would really like Apple to be able to help me if I forget my password. I think forgetting passwords happens to everybody. Now, that's a whole other episode we could do, about this whole password thing and how you can live with your passwords.
Fabrice Neuman (24:04)
Yes. Yes.
Anne Trager (24:15)
And living well with passwords, I think, because what a mess. That said, I'm okay with Apple holding the keys to be able to help me get back into my phone if I forget my password. I am okay because I trust Apple, so we get back to this notion of trust, and because I know enough about Apple to be reassured that my data on my phone is encrypted as much as I need it to be. So it's a choice. And I think this whole discussion leads to the question: what can we do in everyday life?
Well, first of all, we can be aware of what's going on, and we can understand the choices we are making about the tech we're using and the secrets we're giving up. I don't want to say privacy compromises, because I'm not exactly sure they're compromises, and maybe they are. So, the privacy compromises we are making on a day-to-day basis. And it's not just about our phones.
How many of us are going into ChatGPT and saying, help me, you know, come up with the best possible schedule for whatever? And then you just give it all away. And, I don't know.
Fabrice Neuman (25:15)
Not me.
It's a whole other can of worms, obviously. The way to protect privacy is also to make sure we think about all the data we give away, without knowing it or somewhat willingly, and then forget we've given away. So it's really a question of being conscious of the choices we make. For example, you know (and I'm not including myself in this)...
We all use Snapchat and TikTok and stuff like that. So what are we giving up to use those services for free? There's this phrase we often hear: if it's free, you're the product. I do believe it's true; I just think it's incomplete, because it doesn't mean that if you pay, you're not the product. Back in the day, when you bought a magazine, you were paying for the magazine and there were still ads in it.
All the services we use online today that are free, we pay for. We always pay for something, with either money or our data. And we need to be conscious of that. And we all know that those user agreements we never read are very long and very complex. So I would say, maybe tongue in cheek, that AI can help with that. So download...
Anne Trager (26:51)
to understand them at least.
Fabrice Neuman (26:53)
Yeah, download the user agreement that you're about to "sign," quote unquote, by clicking OK, and ask the AI what exactly it means. That way, at least, you'll make a better decision about using that service, because you'll know that some of your data might be used. And sometimes maybe it will make you choose not to use that service, right?
Anne Trager (27:20)
I think this awareness is so important. I'm a coach, so I'm particularly attentive, first of all, to raising awareness, and also to this notion of confidentiality. In the coaching relationship, for example, confidentiality is an ethical foundation of the profession in which I work. And so, for example, I will never use AI in my coaching sessions, because I can't control the data flow.
And when I am in sessions through other partners and so forth, there are tools in place that enable anonymization and aggregation of data to ensure privacy. So let's be clear: just because your data is out there doesn't necessarily mean your privacy is being impinged upon. Which is why, if you really want to know what's happening, you have to go and read those agreements, because those agreements are there to tell you
what's actually happening: is it connected to your name or isn't it, and so on and so forth. These are the questions. The more we know, the more we can make conscious choices, or at least make the conscious choice not to know and see what happens. At some point, these tools are so pervasive, and we so want to have the benefits,
Fabrice Neuman (28:22)
Yeah.
Anne Trager (28:44)
that maybe we're moving beyond a privacy world. I don't know. What does post-privacy look like? I have no idea. I don't even know if that's something we really want. So the question, ultimately, if we raise it up a level, is: how can we perceive this notion of privacy differently? And for now, the only answer I've got is awareness and choice. Choosing the trade-offs that we're going to make, and choosing what we share based on context and trust level.
Fabrice Neuman (29:15)
Well, we'll end on this note, because I think it's food for thought. Our listeners will have a lot to think about, so tell us what you think about privacy. That will be interesting, and we'll for sure come back to this topic.
So that's it for episode 12. Thank you all for joining us. Visit humanpulsepodcast.com for links and past episodes.
Anne Trager (29:40)
Thank you also for subscribing and reviewing us wherever you listen to your podcasts; it really helps people find us. And please share this episode with one person around you, so that we can get the word out about what we're doing. And we will see you in two weeks.
Fabrice Neuman (29:57)
Bye everyone.