Attention Data Protection Professionals Who Want To Take Their Career To The Next Level
You’re about to discover the secrets to becoming a world-class Privacy Pro and thought leader!
Hi, my name is Jamal Ahmed and I’d like to invite you to listen to this special episode of the #1 ranked Data Privacy podcast.
In this value packed episode, you’ll discover what every Privacy Pro needs to know about the future of Data Privacy including:
- The latest trends in privacy that will shake up the industry
- How to keep up and stay ahead with regulatory changes
- How to enhance your career prospects and stand out as a world class Privacy Pro!
and so much more…
Jules is CEO of the Future of Privacy Forum, a catalyst for privacy leadership and scholarship, advancing principled data practices.
FPF is supported by more than 180 leading companies and foundations, as well as an advisory board comprised of the country’s leading privacy academics and advocates. FPF’s current projects focus on online data use, smart grid, mobile data, location, big data, apps, connected cars, wearable tech and student privacy.
Jules’ previous roles have included serving as Chief Privacy Officer at AOL and before that at DoubleClick, as Consumer Affairs Commissioner for New York City, as an elected New York State Legislator, as a congressional staffer, and as an attorney.
Jules has served on the boards of a number of privacy and consumer protection organizations including TRUSTe, the International Association of Privacy Professionals, and the Network Advertising Initiative. From 2011-2012, Jules served on the Department of Homeland Security Data Privacy and Integrity Advisory Committee.
Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/
Connect with Jules on LinkedIn: https://www.linkedin.com/in/julespolonetsky/
Check out the Future of Privacy Forum: https://fpf.org/
Get Exclusive Insights, Secret Expert Tips & Actionable Resources For A Thriving Privacy Career That We Only Share With Email Subscribers
Subscribe to the Privacy Pros Academy YouTube Channel
Join the Privacy Pros Academy Private Facebook Group for:
- Free LIVE Training
- Free Easy Peasy Data Privacy Guides
- Data Protection Updates and so much more
Apply to join here whilst it’s still free: https://www.facebook.com/groups/privacypro
Are you ready to know what you don't know about Privacy Pros? Then you're in the right place.

Intro:
Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts. The podcast to launch, progress and excel your career as a Privacy Pro.

Intro:
Hear about the latest news and developments in the world of privacy. Discover fascinating insights from leading global privacy professionals, and hear real stories and top tips from the people who have been where you want to get to.

Intro:
We're an official IAPP training partner. We've trained people in over 137 countries and counting. So whether you're thinking about starting a career in data privacy or you're an experienced professional, this is the podcast for you.

Jamilla:
Hi, everyone, and welcome to the Privacy Pros Academy podcast. My name is Jamilla, and I'm a data privacy analyst at Kazient Privacy Experts. With me today is my co-host, Jamal Ahmed, a Fellow of Information Privacy and CEO at Kazient Privacy Experts. He is a revered global privacy thought leader, world-class trainer, and published author for publications such as Thomson Reuters, The Independent and Euronews, as well as numerous industry publications. Welcome, Jamal.

Jamal:
Good morning, Jamilla. How are you?

Jamilla:
2:00 p.m. here. I'm good, how are you?

Jamal:
It's still a.m. for our special guest. And you know what? Our special guest needs no introduction today, but I'm going to let you introduce him anyway. I don't think any introduction we could do would do him justice. I just want to introduce him as the Godfather of Privacy to me. So over to you.

Jules:
Well, thank you. What an honour. I've admired the podcast in the past, and you folks do a really good job getting your speakers to engage and educate and inform, so I'll try to live up to that standard.

Jamilla:
I'm sure you will. As Jamal said, you needed no introduction. So as we always do, we start off with an icebreaker question on the podcast. What is your favourite thing that you bought this year?

Jules:
I paid for a wine training course, which during COVID was done remotely with an audience of aspiring wine experts around the world, and I had to study and take my test. I enjoyed learning the mysteries behind labelling and the different levels of wine. I just couldn't help thinking of how you could sit down and read the text of the GDPR or CCPA or some data protection statute and think you understand what's going on, right? You pick up a bottle of wine from Bordeaux or Burgundy. You don't know what kind of wine is in there. If you don't know, you're like, I'm drinking a Bordeaux. Then you start getting a little bit educated. You say, oh, well, wait a second. In the Bordeaux region there are areas that are primarily Merlot, and if it's on one side of the Gironde, then it's going to be a wine that's primarily Merlot with some Cabernet, but over here it's likely to be Cabernet. And here they name by the region, and here they name by the grape. You have to learn the lingo, but you have to also understand the rules and the laws and the culture and the issues. And as someone who's been doing data protection for 20 to 30 years, what frustrates me is that, look, it's important now, and I'm delighted that the world, right, not just us geeks or lawyers and so forth, are interested and obligated to have the basic skills. But what we need to do, those of us who have been doing this a bit longer, without being sort of haughty and arrogant, like, oh, who are you, newcomers? No. We need to help people understand that there are layers of complexity, and that if you simply read a text, you don't understand. Well, there's national law and there are other adjacent laws that actually exist, and then there's this country's implementation of the law, and then there's opinions. But then opinions are not necessarily binding either. There's opinions here and opinions there. Most of us read the latest, perhaps, EDPB opinions, say, the one on data transfers. And we've been doing this for 25 years, writing contracts, negotiating, advising, discussing, debating, doing comparative law. And we're like, wait a second. This group of very educated, expert folks who had a discussion amongst all the wise people have come to some conclusions that are incredibly surprising to people who have been working in the area for 30 years. Oh, you mean we don't even know what a transfer is? We've been doing international data flows, and we don't know what a transfer is. Wow. So the complexity and the layers. It's been exciting learning that I still need to do a lot of learning even after 30 years in the field. I feel like I'm just beginning.

Jamal:
It's fascinating what you just shared there. In addition to the wine, it's also about the learning that we do, and then sometimes we have to do the unlearning when a new regulation or new guidance comes out. And as you've said, you've spent 30-odd years doing all this stuff before all of this guidance comes out. And then we've got a bunch of academics who suddenly decide, hang on a minute, no one knows what this means; this is actually what that means. Sometimes it can be very frustrating, because it's not very pragmatic and it doesn't actually make much commercial sense. Given some of the things that we're seeing, how should Privacy Pros deal with and tackle that?

Jules:
Well, first of all, you have to make sure your head doesn't explode, right? I could start explaining to people that this conversation we're having right now, using technology that is crossing the Atlantic, is violating laws according to most DPAs. That it is impossible right now for most of us, unless we're using, I don't know, some sort of unique decentralised tool, to send an email, to have a phone call, to have a virtual Zoom or other conversation. Because those providers are in bulk transferring data that is subject to law enforcement surveillance access in many countries, none of which, other than the few who've got adequacy, are covered. So, hey, we're all criminals here, or we're facilitating the criminal behaviour of some large or even small multinationals, right? So when you start with that, how could it be that there is actually not a clear basis at this moment to communicate, or to use a service that is supporting communication for large numbers of people regularly across the Atlantic, or from Europe to, well, I don't know, maybe 100 other countries, and that it is technically, potentially a violation of the law, right? Well, no, that's not possible. Well, welcome to our world, where we live with shades of grey and risk and nuance and interpretation.

Jamal:
Thank you very much for sharing, Jules. You wrote an article entitled "College Students Don't Care About Privacy. Actually, They Do." Do you think more and more people are caring about their privacy rights?

Jules:
Well, I think that privacy has become the worst word to use for most of the important conversation, because for the average person who isn't thinking about this, what do they really think? They don't think about it in terms of, hey, I want to hide and be alone and live on a mountain. There are those people. But most of us want to be engaging, but we want to be engaging on our terms, right? We want to be communicating. We want to be sharing. We want to be very open with certain people close to us, a little bit less open with others. We want some integration with businesses that know us well enough to serve our needs, but we don't want them misusing data and being too intrusive. So I think everyone cares about the rights and freedoms of existing in society, right? I mean, the GDPR is not about privacy. It's about using data protection rules to structure our relationships and the balance between the different powers and authorities and government and companies in society. So of course we all care about the fact that we want what we share, an email, a communication, a provocative joke, something off colour, something sensitive, to be shared with the audiences that we intend, and that the tools that we use should respect and reflect that. We all want to be treated properly when it comes to how we're marketed to, and we have different views over, I don't know, how we want to balance what big companies do. So privacy is now far bigger. It's become a key intermediator for many of the core rights and freedoms, on a person-to-person basis, a person-to-company basis, and in the relationship with government. So when we say we don't care about privacy, we kind of sometimes mean, hey, there are people who maybe used to not share information because there wasn't some easy way, or they weren't open about it, and now they're putting it on Twitter and Instagram and so forth, right? That doesn't mean they don't care about privacy. They still don't want data misused.
They still care about maybe not discussing their sexual interests, not discussing their salaries. But yeah, they're going to joke on Twitter about things. And in the past, maybe they never shared that because there wasn't a convenient or easy way. So everybody cares; we just don't ask the question properly. Right. Plenty of people say, oh, I don't care about privacy, I'm all out there. Well, okay, you are a white man at a senior point in your career, with a comfortable income, with a lot of security. So yeah, big shot, you don't care about privacy, because guess what? No one's going to fire you. You're the boss. Well, I'm a more junior person. I come from a background where there's been discrimination and so on. I'm at a point in my career where I don't want to be misjudged and have my opportunities limited. So everybody is at a different place, and everyone has a different set of priorities. And I think we want law to be flexible enough to give us that zone that allows us to be comfortable in society. And you know what, in the US, and I'd be interested in your opinion as to how much this is changing the dynamic in the US, the regulators, the FTC, the White House, they care about power and about equity and about civil rights. And it turns out that you actually need privacy and data protection law to discuss these things. Right. Because it's discrimination based on data, it's AI, it's regulating tech. So they're not talking about privacy a lot. They're holding listening sessions now, from the Department of Commerce, with what we think will lead to privacy legislation around things like discrimination and about fairness and about equity. And it's actually very much about data protection. Right. The FTC met with antitrust competition authority leads from the EU this week, and they were talking all about competition and power and companies, and what are they worried about? The data that those companies have.
So privacy has become a term that means anything and everything to everybody and anybody, and it's almost not helpful anymore, I think, to be talking about it as a catch-all.

Jamal:
I definitely agree with what you said there about it being a catch-all, and people actually forgetting what we mean when we're talking about privacy. Some people will be like, no, I don't care about it, but they're not actually understanding the fabric of what we're talking about here and how it touches upon all of those other things that you've mentioned there as well.

Jules:
Yes, of course. There's the Article 8 perspective, the respect for private life. I'm trying to talk globally and use privacy the way a range of folks use it. I often try to think about it in three ways. One is, yes, what is this respect for the private life that has a very high level of protection? Reading my emails and so on and so forth. Right. And then let's put that aside for a second, and let's talk about the business that we're in. Right. Data protection rules that balance don't guarantee me privacy. They guarantee me that there will be a balance, that there will be an assessment. Yes, the government may have to have information about COVID. Yes, other people may need to be informed, but it's going to come with rules, restrictions, transparency, limitations that help us order society. I may or may not have, quote unquote, privacy at the end, but I have a system to set the right rules in place that structure this balancing of freedoms. Right. And then, completely separately, let's talk about the philosophy of privacy, perhaps best set forward by Professor Helen Nissenbaum, a prominent philosopher at NYU for many years, now at Cornell Tech, who really uses things like context to define whether we should be thinking about something as privacy. And I find it very helpful. Right. Let's just think about our naked bodies. Obviously privacy, of course, if I'm going to be embarrassed or seen. If I'm in a room alone trying on some clothing, I don't think privacy; I'm alone, it's not even an issue. If I'm with a loved one, again, I say, okay, well, now there's somebody else here, but I'm perhaps very happy that we are naked together, and it's appropriate. There's no privacy invasion when we're together. Now you take a picture and you do something with it afterwards. Well, now you've violated the terms that we had in place, because you've taken what I shared and used it out of context. Right. Now I'm at the beach, and I'm wearing a bathing suit.
And again, depending on my culture and so forth, I may have no privacy concerns. This is where I'm expected to be, in this garment. Or I'm in the doctor's office; again, the context is appropriate. Or I'm naked out in the street because I was just tossed out for whatever misbehaviour. Well, okay, now I'm exposed in a way that is a real violation. They're all the same situation, but because the context is different, there's a dramatically different, quote unquote, privacy. Being naked can't be the judgment, and that's where data protection sometimes has a hard time, because it says naked, therefore privacy. Oh, no, wait a second. There are many contexts. So I think there's the philosophy of how someone feels about such things; there's the Article 8 framing, the respect for private life, and how that might give different legal protections to certain things that invade my privacy; and then the tools of our trade, the data protection rules that manage the rights and freedoms and set the balancing and so forth, which hopefully sometimes overlap with privacy, but frankly don't always.

Jamilla:
Yeah, I think that was a good analogy. Can you tell us more about your role at the Future of Privacy Forum and the work that you do there?

Jules:
Twelve years ago, when I was planning to move on from AOL, I thought there was a gap in the data protection landscape. I had been a chief privacy officer very early on, when there was just a handful of us. I was among the group that founded the IAPP, and when we started, it really was 30, 50, 60 of us who were regularly engaged. And certainly by the end of my chief privacy officer days, there were thousands and thousands of folks around the world, and there really wasn't a good place for those folks to, shall I say, do their business. There were trade groups whose role was defending industry interests; they weren't typically deeply involved in data protection other than lobbying against legislative efforts. There were the things that were on the plate of many of my peers, who weren't just in the private sector; they were at universities, they were at schools, they were researchers, they were senior people dealing with complicated issues of data protection. Right. Oh, I'm in the automotive sector, and now we're starting to use data in some novel ways. I'd like to talk to experts and to each other and learn and figure things out, and maybe I've got some good ideas of what the right practices are; how do I perhaps make those more visible in my industry, to regulators and so on and so forth? And regulators, increasingly, as well, were finding it frustrating to really get good information. The trade groups gave them certain limited information. Companies were sort of on guard. Academics are incredible and valuable, but aren't always the ones under the hood seeing the way the data flows actually work, or don't have the time or interest to go in one particular direction if it's not their research direction. So when I started the Future of Privacy Forum 13 years ago, my goal was to create a place in the centre of the data protection debates, working with the senior executives at companies.
We've got about 210 as of today. But also, we've got 15 people on our team who work with schools and universities around the world on data protection issues, right? How do they use ed tech for remote learning? How do they handle, and the issue in some countries now is, scanning the communications of students who might be talking about suicide or might be talking about bringing a gun? How do you deal with technologies like that? Do you use them? Do you not use them? How do you use them effectively? What about all the researchers who want access to the data of platforms, or frankly of other companies, where they can study the effects of these platforms on individuals? How do you make that data available without creating a privacy concern, while ensuring that we can assess what these platforms are doing? So my goal is, can I create a place in the centre that works with all, that is very diverse in funding across many, many different sectors, that doesn't solely rely on corporate funding? So that we've got the independence. We've got significant amounts of foundation funding, National Science Foundation funding, the US agency that funds much of the academic research that is done. So that's what we do. We've got about 50 staff, with offices in Singapore, Brussels and Tel Aviv, and we run working groups on the issues that are, I think, driving a lot of interest, where people want help, want discussion, want education. We do training like you guys do, except we don't do the data protection training; there are a lot of good programs, the program you provide, and others. What we provide is: how does it work? If you don't understand the data flows in a real, deep way, good luck trying to provide useful advice. So we do lots of "how does ed tech actually work" so that you can then assess. It's hard to get that training unless you're really working at an organisation or working in the sector.
Unless you live and breathe that sector, you don't truly understand what's collected, where it goes, how it works. So we do that "data flows for data protection experts" type of training. We try to help educate policymakers as well, so they can understand what's going on and make really smart decisions. And we care about the opinions of NGOs and advocates. We bring them into the tent, because they've got an important say, and they should be telling us what they think, and we ought to be integrating as much as we can of their issues and concerns. So we do best practices, codes of conduct. We do a lot of peer-to-peer meetings, education, engagement, symposiums: AI, de-identification, self-driving cars, ed tech, you name the fun issue. We typically try to pull together the leaders in that space and say, hey, what needs to be done here? What can we figure out?

Jamal:
What are some of the trends we can see going into the coming year?

Jules:
Well, we obviously see regulation accelerating around the world. In Europe, you're on your now second and third wave, because we're calling it AI regulation, but a lot of it is data protection related in many ways, and we call it the Data Act and data governance. So it's wrapping up many of the issues that weren't fully covered. Of course, the challenge is, some of them kind of were covered. Well, if our concern is that we don't like profiling or we don't like discrimination, don't we actually already have guidelines that are built into the GDPR? Why are we inventing new stuff instead of building the interpretations we want? So I'm a little bit frustrated that, because it's called AI, it's treated as a whole new game for data protection. Well, where did we fall short? And if so, what do we need to do to extend, instead of coming up with a whole, completely parallel system that needs to integrate well? But obviously the next wave, I'd say, is the follow-on to enhance and fix: obviously the platform focus, which is the driver of a lot of the activity. I think what's not always appreciated is that there's a lot of the world that integrates with those platforms, and it's not super easy to say, hey, these are rules for Amazon and Facebook, without saying, oh, okay, you've just told everyone who has an app in the world how their business is affected, or everyone else who advertises and so forth. So in general, platform driven: how do we add obligations to deal with the power of large tech? In the US we're going to see lots and lots of state activity, and then again a bit more of a focus on discrimination and targeted activity. I'm hoping we'll see some real sophistication in the PETs, because some of the technologies have been effective for privacy but have limited the utility of the data, which makes a PET not as interesting when you don't then get to do some of the basic functions that you need.
Obviously the PETs of highest interest to me are the ones that allow you to have some of the important activities that we want to see while providing really strong mathematical protection. So we're seeing great advances in multi-party computation, homomorphic encryption, synthetic data, differential privacy techniques. We're seeing a lot more on-device processing, and we can debate whether that's going to give you more or less privacy. But certainly many actors have decided, well, if the data is not shared, because it's on your device, your phone, your computer, your chip, well, then it's not shared and sold. Now, of course, you can still do lots of nefarious activity with targeting and so on and so forth. So we haven't solved the problem, but there's lots more of: how do you keep the data in your cave, in your basement, on your server, on your phone, but still conduct activity? So that's going to happen. Let's talk metaverse for a second. And I don't want to call it metaverse, because then it becomes all about what Facebook is doing, or whether we are going to be wearing things strapped to our heads. But the reality is, who said that the Internet is locked into a computer on my desktop, or a laptop, or a mobile device? Those are all incredibly useful, but each time it became more mobile, it opened up new areas, right? The fact that my computer is now on a phone and has all of these additional sensors obviously created all kinds of privacy issues, but also created all these incredible capabilities, because now the data was with me on the beach, or while I was hiking to take a picture. But again, this is still clunky. I still have to look down if I'm walking. I still can't do it while I'm driving. I still have to tap in certain ways that don't work as well for everybody, right? So we're going to see advances in immersive technologies, where access to computing and data and sensors will be off this keyboard and off this little device.
Now, what that looks like, whether it's as extreme as strapping the Oculus on your head, or whether it's Roblox, Fortnite and some of the things you see in gaming, or something in between. I mean, look at us now. We're doing this over a video call, and we're in these little boxes. Isn't there something better that's going to be a little more immersive when we're having these conversations? And look, this is better than what happened two years ago, right? When we had, like, a WebEx and we couldn't even do video, and then we had video and we couldn't unmute ourselves. Now we're using the technology. We can have a chat. People are starting to use the chat a lot during a lot of the conferences that we've been running; the more interesting activity is happening with the people on the side debating and discussing and the panelists sort of interacting. So can't this get a little bit better? And I don't know if it's avatars, and I don't know if it's maybe popping up the screen to be a little more AR. I've learned in 30 years of doing this that I can't predict which products will be a success. I thought we'd no longer have business cards, that we'd be bumping our phones to exchange contacts. It's technically possible, but when is the last time you bumped your phone as a way of linking in with somebody or exchanging a contact? Right? No. Here's my phone. Type in your email. Right. So I don't understand why certain things work and why certain things don't work. But at the end of the day, it's clear to me that both consumers and companies are going to move outside the box, and we are going to be engaging in immersive technologies, and that's going to create a lot of work for all of us in data protection, because now we'll be debating identity a lot more, right? My identity will be relevant to how much I charge my vehicle. It will be relevant to what identity I use in the metaverse. It will be information I want to take with me across different services.
So biometrics, identity, the role of advertising in all those new spaces, right? Is it going to be in the metaverse? Is it going to be offline? Is it going to be on the billboard? That's, I think, what's going to keep many of us busy. We're talking AI today. Prediction: we won't be talking AI in a couple of years. Just like we don't talk about, hey, electricity, look at all these things electricity does, right? Electricity powers all these things. Now we expect that there will be electricity, right? Yeah, we're worried about, can we have it in batteries, and can it be more portable, and why do I have so many problems recharging, and all that, right? And that will be solved as well. These things get better and better. And so the expectation is that machine learning develops and advances, and that there are certain things that are interactive, that when we ask, we get the right information via voice. Those are going to be part of the fabric of what we expect. And it won't be, oh my God, did you realise you can speak to that thing? Oh my God, of course you're going to be able to speak to it. Why should you only have to type? But these will all create sensitive data issues, and it's going to be up to all of us to make sure that we learn from the lessons of the past, so we don't have a mess like we have today with the ed tech ecosystem spilling data everywhere. How do we have economic models that do more for individual autonomy? Maybe that's a good closing thought as well. That's, I think, where we end up going in the future. We will be doing a lot of remote stuff, even though I hope we are back in person. But we will be looping in the people who are international, the people who have disabilities, the people who just can't travel, the people who have a baby at home. And I hope that we'll not be isolating those people to a box on the screen while we all have the drink and the glass of wine around the table.
So if we're able to be inclusive, and use technology at the end of the day for the benefit of humans in society, which is, I think, what we all want. We just have to wrestle companies sometimes, and sometimes government, into their appropriate box.

Jamal:
Thank you so much, Jules.

Jamilla:
Whenever anyone says metaverse, I go back to the podcast we did with Avishai, talking about the multiverse and Marvel, and that's where my mind jumps to. It's interesting you were talking about PETs. I was reading this morning that the UK and the US have partnered to award prizes to people who come up with good PETs, I guess. So I thought that was quite interesting.

Jules:
Yes, that's the grand challenge that I referred to, so hopefully we'll see some great things coming out of that. All right.

Jamal:
Awesome. Thank you so much for making the time to come and talk to us on the Privacy Pros Academy. It's an absolute pleasure having you.

Jules:
Wonderful. There's lots and lots at fpf.org. And, of course, thank you. Thanks for having me, Jamal and Jamilla. Have a great rest of the day.

Outro:
And we'll put those links below. If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released. Remember to join the Privacy Pros Academy Facebook group, where we answer your questions. Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world-class Privacy Pro. Please leave us a four or five star review. And if you'd like to appear on a future episode of our podcast, or have a suggestion for a topic you'd like to hear more about, please send an email to email@example.com. Until next time, peace be with you. Bye.