Data in the 4th Industrial Revolution

Brave Spaces Roundtable

Dédé Tetsubayashi
Hello, Debbie, also known as the Data Diva. It is wonderful to have you here with us today. I’m Dr. Dédé Tetsubayashi, CEO and founder of incluu, and this is another episode of the Brave Spaces Roundtable.

Debbie Reynolds
Oh, such a pleasure to be on your show, Doctor. I really appreciate the call, and I’m happy to get to know your audience.

Dédé Tetsubayashi
Thank you so much, and welcome. We are very excited to have you join us today, and we’re looking forward to today’s taping. So today I want to talk about data, but first a little introduction to who I am: I’m a social scientist and technologist who has built expertise in product equity, inclusion, and accessibility over the last 21 years. I am very excited to have you on the show today, and I would love to hear a little bit more about you, your work, and what brings you to data.

Debbie Reynolds
Sure, thank you so much. So I’m Debbie Reynolds; they call me the Data Diva. I’m the founder, CEO, and Chief Data Privacy Officer of Debbie Reynolds Consulting. I work globally at the intersection of privacy law and regulation and technology. I’m a technologist by trade, but I get involved in many different things as they relate to data and the rights of people. I work a lot with organizations that are doing digital transformation, moving into new data spaces, and I like to work with emerging technologies: things like robotics, satellites, the metaverse, all types of wacky things that people are doing. So I try to stay on top of all the stuff people try to do. I love technology, but I don’t love everything that people try to do with technology. That’s my thing.

Dédé Tetsubayashi
I hear you on that, all the wacky things. Please tell us a little bit more about what counts as wacky, and what’s especially important about what we’re trying to do with technology and data that brings you to this work.

Debbie Reynolds
Yeah. Let’s say you were in a study, like a medical study, and someone takes your picture, and you sign all these releases, all these forms, thinking, okay, it’s only for that. And then your image somehow ends up in a public data set that’s being used by AI to create other images, derivative images, on the internet, and you’re like, what happened to my data? What happened to my rights? And then in the US, because a lot of our laws are very consumer-based, if you’re not a consumer, the company says, well, we don’t know where we got the data from; you’re not our customer, so we don’t have any obligation to you. Those are some of the problematic things that we have. On the wacky side, in my view, people wanted to create real estate property in the metaverse, and then there’s the question of whether people have rights in the metaverse, in what people call immersive spaces. If you’re in the metaverse and something slaps you, you’re like, oh my God, what am I going to do? A lot of those issues just haven’t been sorted out on a legal front, so I like to work with companies on standards around that. And then I’m also very deeply into things like smart cities and IoT technologies: connected cars, traffic systems, almost anything you do where you’re walking past a sensor, or things you do with smart speakers. Those are things I enjoy working with.

Dédé Tetsubayashi
Fabulous, thank you so much for sharing. As we are entering, or maybe are now fully immersed in, the fourth industrial revolution, for us it’s really important to think about what it means to build inclusively, what it means to build for equitable outcomes, and how we actually think about what social impact and social justice look like at the intersection of product development, design, and so on. And you brought up, of course, AI and ML, facial recognition, the metaverse, and the question of what rights a person can actually have over their information, especially in a place like the metaverse, which is not a place, right? It is in the cloud, a virtual arena where we still have the ability to recognize who individuals are to a certain degree. I’d love to hear more about the ways your trajectory has changed to keep up with these kinds of technological and design advancements. For example, how do you stay on top of the needs and demands for data privacy and security as we face new and unforeseen changes? Any ideas about how to continue to include data privacy, security, and the protection of our information in these new environments?

Debbie Reynolds
Yeah, I try to take a human-centric approach, thinking of myself: what would I like? What would I think was reasonable? And then also, technology always outpaces the laws, and with emerging technologies, they’re getting farther and farther ahead of laws and regulations. So a lot of times I’m studying data systems, or things that are emerging in data, and this happens a lot: something I was talking about two years ago, there’s a lawsuit about it now, and I can say, okay, here’s a video I did about that two years ago. I’m way far ahead, looking to see what’s happening and then imagining what the legal implications could be, or what the human problems could be, because I feel like the future will not be like the past. Looking in the rearview mirror is not going to help us in these new spaces. And also, I think we need a new way to think about problems, because for the types of problems we’ll have in the future as a result of a lot of emerging technologies, there will be, in my opinion, no adequate legal redress. The harm to individuals can be catastrophic, and I don’t think a law can really help in that regard. I think regulation is good, but regulation isn’t the magic pill that can solve some of these issues. For example, let’s say someone is in a school situation, a person of color, on a Zoom call or something like that with a teacher. The software, because it can’t really identify the person, may say, hey, this person isn’t paying attention, and maybe that individual gets low marks; the teacher thinks, okay, this person isn’t keeping up with everyone else. So then, as a result of those kinds of AI judgments, they put that person on a slower track where they don’t get the same opportunities, they don’t get the same classes. They’re not on the Harvard track anymore; they’re maybe on a community college track. And what is the redress for that, for something that happened in the past that creates this sort of systemic level of inequality that just snowballs on itself? That could be devastating, and I don’t think a lawsuit is going to change that for someone.

Dédé Tetsubayashi
That’s a really interesting point you make: if a lawsuit isn’t going to change the situation, especially in addressing systemic injustices for someone, how do we then think about how to use history? How do we learn from history, and how do we use historical data points to think in a forward-looking manner and try to prevent harms before they happen, or mitigate them? What would you say would be a better approach?

Debbie Reynolds
I think it comes down to design, but not only design; also how people are using it, right? Let’s say a brick is used to build houses, but you can also take a brick and harm someone with it. I’m not saying outlaw bricks; I’m saying let’s figure out the best use for it, let’s think about the harm, and let’s try to find ways to build things into the system, whether that’s AI audits or privacy by design. That’s something I work with companies on: hey, this is a problem with this AI system, especially when you’re talking about high-stakes situations that can harm people. I don’t think AI systems should be used in situations where you’re going to say this person committed a murder because the AI system thinks they look like X. To me, that’s not the way you should be using AI; I think it should be used for lower-stakes things. And I also think that humans can’t abdicate their human judgment and their human responsibility to a machine or to AI. That’s not what it’s made for. It can never be as intelligent as a human; I don’t care who says what. I’m thinking of an example from a few months ago: two guys in a Tesla. You’re supposed to be in the driver’s seat when you use the self-driving mode, right? Those two guys got out of the driver’s seat, I think they were in the back seat of the car, and it ran into a tree and blew up. To me, that’s what people are trying to do with AI when they say, okay, AI is smart enough to think for me as a human. It isn’t. It has its own rules, but you have to be in the driver’s seat. Humans should make the final judgment, and we shouldn’t abdicate our human judgment and thought to a machine or to AI.

Dédé Tetsubayashi
I really like that example you gave in comparing where AI and ML technology is today. As you said, a lot of people do think it’s more advanced than a human, and I always advise caution, because it is something that we have created and invented. The way I usually describe it is that it’s like a two-year-old child: it still needs guidance, it needs guardrails, it needs to be taught how to make certain decisions. AI is not a true intelligence. It’s a mathematical process, right? It’s determining, out of the historical bunch of information you’ve given it, the probability that it can say this thing occurring within a large data set is, say, a cat. Can it actually drive you and your friend in a hands-free manner? At the moment, that’s not a feature of any self-driving car or autonomous vehicle, because there isn’t a true self-driving feature available. There could be in the future, but we’re nowhere near that type of maturity at this moment.

Debbie Reynolds
Absolutely. And I don’t think we ever should be, right? I saw there was a case where this guy got arrested because a facial recognition system said someone looked like him. And I saw the two pictures. There were some similarities in their facial features, and I don’t know, maybe people think all Black people look alike, but if you had eyeballs in your head, you could tell that these were two different people. Yet someone thought that match was accurate enough that they actually arrested this person. They went to the judge, and the judge looked at the evidence pictures and said, this is not the same person; why did you even bring this person here? You have no other corroborating evidence, just what an AI system spit out. And you know how it is with these databases. Let’s say you searched for something on Google and no result came up; you’d be like, well, Google sucks, right? So no matter what you search for, they’re going to give you a result, whether it’s good or not, because they want you to continue to use it. When you do a Google search and it says you got a million hits in three seconds, out of those million hits there are probably five or ten that actually fit; if you went a hundred pages in, you’d be like, why did this even come up? Because they want you to use the system. They’re taking those same systems and transferring them over to things like facial recognition databases, so they don’t want the facial recognition to come back with nothing. They’re going to throw something up there: the computer told us this is the guy, so we’re going to go arrest him, and they let the chips fall where they may. That’s the wrong way to use data, and that’s the wrong way to think about data systems. To me, that’s being digitally illiterate, in my opinion.

Dédé Tetsubayashi
Digitally illiterate, I like that. So, in your experience, what are some of the challenges that you face, or that you have faced, as a thought leader in data privacy and data security, and moreover, with intersecting identities, as a Black woman thought leader in the field?

Debbie Reynolds
Challenges, let’s see. No, that’s a good question; I’m not sure I have a good answer for it. I don’t know, I’m just doing my thing, you know what I’m saying? I’m running my own race; I’m not really getting in other people’s way. One thing that I think is very interesting is that I’ve been in data for well over 20 years, and a lot of times when people talk about privacy, they think of it with a legal hat on. When I’m on a list or a panel about privacy, I’m almost always the only tech person; the majority of people are lawyers, and a lot of lawyers don’t have 20-plus years of experience in data. So I think that’s very different. And then, having had a long career in corporate America, I was almost always the only woman or the only Black person or the only person of color in a lot of spaces. I visited a relative of my boyfriend, who’s Jewish, and she asked me, so how do you deal with being in all these spaces with all these white folks? And I hadn’t really thought about it, because that’s just the way it’s been. But I always reach out to people of color, and I want to see more diverse people, because my thing is, if you don’t have different perspectives, you can’t really get a rich solution. Let’s say I have ten people, and I give them all cameras and tell them to take pictures of the same thing. Everyone will have different pictures; they’re taking a picture of the same object, but they have different perspectives. If you have people who came from the same place, look the same, talk the same, have the same experiences, they’re going to have a very narrow point of view, especially when we’re dealing with human problems. All humans need to be involved; it can’t just be Ivy League white folks or something like that. We have human problems, and we need people from all walks of life, all types of industries. I very much advocate for having people in engineering who are interested in privacy, people like you who have a social science background and are in data. All data people, I think, should have some involvement, because this ultimately impacts all of us as humans.

Dédé Tetsubayashi
Agreed, agreed. And it’s interesting that you draw that distinction, because you’re right: when we’re in environments where we talk about privacy, a lot of people automatically assume legal privacy. They don’t assume information privacy or data privacy, or the connections between them. But for those of us who are in tech, we know how closely they go hand in hand. And we also know the constraints, the tensions, we can sometimes face when trying to work with teams that don’t have the representation of all the folks they’re trying to reach, or even that they should be trying to reach. If we don’t have that representation, how do we make sure that we’re building, or taking photos, from a holistic perspective, so that everyone can see all of the different angles and everything that should be represented in that image? One of the ways that I think is best to address that is thinking about the design phase, especially when it comes to product, and making sure there’s representation from all the different functional areas, in addition to the communities of people we’re trying to work with. Without that relationship, without that trust across functional areas, I can’t be a good product manager. If I don’t have a strong partner in user experience research, I can’t make sure I actually know what I’m building and that I’m building it for the right folks. I can’t build anything if I don’t have a strong engineering team, and an engineering team can’t build anything if they don’t know what the customer at the end of the day needs or wants. And privacy and data security are all important to them, right? So we need to make sure that data folks and legal folks, whether they’re data privacy focused or legal privacy focused, are all involved in some manner. So I really like that.

Debbie Reynolds
We need to break down those walls and those silos, because for the problems we have going forward, we’re going to need a multidisciplinary, diverse team of people to be able to solve human problems.

Dédé Tetsubayashi
Mm hmm. And if, in your more than 20 years of working in corporate America, you usually are the one and only, what does that say about what we’re thinking about when we’re trying to build for a future where we don’t know what is going to be happening societally? We need to be preparing for the eventualities, for example, that all our information might be available virtually. What does that mean? And how do we protect each other from giving away all of our information? Because we do that without realizing we’re giving away our information.

Debbie Reynolds
Oh, there’s so much there. It’s complicated, right? There’s psychological manipulation in the way these products are built; there’s a lot of psychology there. You want the cotton candy now, but you don’t want to hear that your teeth are going to fall out six months from now. They know that these technologies are addictive; they are designed to be addictive, to play on your psychology, on the way humans think, the reward system, almost like a slot machine. And it works, unfortunately. But I think there should be more transparency in what’s happening; I think people wouldn’t be as thrilled about some of the stuff that’s happening if they had more visibility into it. And then also, I think the value exchange is very asymmetrical right now. Okay, let me give you this free product, but I’m going to take all your data and create these profiles about you so that other people can discriminate against you. That doesn’t seem right; I wouldn’t give you my data if I knew you were going to do that. And then, to your point about diversity, what we have right now is a situation where people have blinders on. The story I tell is, let’s say you go to a grocery store and there’s a mat that opens the door for you. The person in front of you, let’s say a white male, steps on it, and the door opens. You or me, we step on the mat, and nothing happens. So what happened? The person in front of you says, well, there’s nothing wrong with the mat, because when I walked up, the door just opened. And you’re like, wait a minute, if it doesn’t open for both of us, there’s a problem; we need to have a timeout. That is the issue we’re facing, where not everyone is experiencing the same level of harm or the same level of problem. But when we have people who are trying to build things for the world, for people, they need to understand how that impacts everyone, not just them.

Dédé Tetsubayashi
Agreed. And if we’re continuing to build from the perspective of the people who do have access, who are able to step on the mat and be recognized, or put their hands under the soap dispenser or the sanitizer dispenser, and it actually does something and dispenses soap or alcohol...

Debbie Reynolds
But if you have darker-toned skin and you step on the mat and the door doesn’t open, or you don’t get any soap, where do you go? That’s the problem. And how do you fix it so that as many people as possible are able to actually trust that product? That’s right, that’s right: if it doesn’t work for everybody, you have to go back to the drawing board. I think governments deal with this issue a lot, because when you’re in a government, everybody’s your customer. As opposed to, let’s say, Apple: Apple really only cares about who their customers are; they don’t care about people who are not their customers. But in a governmental situation, let’s say Social Security, they have everybody’s data. If people can’t access their benefits, they have to find ways to connect with them, not only in digital ways: they send some things in the mail, they have accommodations for people who are hearing impaired, ADA-type accommodations on websites. So I think taking that as a model, and understanding that companies may have multiple different types of users, they need to think about how to get this product into the hands of as many people as possible, and really think about accessibility.

Dédé Tetsubayashi
And when you talk about accessibility, I like to always tie it to equity, because if we don’t center the folks who have the most difficult access to whatever product we’re building, we’re not actually reaching the people we need to reach the most. And we’re not reaching their support network, which is three to four to five times more people. When it comes to product development, we tend to build for that, quote unquote, 80%, that middle group in the bell curve, rather than the people who are on the edges. So my focus, and incluu’s focus, has been to showcase how to better design with the people with the most difficult access, so that you’re reaching as many people as possible and as diverse a group as possible, ensuring that you’re not only building for people, you’re also building for eventual profits, right? It has to work that way.

Debbie Reynolds
I agree. I agree wholeheartedly.

Dédé Tetsubayashi
Well, I would love to hear more about your thoughts on automation; you started to touch on them a little earlier. When it comes to automation, machine learning, and AI, what does their implementation and advancement mean for the future of data privacy and security? And if you’ve got some pros and cons for each, in your opinion, we’d love to hear those too.

Debbie Reynolds
Yeah, I think it’s about always having a human-centric focus. A lot of the tools and things that are built today are very corporate, company focused: okay, we want to do XYZ. It isn’t necessarily putting the person, the human, in the center. Let’s say you have the best product in the world; people now are getting more rights, so they have more of a stake, not only in the product or service you’re providing them, but also in how their data is handled. We’re not going to see a stop; we’re actually seeing exponential growth in the use of AI, machine learning, and automation. There are positive ways, obviously, that you can use them. But when I think of AI and machine learning, or these other digital systems, I think of a sword: it cuts both ways. A lot of times the focus is only on the good stuff, the positive things you could do. But when that sword swings the other way, what is the harm that happens? I’m all for the rah-rah about what good you can do with something, but I also want to say that companies really need to be looking at the potential harm. We’re seeing Europe especially really going after companies on AI accountability: what are you doing with AI? Can you explain what you’re doing? What do you think it’s supposed to do? What are the results? Where are you getting this data from? They’re asking a lot of questions, and I see jurisdictions around the world picking up on that and saying, hey, we want to be involved; we want to figure out what’s happening with data. And hopefully there will be more AI audits as well. So I think the idea is, definitely don’t be afraid to use automation or machine learning or AI, but think about it not only as a force multiplier but as a sword. I’d say people treat AI like it’s a teddy bear, but it’s actually a grizzly bear. If I put you in a room with a teddy bear, you’re like, oh, okay, great. If I put you in a room with a grizzly bear, you’re like, wait a minute, wait a minute, I need to think this through. As long as we think about it in those ways, we’ll put out better products that create less harm for people.

Dédé Tetsubayashi
Grizzly bear versus teddy bear, which would you choose?

Debbie Reynolds
You know, you gotta be real.

Dédé Tetsubayashi
It is the grizzly bear. Can’t be fake; you gotta be ready, right? You have to be prepared. You have to be prepared for what you’re doing. Yes. And you have to be prepared to actually have a mechanism to derail things if they start to go sideways; you need to figure out ways to bring the grizzly back to the center, or to the other side.

Debbie Reynolds
Yeah, I mean, we have to. We need to have tough conversations; people are going to be uncomfortable. It’s going to get heated, it’s going to get hot, but that’s what is required to be able to do this in a way that doesn’t harm people.

Dédé Tetsubayashi
And part of that means remembering that humans are the additional check that’s necessary. You can’t have complete automation, especially using systems that don’t fully understand what it means to be human. It can’t make the kinds of decisions or the kinds of ethical judgments that humans can make. It can only say A plus B equals C, or A plus B equals C with a 70% probability that C is really C. And it might not be C; it could be, well, maybe C is B, I don’t know, because you fed me such crappy data, and you want me to give you gold or diamonds, and I can’t really.

Debbie Reynolds
Yeah, so I had never seen RoboCop before. My boyfriend made me see it recently; he knows I love technology movies. It’s an old movie, but it has some very interesting points about automation and the proper place to use it. They have this big machine-gun robot, and if they mark a target, it can take that target out. They were doing a demonstration, and they marked this person, and they couldn’t undo it; they couldn’t stop it. So we’re seeing it shoot this guy up or whatever, and one of the people at hand, one of the programmers, says, oh, it just needs some tweaks. That’s the problem that we have. First of all, you built it, and then you can’t stop it from doing what it’s doing, because it was built to do that. But then you’re like, oh, well, we’re going to pretend like this bad thing didn’t happen; let’s tweak it a bit and keep on going, make this money. So yeah, that’s a problem.

Dédé Tetsubayashi
It is. And it makes me think of how we’re already using AI and automation systems in all areas of life. We’re using them in our legal system, as you were alluding to earlier, not effectively, obviously. We’re using them to determine whether someone is able to get credit cards or housing loans, or whether they get insurance to cover care, things that we really shouldn’t be using them for at the moment, until we truly understand how to interrogate how they’re making decisions. And we don’t yet have much understanding of how to do that, or at least we haven’t taught people how to do that.

Debbie Reynolds
Yeah, and what’s being forced is more transparency that hadn’t happened before. Companies that do credit ratings, like FICO scores and things like that, always thought, okay, this is our secret sauce, and we don’t have to tell you how we make these decisions. And these laws are saying, yes, you need to tell us. So I think companies that do that type of rating are probably shaking in their boots, because they’ve had a long time of not being transparent, and I think that’s going to change in the future. There was a case going on in Europe where a guy tried to apply for an energy account, you know, the light or gas company, and he was denied based on a score from some corporate credit agency in Europe. He filed a case, and when they investigated it, the only thing the agency knew about this person was his name, whether he was male or female, and his address. So the court was like, based on that, why would you deny him? You didn’t have anything about his credit history; what was it about your algorithm that said he was not creditworthy? I think there are going to be a lot more things like that: oh, well, we’re sorry that we discriminated against you; obviously there’s a problem in that algorithm, and it isn’t very transparent. And the court said the company can’t claim this is a secret-sauce type of thing, because it could create a real harm for this person down the road.

Dédé Tetsubayashi
I have a question for you on that, it just reminded me, connected to RoboCop: RoboCop versus Terminator, which do you think is the better representation of where we are, in terms of the teddy bear versus the grizzly bear?

Debbie Reynolds
I’ve never seen Terminator; I don’t watch every movie. Maybe my boyfriend will make me watch it. But I thought RoboCop had a message in there that people really need to think about. It is, again, about the problem of humans abdicating their judgment or responsibility to machines. That is the problem, and I think they lay out very clearly what that problem is, and we are still dealing with it today.

Dédé Tetsubayashi
I would recommend watching maybe the first two Terminators, maybe the first three, I don’t know; it’s the same story. The robots take over because they are making decisions for humans, and their ultimate answer to the question of who or what needs to happen in order to make the planet do better, what we can do to make things better in this world we live in, is: get rid of the humans, because they’re the problem. Whereas now we’re like, well, they’re not going to take over, right? Maybe they will. Who knows?

Debbie Reynolds
Yeah, well, they will if we jump into the passenger seat, right?

Dédé Tetsubayashi
And one more question: do you have any thoughts as to where we are headed? If we want to compare, for example, where Europe is right now in dealing with our unpreparedness for data privacy and data security versus the US, do you have hope, one way or the other, that the US is coming close to Europe? Is Europe in the lead, or is it neck and neck?

Debbie Reynolds
Yeah, Europe is at least 30 years ahead of where we are, and we will never be where they are in privacy; it’s just not going to happen. The reason I say that is because the US is very consumer-focused on privacy. A lot of our infrastructure around privacy, the laws that have been created at this point, is very consumer-focused, right? But not all humans are consumers, so there are gaps there. I’ll give you an example. Let’s say you’re a resident of the state of California. California, as you know, has the strongest consumer privacy regulation in the US right now, called the CCPA, soon to be followed by the CPRA. Let’s say you go to a grocery store and you’re part of their loyalty program; they have obligations to you about how they handle your data. But let’s say you walk across the street to a church, and they have data about you also; they don’t have to comply with that law, because a lot of those not-for-profit organizations, or all of them, are exempted from having to comply. And you’re like, well, what happened? I’m a human; you’re supposed to be protecting me and my rights. But you’re not a consumer there. That’s the gap we have that Europe doesn’t have, because their rights are human-based; they’re focused on the individual and what their rights should be. So until we really bridge that gap, I don’t think we’ll be anywhere close to where Europe is. We’re obviously trying to push more consumer rights and make them broader instead of sector-specific. We have laws around health, we have laws about children’s privacy online, we have stuff around finance. But there’s more data out there, and that’s why we have problems in the US with data brokers, people who buy data about people and sell it to other people. Because of the way the laws are built, you don’t even know which company has this person’s data, and you’re not their customer, quote unquote, so how can you file suit against them? You don’t know who they are. That is a problem; that’s a gap I hope we can bridge in the future. But it’s going to take time, many years, hopefully working on different levels of this problem to be able to fill those gaps.

Dédé Tetsubayashi
Wonderful. Thank you for sharing that. And lastly, for our listeners who are looking to learn more about what you do as the Data Diva, where might we send them to connect with you and learn more about this very important work?

Debbie Reynolds
Sure. Anyone can connect with me on LinkedIn; just type in Data Diva or Debbie Reynolds, and my name will pop right up. I also have a website, DebbieReynoldsConsulting.com, with a lot of videos and other resources, articles and things that people can look at if you want to get educated on this stuff. I do a lot of five-minute videos about different laws or different technologies, so if anything comes up around technology or privacy, there’s probably a video on my website you can look at for five minutes that tells you a bit about it.

Dédé Tetsubayashi
Thank you so much for sharing. It’s been wonderful having you today.

Debbie Reynolds
Well, thank you so much. You had great questions, and wow, I really love your show and the things you’re doing, because I feel like you’re illuminating something that really needs more focus, and that’s the data side of the equation, not just the legal side. I say data privacy is a data problem that has legal ramifications; it’s not a legal problem that has data ramifications. When people get that, they’ll understand how data is at the center of everything.

Dédé Tetsubayashi
Yes, it is at the center of everything. I love that. Thank you so much.

Debbie Reynolds
You’re welcome. Thank you so much for having me on the show.

Dédé Tetsubayashi
Please join us again soon.

Debbie Reynolds
I will, I will definitely.

Transcribed by https://otter.ai
