First Principles
October 8, 2020
Guests
Ann Cavoukian, Executive Director at Global Privacy & Security by Design Centre
Pierre Racz, CEO of Genetec
Description
In this, our maiden voyage episode of Engage: A Genetec Podcast, hosts Kelly Lawetz and David Chauvin turn the spotlight to the critical, often controversial topic of privacy with two thought leaders on the subject, author and former Information and Privacy Commissioner of Ontario Dr. Ann Cavoukian, and Pierre Racz, CEO of Genetec, to help us take the pulse on privacy and what’s at stake amid the global disruption brought on by COVID-19 and the changes resulting from the Black Lives Matter movement.
Transcript
DAVID CHAUVIN (HOST): Welcome to Engage, a Genetec podcast.
DAVID CHAUVIN: As consumers of technology, privacy is never far from our minds, and today, in the turmoil of a pandemic and large-scale social unrest, perhaps never more so.
"Cutting edge technologies, changing the way police fight crime."
"Nobody is listening to your telephone calls. That's not what this program is about. But I think it's important to recognize that you can't have 100 percent security and also then have 100 percent privacy."
"In my world, when it's, say, privacy versus public safety, I can assure you it's never privacy that wins, nor should it be. But what I reject is the proposition that privacy must suffer." – Ann Cavoukian
KELLY LAWETZ (HOST): In this, our maiden voyage episode of Engage, a Genetec podcast, we're turning the spotlight to the critical, often controversial topic of privacy, with two undisputed thought leaders on the subject: author and former Information and Privacy Commissioner of Ontario, Dr. Ann Cavoukian.
DAVID CHAUVIN: And Genetec CEO, Pierre Racz.
DAVID CHAUVIN: To help us take the pulse on privacy and what's at stake in the global disruption brought on by COVID-19,
KELLY LAWETZ: And the aftermath following the killing of George Floyd.
DAVID CHAUVIN: I'm David Chauvin.
KELLY LAWETZ: And I'm Kelly Lawetz.
Interview with Ann Cavoukian
KELLY LAWETZ: To kick off our first episode, I asked Dr. Cavoukian to introduce us to the privacy by design framework.
ANN CAVOUKIAN: There are seven foundational principles. The first one is essentially to be proactive: address privacy with protective measures upfront, before anything happens. Bake it into the code; bake it into your data infrastructure, so it becomes an essential component of what you're doing. The second, and perhaps the most crucial, principle is privacy as the default setting. The beauty of this is unique. Here's what generally happens without it: your customer is expected to read the legalese in privacy policies to find the opt-out box, saying don't use my information for anything other than the purpose I consented to. You know, life is short. Most people don't have the time to do that. But it doesn't mean they don't care deeply about privacy. They care enormously. For the past two years, all the public opinion polls, Pew Research, etc., have come in at the 90th percentile in terms of concern for privacy: 90 percent are very concerned about their privacy, and 92 percent are worried about the loss of control over their personal information. Now, privacy as the default swaps that on its head. It says to customers: you don't have to ask for privacy or search for an opt-out box. We give it to you automatically.
We are only permitted to use your information for the primary purpose that you consented to. Beyond that, we can't use it for any other purpose. If down the road a secondary use arises that we'd like to use it for, we come back to you and seek your positive consent. Privacy as the default is a game-changer. It builds trust like no other. Companies that have gotten certified for Privacy by Design have said it's incredible, the trusted business relationships they now have. When they go back to customers later on, the customers always say yes if a secondary use arises. They give their consent because they know the lengths the company goes to to protect their privacy and how much it respects it. It is a win-win game-changer, privacy as the default. Then we have the vital need for security. You know, people often approach security and privacy in a zero-sum manner: one versus the other, where a positive gain in one area always comes at a loss in the other. That either-or, win-lose, zero-sum model is so dated; just throw it out the window. What I always say is: you can have both. Why? The term privacy subsumes a much broader set of protections than security alone. If you don't have a strong foundation of security from end to end, with complete lifecycle protection, then in this day and age of daily hacks you're not going to have any privacy. So, make sure you have an excellent foundation of security, and encryption is huge. Make sure you build on encryption, visibility, and trust. I'm going out of order, but I'm just talking through them.
KELLY LAWETZ: Oh, it's OK.
ANN CAVOUKIAN: It's imperative that whatever information you collect from a customer, they have access to it. I always tell people: look, you may have custody and control over someone's data, but it doesn't belong to you. It belongs to them, to the data subjects. So, make sure you give them a right of access to their information, and make sure it's practical, etc. The beauty of this is that it's the customers who point out any mistakes. You're a business collecting thousands of pieces of information; you don't know what's accurate and what's not. The data subject knows what's correct and what's not. So, it again makes it easier for businesses to operate. It enhances the quality of the information they have and its accuracy. So, again, a win-win. I always say keep it user-centric. When you focus on the user, the customer, you will all gain. You show them the respect you have for their privacy.
Everybody wins, because once they understand that you respect their privacy, trust grows dramatically and the quality of information grows dramatically. And these people will not go anywhere. You preserve their loyalty, but it also attracts new opportunities: the friends, families, and colleagues of these individuals. So, you make it a win-win. Focus on privacy and shout it from the rooftops. If you're doing Privacy by Design, don't keep it to yourself. I always say tell your customers the lengths you go to to protect their privacy. Just shout it from the rooftops; tell your customers, and they will respect it enormously, and you will both gain. That's the only way everybody wins. You win; the customer wins; you both benefit. That's what positive-sum is all about.
KELLY LAWETZ: So, we've been through two new normals. 2001, that was a new normal. And now we're in a new-new normal that has forced many enterprises to adapt. They need to move fast, or they die. And when I was thinking about that and reading your principles, I wondered: does that false dichotomy still hold? Is it still win-win? Can you release technologies fast and try to bring some normalcy while maintaining privacy and security?
ANN CAVOUKIAN: Absolutely, because if you don't aim for a win-win, your win-lose will turn into a lose-lose over time. You have to aim for win-win, multiple positive gains. I'm not suggesting it's easy, but it's much better than the opposite. So, you talked about 9/11. I think it was my second term as privacy commissioner (I did three terms), and it was horrendous, of course. I mean, 9/11 was appalling. We got past 9/11, but we didn't get past the massive surveillance measures introduced after 9/11. In the United States, you had the Patriot Act, which has now just been renewed and renewed again, with warrantless surveillance. And you have the Department of Homeland Security and all kinds of very untransparent behavior, if you will. Now COVID-19 has enabled emergency powers that override privacy laws: where the law says you couldn't do X, Y, or Z, governments can still collect that data because of emergency powers. That's the kind of thing that's happening. And I understand why: these are public safety measures, people want to know who has what, etc. But there must be very firm sunset clauses with strong end dates. What cannot continue past the end date is that surveillance, because privacy forms the foundation of our freedom. You cannot have free and open societies without a solid foundation of privacy. So don't tell me, sorry, we just have to give it up; been there. We can't keep doing that. Yes, we can have both. If you want to live in a free society, you must preserve privacy. We can, and we will do this. And that's what I've been doing for the past two months: media interviews every single day, usually two or three a day. The media call all the time because they're very interested: what's the effect of this on privacy? And I say the impact does not have to be what you think it's going to be. That we're going to lose our privacy? Over my dead body. There's no way we're going to give up all of our freedom and our privacy. We can, and we must, do both, and I'm happy to work with governments. There are methods by which we can do this, but you have to start with: yes, we can have both privacy and public safety.
KELLY LAWETZ: In many ways, I think our getting through this depends on privacy and trust, because until we have the vaccine...
ANN CAVOUKIAN: Yes.
KELLY LAWETZ: We're probably going to have to use those contact tracing apps. Our workplaces are going to have to collect data on our health. They will have to share it, and we won't be willing to share. I know I won't be willing to share if I don't trust that there won't be repercussions for me, that it's done in good faith, and that it's done with me, the individual, in mind.
ANN CAVOUKIAN: There's so little trust. No one wants any of their information going to the government or their employer or their business or whatever. Now, these contact tracing apps, if they don't have the public's trust, no one is going to use them. People are saying you need 50 to 60 percent of the population using them to have any effect. I'm sure there are other apps out there that do track everything, in Singapore, in China. I mean, do we want to go to that kind of surveillance model? That's what they're doing.
KELLY LAWETZ: What could you tell organizations, practically, about how to weigh the risks of expediting technology without skipping the privacy and security aspects?
ANN CAVOUKIAN: Technology moves at such a fast pace, as you know.
KELLY LAWETZ: Exactly.
ANN CAVOUKIAN: The laws that we have in place... I mean, you create a new law and the technology is already ten paces ahead of it. So, the laws we make have to provide general guidance as opposed to specific rules saying you must do this and this. For example, the GDPR included my Privacy by Design framework. We don't have that here in Canada. Our federal Privacy Commissioner, Daniel Therrien, went to the government three years ago. He told them that we have to upgrade our federal private sector legislation, PIPEDA. It's dated compared to Europe, which has included my Privacy by Design. He told them we have to make these changes. Have they done anything? No. Despite the Commissioner pressing them, one can only hope, but we have to keep pushing them to upgrade the laws. But in terms of tech, there's so much we can do at a tech level. The big move now is towards decentralization. When you have data all centralized in one honeypot, it can be used for purposes that weren't the original intention, and hundreds of eyes can look at it that shouldn't. It's a nightmare. When it's decentralized, it goes under the control of the individual, the data subject. Privacy is all about control, personal control over the use and disclosure of your information. If you want to give away your information, be my guest, as long as you are the one deciding to do it. I always tell people, look, privacy is not a religion. Do whatever you want with your information as long as you are the one making the choice. And that's what decentralization supports. And now, increasingly, there are more and more technologies coming out that support that: self-sovereign identity, protected identifiers, there's so much going on. So, I'm very optimistic about the future, even though people might shake their heads and say she's crazy. No, you never stop fighting for privacy and freedom. You know, I tweet every morning between five and six a.m. the latest stories of the day. A lot of them come from Europe. I have a large following, and invariably, one or two people tweet back and say, lady, give it up. That ship has sailed; privacy is dead. I say, are you kidding me? Get another fricking ship. You don't just give up because it looks complicated and the odds are against it. You figure out how to do it, and you move forward. And that's what technology can give us.
KELLY LAWETZ: Foundational principles are so important.
ANN CAVOUKIAN: Yes, I couldn't agree more. They've got to form the foundation, literally, of your operations. When you bake it into the code, it becomes an essential part of what you're doing, seamlessly embedded throughout your operations. That's what we have to strive for. And we can do this. I've seen it done again and again. We have brilliant minds out there. We can do this. We have to keep our hope up. You can never give up. That's what I like to remind people whenever these bad things happen in the privacy world. Surveillance is on the rise, people's activities are being tracked, and everyone gets understandably very depressed that they're chipping away at our privacy again and again. Yes, they may be doing so, but it's temporary, and our job is to keep it brief. You do that by spreading the word on the importance of privacy and freedom. I mean, this is what changes our lives. When I look at what's happening in China and Singapore, with everything being tracked and no freedom, I wouldn't want to live in a world like that, ever. Freedom allows you to decide how you want to live your life. I'm a big fan of Steve Jobs, the co-founder of Apple. Every six months, he used to buy a new white Mercedes. He'd take his old white Mercedes back to the dealer, get a new one of the exact make and model, take it home, then bring that one back six months later. Why? Because at that time in California, you had up to six months to get a license plate on your new vehicle. He didn't want a license plate number. He didn't want to be tracked. He believed in privacy. Unfortunately, after he died, they changed the law.
KELLY LAWETZ: I won't be changing my Yaris every six months; the government knows who I am. But I want to take that same message you're giving to consumers and citizens. The same applies to business, right?
ANN CAVOUKIAN: For sure, and businesses can use it to their own good. I keep telling them this will give you a competitive advantage. It will develop trusted business relationships where presently there's such a trust deficit. So, this will work for you as well as for your customers. It's doubly enabling, win-win, and if we can just spread that message, that there will be considerable gains from using this, people are much more inclined to do so. I speak to lots of board directors, and the first time I go in, they usually have sour faces. They think I'm going to tell them not to use the information they have. I say, give me 10 minutes, let me tell you what Privacy by Design is about and how it will advance your business interests and give you a competitive advantage. And I walk them through it, and then everything changes. They're in there telling me what they do: how can I do it this way? Can I? You just have to get the mindset across to them that this will work to their advantage and their customers'.
KELLY LAWETZ: Doctor Ann Cavoukian, thank you so much for your time today.
ANN CAVOUKIAN: Oh, it was my pleasure. Thank you.
Interview with Pierre Racz
DAVID CHAUVIN: On May twenty-fifth, we all witnessed George Floyd's senseless murder.
"A march for George, for Brianna, for Amar, for Jacob, for Pamela Turner, for Michael Brown, for Trayvon and anybody else who lost their lives."
DAVID CHAUVIN: Floyd's death initially sparked a series of global protests focused on police brutality against people of color. We've watched as the movement evolved into a broader call to reexamine law enforcement practices and funding.
Today's guest is Pierre Racz, CEO of Genetec. Pierre, thank you for joining us today. How are you doing?
PIERRE RACZ: I'm doing great. Dave, nice to talk to you again.
DAVID CHAUVIN: Absolutely, always a pleasure. As a tech executive and a thought leader, how do you think technology can continue to help public safety, considering the distrust between many communities and law enforcement? How can we avoid the label of Big Brother or the label of surveillance? And how can technology help public safety in a constructive way for all the communities out there?
PIERRE RACZ: Well, first of all, you have heard me talk for a long time about the fact that we're not big brother, we're big mother, and there's a fundamental difference between your big brother and your big mother. Your big brother is jealous of you. When he tells you to go and fetch the ball that's in the road, he secretly wouldn't mind if a truck knocked you over. When your mother tells you not to go into the road, it's because she genuinely doesn't want harm to come to you. And so, in the way we go about crafting our technology and the way we seek out our customers, our customers are big mother, not big brother. They genuinely are looking out for you. So do not confuse what is happening now with something that is normal or that we should consider normal. I think we let some governance loops remain a little bit open, and now we're renegotiating those loops. That's perfectly fine in a liberal society, and we're trying to do our part in it. It is big mother, not big brother.
DAVID CHAUVIN: And do you think the industry, in general, has that same approach?
PIERRE RACZ: The industry too often neglects how its technology is used. We can make it easy to do the right thing and harder to do the wrong thing. Yes, I think that view is shared. If we go back to fundamental principles, none of our customers would wish to wind up like North Korea. For companies from liberal democracies, working in countries like China and Russia does not make strategic sense. Some Western companies work there opportunistically, but companies from liberal democracies that try to work in those areas are largely not active in those countries for that specific reason.
DAVID CHAUVIN: And where do you draw the line on having your technology used in ways that you either do not condone or did not anticipate? Do you think it's our responsibility as an industry, whether we're talking about citizens in a city-wide surveillance context or employees of a company in a private-sector context, to ensure that our customers can't breach people's privacy? Or is it on our customers to make sure they use the product appropriately? So where do you draw the line, and where does that accountability sit?
PIERRE RACZ: Well, technologies are morally neutral. A hammer manufacturer can't build a hammer that guarantees none of its customers are going to whack someone and cause them bodily harm. So, it's impossible to prevent people from repurposing the technology. Still, we can do things that make it less likely that our tools are misused. Unfortunately, we are rarely in a position to be aware when customers are abusing the tools. We're in touch with Foreign Affairs Canada and the State Department when dealing with customers outside of the G8. We have an open discussion with them to vet the customers and ensure that the technology is not used for purposes other than what they would want. Again, it's hard for us to know how it's used, but they have access to intelligence that we don't.
DAVID CHAUVIN: So hypothetically, Genetec learns that a system is deployed in a way that breaches fundamental human rights, that the technology is being used by a government with complete disregard for human rights. What power do you have to ensure that that system can no longer be used that way?
PIERRE RACZ: I'll take a step back and talk a bit about Henry Mintzberg. He described three classes of human organizational structures. The first is government, whose main job is the three aspects of security: military security, economic security, and social peace. Then you have companies, whose role in that plan is to create financial security: a company supplies the materials and products that society needs. The third class of organizations is communities. Their role is to comfort in times of joy, through entertainment and sports, and to console in times of grief, like during funerals. We do our job by making tools that society needs, and we make those tools in conformance with the social contract of the societies in which we operate. But oversight of the behavior of other governments is not the job of the private sector, because we are essentially powerless. If you just take the ratio between Genetec and China's GDP, it must be close to a billion.
DAVID CHAUVIN: So, it's more on the public sector than the private sector to ensure appropriate policies are put in place?
PIERRE RACZ: Well, negotiating the social contract is the job of the public sector, or not so much the public sector as the combination of the elected officials and the people who elect them. Right now, we're seeing that the people are insisting on the ends they want, and they want governance around the means of achieving those ends. Everybody does want social peace; very few people want anarchy in the streets. But what we are discussing is what means need to be employed to achieve that.
DAVID CHAUVIN: With that social contract, are you worried at all that, in many countries, elected officials aren't as technologically literate as we would like them to be? To quote Congressman Khanna: "it's embarrassing how technologically illiterate most members of Congress are." All we have to do is look at when Mark Zuckerberg testified in front of Congress two years ago. Some of the questions asked were surprising and showed that these elected officials don't necessarily get the technology. And that's normal in a way, because they don't come from the tech sector. They come from all sorts of walks of life, from different generations, from different regional contexts. So, we can't expect them all to be tech experts and privacy experts. But what can the private sector do, without engaging in a massive lobbying effort, to help educate our lawmakers and policymakers and ensure they understand the challenges that are out there regarding privacy and technology?
PIERRE RACZ: Well, first of all, one of the three pillars of liberal democracies is representative government. If our government has intelligent and less intelligent people, scientifically literate and less scientifically literate people, as long as it represents the population, that is perfectly normal. So, don't be too hard on the politicians; well, you should be hard on them, don't be too nice to them, but if they are representative of the population, that's a good sign. One of the responsibilities of the people who know how technology works is to help teach people. I don't know if you've read Alan Alda's book, which came out about three years ago: If I Understood You, Would I Have This Look on My Face? That book is illustrated very well by George Bernard Shaw's quote that the single biggest problem with communication is the illusion that it has taken place. As an engineer and a scientist, it is my job, among other things, to help explain the science behind the things that we're doing, because there is logic, there is a reason, behind the technology. And the more that people can understand it, the better they will be able to put it to use. Now, for the people in government who do understand, it's incumbent upon them to help make sure that people are as educated as possible. And ultimately, the responsibility for education does fall on the government. The private sector can do a little bit, but the means at our disposal are not the same as those of the government.
DAVID CHAUVIN: Earlier this week, Justin Trudeau mentioned that he wanted to push the concept of body-worn cameras with the RCMP, the national police. That again brings up the same issue, where you have politicians who want to push a technology, or push a new type of policy around that technology, without really understanding its potential problems, whether privacy-related or technological. We saw the same thing with the 5G debate. So, again, is it on industry thought leaders to take a position and advocate to the public sector about what they should be looking at and what they should be concerned about?
PIERRE RACZ: Well, I think that, first of all, those questions are not technical questions at all. They're societal questions, and maybe as educators what we have to do is give people mental tools to clarify their thinking. So, in terms of body-worn cameras, one cognitive tool that I find very useful is the three laws of robotics that Isaac Asimov wrote about in the 1940s. Just to remind people who have forgotten: the first law is that a robot should not harm a human or, through inaction, allow harm to come to a human. Law number two is that a robot should obey the orders given to it by a human, except where that would violate the first law. And law number three is that a robot should protect its own existence unless that would violate the first two laws. Now, Asimov's books explored the paradoxes that ensue from these laws. For example, if a robot encounters an evil person, can it kill them? He wrote a whole series of books around that. So, the same thing applies here. When we're talking about these body-worn cameras, we have to look at it from the perspective that a body-worn camera is a robot. It is not an anthropomorphic robot, as we mostly picture robots; it is a square box, but it is a robot. And as such, it should be the robot of the person who is wearing it. Otherwise, what you are doing is putting political minders on your law enforcement people. And to me, having an electronic political minder that I have to wear all the time smacks very much of North Korea. If we insist that it be done to law enforcement, they could insist that citizens also have to wear political minders. So, I think we should think about the bigger picture: what does it mean? How do we want to use robots? And the robots in question here are these body-worn cameras.
On the one hand, society could mandate that the people responsible for carrying around high-powered weapons, such as firearms, come with accountability for their actions, and we can use a body-worn camera to provide that accountability. But law enforcement officers are thinking human beings, and I want them to exercise judgment. Putting a political minder around their neck assumes that they cannot exercise judgment on their own. So, I think we should give them the right to turn it off, or else they won't do their job the way we would want them to. If they can't turn it off, they'll be unable to interview victims of sexual assault who do not want to be recorded. However, they need to be accountable for when they turn it off, so society can demand that if you turn it off, you have to explain why you turned it off. And if nothing happened during the period you turned it off, it's not a problem. But if something terrible happened while it was turned off, then you have just eroded the presumption of innocence that you might have had if you had left it on. So I think there are ways of using these devices, but we have to take a step back and think about what it means for our society. It's not a technical question. It's a societal question.
DAVID CHAUVIN: On a last note, you've mentioned before that you think about how our society will be judged by the society of two hundred years from now. How do you want to be remembered two hundred years from now?
PIERRE RACZ: No one is going to judge me personally in the twenty-third century, because it will be relatively irrelevant. And notwithstanding that we're a good bunch of people, Genetec is also going to be a tiny blip. But there could be many different types of futures. I am hopeful that the future we're going to have is the one that Gene Roddenberry envisioned, which is summed up by his quote: "In the future, there will be no hunger, there will be no greed, and all the children will know how to read." That last part means that people will know a lot of things. They'll have a lot of knowledge. Many of the social ills that we have today, particularly discrimination, are based on a lack of knowledge. People underestimate how much television has reduced that lack of knowledge, and discrimination, although still not good today, used to be worse.
Three hundred years from now, I am confident that if the world doesn't turn into the North Korean model, that will be the case. Discrimination will just be a bad idea from the past, something we hadn't yet figured out as we crawled out of the Stone Age. There is no place for that kind of behavior in the space age, for many reasons. One is that as we enter the space age, resources, especially energy resources, become more abundant. They're going to judge us on whether we navigated our way towards this hopeful future or took a turn towards a totalitarian dystopia. To quote the old Leigh Hunt poem "Abou Ben Adhem": I pray thee, then, write me as one that loves his fellow men.
DAVID CHAUVIN: And so, you believe that the focus on privacy and governance is a step in the right direction?
PIERRE RACZ: Yes, I believe that this invasion of privacy buys us very little and costs us very much.
DAVID CHAUVIN: Thank you very much for your time. This was an incredibly insightful conversation. Any last words?
PIERRE RACZ: I hope you invite me back.
DAVID CHAUVIN: Fantastic. I was speaking with Pierre Racz, CEO of Genetec. Thanks for listening, and we'll catch you on the next episode of Engage, a Genetec podcast.
Engage, a Genetec podcast, is produced by Bren Tully Walsh; the executive producer is Tracey Ades. Sound design is provided by Vladislav Pronin, with production assistance from Caroline Shaunessy. Engage, a Genetec podcast, is a production of Genetec Inc. The views expressed by the guests are not necessarily those of Genetec, its partners, or customers. For more episodes, visit our website at www.genetec.com, listen on your favorite podcasting app, or ask your smart speaker.