Baking Privacy Into Product Design

This Soapbox has come and gone!

  • Michelle Dennedy, VP and CPO, McAfee

We hope to see you at the next Soapbox. Read on for interesting tidbits about the event plus the podcast!

About Michelle Dennedy

Michelle Dennedy, Chief Privacy Officer and VP at McAfee, was a hoot at her Soapbox. She took a weighty subject, privacy and digital products, and made it entertaining and enlightening. The audience was in stitches throughout the event.

But while our chat with Michelle was entertaining, it was also very informative. She told us what companies get right and how they get things so wrong when it comes to privacy, which she called a “human right,” and why companies need to think in more people-centric terms to protect the privacy of their customers. She also tackled why Facebook and Google are "poster children" for problems in privacy and big data.

Feel free to listen to the podcast as you read through the summary of the event below.

Listen to Michelle’s Podcast


Subscribe:
iTunes RSS

Privacy is Dead, Long Live Privacy

On Sept. 12, 2001, the day after the world changed, Michelle made a promise to protect her then-newborn daughter. That moment framed Michelle's work as a champion for better privacy controls.

It was a moment for all of us.

Like all of us, Michelle has a story about how 9/11 affected her. She shared that her mom was almost on Flight 93 but postponed her trip because Michelle's daughter was born late.

She is very much alive, nagging me happily from New Jersey.

Those days were scary for the nation and the world. At the time, Michelle worked for Sun Microsystems. Her boss, Scott McNealy, along with Oracle's Larry Ellison, had declared that privacy was dead in the days after 9/11, said Michelle. There was a frenzy to figure out how everyone would deal with privacy in this new era. So she and others got together and formed a privacy plan. What they came up with still guides how Michelle approaches privacy today, even in her current role at McAfee. And that is:

Not just write policies, but to build fabric, and build culture and build passion, and build architecture and engineering behind human rights.

She said there's a misconception that corporations can't participate in that — that somehow they can't see privacy as a human issue.

There is this agreement that corporate can't meet human rights, and I think that's bullsh*t and just needs to be taken off the table.

Whenever There's a “Who,” There's a Privacy Issue

Michelle was there for the early days of the Cloud. She said that when we went from timesharing to networking to the Cloud, security and privacy were always going to be an Achilles' heel. And when it comes to privacy, there's always a "who." Who owns the data, who wants to steal it, and who it comes from.

Whenever you have a "who," you have a privacy issue.

It's not as simple as pouring that information into a bucket of storage, she said. That information has to travel from who to who. And if you haven't figured out the who, then you've got a problem. Tabling that doesn't solve anything and creates more problems than solutions.

As long as we think of privacy as something that we sometimes get to have or that it's strictly seclusion or hiding away some dirty little secret, you're never going to code to that, you're never going to plan for that.
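That idea of "coding to" privacy is concrete enough to sketch. Here is a minimal, hypothetical Python illustration (our own sketch, not anything from McAfee's catalog or Michelle's talk) of what treating the "who" as a first-class requirement could look like: every record names its subject, its owner, and its authorized recipients, so a transfer cannot be written without answering the who question first.

    from dataclasses import dataclass

    # Hypothetical sketch only. The point: every record carries its "who,"
    # so moving data forces the privacy question instead of tabling it.

    @dataclass(frozen=True)
    class DataRecord:
        payload: str
        subject: str          # who the data is about
        owner: str            # who currently holds it
        authorized: frozenset # who may receive it

    def transfer(record: DataRecord, recipient: str) -> DataRecord:
        """Move data from one "who" to another, refusing unauthorized hops."""
        if recipient not in record.authorized:
            raise PermissionError(
                f"{recipient} is not authorized for data about {record.subject}"
            )
        return DataRecord(record.payload, record.subject, recipient, record.authorized)

    rec = DataRecord("order history", subject="alice", owner="shop",
                     authorized=frozenset({"shop", "billing"}))
    rec = transfer(rec, "billing")  # fine: billing is an authorized "who"
    # transfer(rec, "ad-network")   # would raise PermissionError, loudly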

She used the medical field as an analogy. We're very particular about our medical care, yet we don't "have that same relationship with data," Michelle said.

A lot of folks, on both sides, are fighting for a comprehensive law to make privacy issues easier to deal with. That's because a patchwork of different, fragmented laws makes it difficult to come up with a solid privacy policy or terms and conditions.


Poster Children for Privacy Problems

Michelle once said that companies like Facebook and Google are "poster children" for privacy issues. But she also sees that as an opportunity — privacy can and should be baked into our products.

She pointed to Google in its early days. In fact, Google incorporated only so it could cash its first check. At the time, Michelle said, the company had two issues: privacy and intellectual property. But its founders viewed the world as one where information wanted to be free.

And I think it was out of innocence and naïveté rather than evil. The thought that information wants to be free neglects 3,000 years of history of people who want to own stuff and want to have control over their reputation.

But the problem is that companies treat privacy as an afterthought, as do people who just click "I agree" on a product's terms and conditions. Companies need to present their privacy policies in a digestible way because studies show people don't read them, said Michelle. She gave the example of McAfee, which uses a comic-book style to explain its privacy policy.

Lawyers Don’t Have a Magic Elixir

Companies treat lawyers as this other thing that has to be brought in, usually at the end of a project. It's kind of like:

Bob, Ryan … and legal. Make sure legal is there.

If companies want solid policies that protect their customers, then the lawyers need to be brought in sooner. There's an attitude that lawyers have a magic elixir that can make bad ideas legal. They don't, said Michelle.

As for startups that might consider copying and pasting another company's terms and conditions to save money on an attorney, Michelle said don't. Spend the money; it'll be well spent in the long run.

Our conversation with Michelle continued as she took audience questions. We even talked about who invented the "off" switch! We'd like to thank Michelle and everyone who attended!

Don't Miss Out on Our Next Soapbox

Transcript

Ryan: How is everyone doing today? Can you hear me? Is that good? All right.

We're going to get started soon. Welcome to our third ZURB Soapbox in our new headquarters. So I want to thank you all for coming out and bearing with the bits of traffic outside. I know it's a little difficult to get around here, but I appreciate you making it out.

We do have a hashtag for those of you that are tweeting live, #ZURBsoapbox, and both our Twitter handles, Michelle's and myself are on either side. We properly positioned that to where we planned this. It's production value.

I just want to start with just a little introduction. On September 12, 2001 the world changed, or the day after the world changed. Michelle Dennedy looked at her then-newborn daughter and made a vow to protect her. That framed her work as a champion of privacy in the products that we use. In the past she had said that Facebook and Google are poster children for problems in privacy and big data, but that there is also a bigger opportunity. I want to get into all of that, but please give a warm welcome to Michelle Dennedy, Chief Privacy Officer and VP at McAfee.

Thank you, Michelle, for making it all the way out to little Campbell, California.

Michelle: Thank you.

Ryan: I appreciate . . .

Michelle: You've got me all worked up and misty eyed now. You started off with a bang.

Ryan: Well, I want to get you a little more misty eyed.

Michelle: I like it, okay.

Ryan: This is like the Oprah Show, we're going to get you to cry, reveal a little secret, you know.

Michelle: I have no secrets.

Ryan: But I do want to talk about that moment, because on September 11th the world did change, and it affected all of us, the nation, the country, the world. And each of us has more than likely our own personal story around that.

Now your moment came almost the day after; like I said in the intro you were looking at your newborn daughter. What was it in those days and in that moment that shaped your almost never-ending battle for privacy?

Michelle: My quest.

Ryan: Yes, your quest.

Michelle: It is a little emotional, because some of the people who were there with me are in the audience. Well, it was a moment for all of us, right; everyone here remembers exactly where they were and what they were doing. But when you're obsessed with data as much as we are and were at the time, it takes on a really interesting cast.

Now I lived in New York; right after college I moved to New York. And I lived with a woman who was at Cantor Fitzgerald, so the minute the very first plane hit I knew I'd lost at least a dozen, if not more, friends who were there at the time. So there was that element.

I had also booked my Mom who lives in Princeton, New Jersey on Flight 93 to come out and help us with the new baby. And, thankfully, timing and not architecture, my daughter was late as she always is now that she's almost twelve. And so I called my Mom and I said, "You know, don't come on the Tuesday, Mom; why don't you come on the Friday, and don't come on September 11th." And so she is very much alive, nagging me happily from New Jersey to this day.

So there were all those elements in my personal life. There was the difficulty of tracking people down, so all of my New York friends, especially the ones who worked down town, it was a frenzy of you don't want to call because you want them to get to their families, but you want to call because you're desperate to know who is still alive. And yet you're sitting here in California and it's paradise, sunny, perfect days. Everyone is just kind of shell shocked not knowing what to do or think.

And then enter our boss, Scott McNealy, and Larry Ellison who got on and said, "All right, privacy is dead now. Let's just all agree that privacy is gone; we have to have a centralized database to get the bad guys. We need to give all our data to the government, and the government will figure this all out." Does that sound familiar?

Where's Jonathan Fox? There he is. So Jonathan Fox, who was my Chief of Staff at Sun Microsystems and came back for more pain at McAfee, and Ryan was there as well on our team. We called each other the next day and we said, "Can you hear what Scott is saying about privacy? We sell Java, which has garbage collection and all these other inherent security features. We sell Solaris." At that point we had started building clouds and calling them networks. We have McAfee two doors down from our office doing encryption. How in the hell can we say that we are giving up an infrastructure, that we've given up on human beings, and we've given up on identity, so what are we going to do?

And that's really where we coalesced and we put together a business plan to actually have a business unit at Sun Microsystems at that time. And that framing and that architecture of that framing is really what leads us now that we're McAfee to not just write policies, but to build fabric and build culture and build passion and build architecture and engineering behind human rights and still be able to be a corporate citizen, because you have a lot of advocates on this side and then you have technology and there's some sort of like a detente and agreement that corporate can't meet human rights, and I think that's absolutely bullshit and just needs to be taken off the table. Are we allowed to swear in this session?

Ryan: Feel free at all times. You've used the lightest curse words of any ZURB so far.

Michelle: Well, I'm just getting warmed up. I figured on a soapbox there would be soap in the mouth by the end of the session.

Ryan: Once again, the hashtag is #ZURBsoapbox, attributed to this Twitter handle.

Michelle: Encryption, brute force encryption.

Ryan: But it's interesting that you say that there is not really this separation of corporate and human rights. What were some of those actual challenges back then and how did they compare to the challenges today?

Michelle: What's interesting, I think the challenges are very similar actually. I think we still are a little bit slaves to the technology itself. And we design our business plans and we think about what can possibly be based on things like bandwidth. Encryption in 2000 and 2001 was not as easy and ubiquitous as it is now. We barely had basic encryption, and certainly didn't even think about 256, and now we're there and we have enough fat pipes to do it.

We also didn't have platforms that talked together terribly well, and so we decided that safety looked like dumping it into one platform. And by the way, let's get the cheapest and the most security-hole- and vulnerability-laden one. So we made all these weird ironic choices based on the technology rather than peeling back and having a data-centric architecture. And I think that issue and that lack of leadership at the board level, kind of at the seat of power, demanding a person-level architecture is still a hole today. You don't hear a lot of this outside of audit committees at the boards of these companies.

Ryan: Right. And in those days, too, cloud computing in itself was brand new and you kind of were spearheading that effort at Sun. What was it there that was the big product privacy issues, and how does that kind of inform how we approach it today?

Michelle: That's a great question. And we knew when cloud went from time-sharing to network.com to grid to now the marketing term of cloud; it was that security and privacy were always going to be that Achilles heel for data. People always want to own stuff, and even if people don't think they have a privacy problem because they equate it with secrecy rather than authorized, fair, studied, [inaudible 08:31] use, which is the real definition of privacy. It's much bigger and broader and richer than people think it is.

Even if you think you just have an intellectual property issue and you put that on a cloud, there is the who owns it, who wants to steal it, who wants to leverage it, who do you want to sell it to. Whenever you have a "who," you have a privacy issue. And I think the breakdown comes from, "Yes, we can stick things in giant buckets of storage; maybe we can encrypt them, but they also need to travel to a who," and we haven't figured out the who.

Ryan: So it's not much, because I think when most people think about privacy they think about protecting ourselves, right? CYA, right?

Michelle: Right.

Ryan: We think more in those terms of privacy. We don't really think of it as privacy for what we're transmitting out or privacy for how we are protecting our customers. And have those concerns really changed all that much from those days to today?

Michelle: No. Well, I think we have a foundational definition deficit, if you will; because what we're doing right now, this is privacy. We have decided to come together, we've decided to communicate and have a conversation; we have authorized, fair, appropriate, proportionate sharing of information. That's privacy.

And as long as we think about privacy as something that you either sometimes get to have or that it's strictly seclusion or hiding away or some dirty little secret you're never going to code to that. You're never going to plan for that, you're never going to have a business plan based on that.

You might have an advocacy group on that, but you won't really have this thing that turns into currency. And I think as soon as you talk about privacy and currency in one sentence you get this automatic reaction, particularly in Europe, where people kind of rear back and go, "Wait a minute, I'm not for sale."

But think about the medical profession, like what is more core to your corporeal life here on the planet, right? But you do pay for drugs and you do pay companies to develop them, and you do pay doctors an extraordinary amount of money to come and have a specialty. Why don't we have that same relationship to data about people, where we do pay for all of the gears, the people, the process, the technology, and the education? And why don't we pay for more people, more common people, to go to D.C. and demand our rights to have a privacy law on the Federal level? There are only a couple of voices in D.C. and it's the corporates and it's the very, very edge advocates.

Ryan: Right. And are they fighting for it, the corporates, or are they fighting against it?

Michelle: We're fighting for a comprehensive law, because it makes our lives easier, and it makes our lives more efficient than having these crazy patchwork quilts of sectoral laws, and if you're already following those sectoral laws basically being governed by Europe and other international entities, it makes it cheaper if we have a Federal law. If you're a small player and you're not playing at all right now you might not be very happy about that, but your voice isn't being heard in D.C. about that either.

Ryan: And how would a comprehensive law help people kind of on the front lines, the product designers, the people who are actually considering how they build their products, and especially when you have companies like a Google where they have platforms across many channels? How do they take that into consideration?

Michelle: When you walk into an engineer's office, and I use engineering very, very broadly-from code slingers to architects to designers and everyone in between-so I mean, a comprehensive [Inaudible 12:14] maker or a creator, if you will. And you walk into one of these guys and you go, "I need to have a system; it's going to be a retail system, so there will be quite a bit of payment data, there will be preference data, all of this will be marketable by all sorts of third-parties swirling around and I need to own this stuff, and there's going to be a little shizzle of IP in there." And all of those are requirements that a maker can write down and go, okay, I get it, I get it.

Then you come in with your privacy requirement, and go, "Can I have reasonable security? And I need to have proportionate access of data, and I need to know that the data elements are in use only for a certain period of time, and as soon as they're no longer in use then you get rid of them." None of those are really requirements; those are kind of aspirations.

So what we need to do is get down that trellis, and I think laws help you get there, but I'd much rather see it come organically from the maker designs and requirements and engineering standards. Lawyers and legislators are notoriously bad at translating into requirements; but if that's the forcing function that's the forcing function.
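One way to read that: an aspiration like "data elements in use only for a certain period" becomes a requirement a maker can write down once it is pinned to concrete numbers and a check. A hypothetical Python sketch, with field names and periods that are our own invention, not anything Michelle specified:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    # Hypothetical sketch: turning the aspiration "keep data only while
    # in use" into a checkable requirement with concrete numbers.

    @dataclass
    class RetentionRule:
        field: str
        max_age: timedelta

    RULES = [
        RetentionRule("payment_data", timedelta(days=90)),
        RetentionRule("preference_data", timedelta(days=365)),
    ]

    def expired(rule: RetentionRule, collected_at: datetime) -> bool:
        """True when a data element has outlived its stated purpose window."""
        return datetime.now(timezone.utc) - collected_at > rule.max_age

    # Example: payment data collected 120 days ago has expired under the rule.
    print(expired(RULES[0], datetime.now(timezone.utc) - timedelta(days=120)))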

Ryan: And without that what is it that the frontline product designers can think about, especially when you can sign in by your email, you can sign in by Facebook, Twitter. And maybe one day we'll sign in with Instagram, whatever, right?

Michelle: Yeah.

Ryan: What about those considerations; how do they account for that, because all of those are avenues for something to go wrong or to leak out?

Michelle: Well, this is the half full, half empty conversation, and they are all avenues to leak out. But there are also ways of authenticating in. So if you think about the fair processing principles, and these go back into the 1960s, so they didn't necessarily contemplate all of the variants of technology today; but neither did Brandeis when he wrote in 1890 about [codacting], or the privacy intrusion of people being able to stand around on the street and maybe catch a glimpse of a woman's ankle. Now we can barely get Kim Kardashian's ass off of our screens. Poor Brandeis; if he only knew.

Michelle: Ryan Riddle, Ryan Riddle. We just have to make sure we cover all of George Carlin's swears; I don't know if anyone's old enough to remember that.

Ryan: The seven words you can't say on television.

Michelle: There you go. There's no FCC in this room. I can remember [inaudible 14:43].

Ryan: When you have Facebook and Twitter and all those things ...

Michelle: Oh, yes, sorry, [inaudible 14:48] and on again. So when you have all of these different avenues what you do is you go through the fair processing principles and you break them down into what does fairness mean. Well fairness as it turns out can be articulated; it can't necessarily be quantified, but it can be articulated. People don't want to be surprised, so what is the element of not surprising people?

Well telling people about stuff is important. So if you're going to use one of these venues for authentication, like a Facebook or an Instagram, tell them that that's what you're using. Now the difficulty is when you're dependent on a third party source for that sort of authentication. You're kind of at the mercy of how they're doing their policies and how they're being transparent or not.

So knowing that you think, okay, I'm going to go for this kind of cheap route of an existing social platform as some sort of an authenticator. And then you look at the data object that you're trying to protect. How sensitive is it? How serious is it? Is it about the level of this kind of throwaway sharing that you're doing on Facebook? Okay, well then my compensating controls for the knowing weakness of my authentication channel can be a little bit lighter.
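Her procedure here is mechanical enough to sketch: grade the data object, grade the authentication channel, and let the gap between them drive the compensating controls. The levels and control names in this Python sketch are our own illustration, not a McAfee API:

    from enum import IntEnum

    class Sensitivity(IntEnum):
        THROWAWAY = 1   # casual social sharing
        PERSONAL = 2    # preferences, photos of people
        SERIOUS = 3     # payments, health, kids

    class ChannelTrust(IntEnum):
        THIRD_PARTY_SOCIAL = 1    # "log in with" a borrowed social platform
        FIRST_PARTY_PASSWORD = 2
        MULTI_FACTOR = 3

    def compensating_controls(data: Sensitivity, channel: ChannelTrust) -> list:
        """List the extra controls needed when the channel is weaker than the data."""
        controls = ["tell users which authenticator you rely on"]  # no surprises
        gap = int(data) - int(channel)
        if gap >= 1:
            controls.append("re-authenticate before sensitive actions")
        if gap >= 2:
            controls.append("obfuscate or encrypt content for intended recipients only")
        return controls

    # Throwaway sharing over a social login needs little; serious data needs more.
    print(compensating_controls(Sensitivity.THROWAWAY, ChannelTrust.THIRD_PARTY_SOCIAL))
    print(compensating_controls(Sensitivity.SERIOUS, ChannelTrust.THIRD_PARTY_SOCIAL))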

If it's a very serious something, and I'll use a McAfee example so this won't be a product pitch thing, because I don't know our catalog well enough, quite honestly, at all. But, that said, we do have a nifty little thing called Social Protection. It's a free app, so if you're on Facebook, and I'll say, "Denise, I want to show you a picture of my daughter." I don't really like to share a lot of pictures of my kids on Facebook, but I have downloaded Social Protection and it's not an encryption, but it's an obfuscation of a photo.

So what happens is she downloads it, I direct it to her, but everyone else on my Facebook feed sees a blurry picture. She sees it in the clear. Because it's at that level, my policy for that type of data that I'm sharing is that much higher. If it's just a picture of a birthday cake or Jason's pictures of her garden, right, beautiful flowers; the flowers don't have a heck of a lot of privacy. So she'll send those out in the clear.

Ryan: Well, I don't know; the flowers may disagree.

Michelle: The flowers may disagree, so any animal people here, sorry. Vegetable people, I don't even know what that's called, or pagans, I guess. I always thought I was a pagan, but I guess not. But that's one of those things, and so when you go down that step of fair processes, fairness of who's collecting what, is it proportionate, and am I asking you your waist size to serve you a hamburger. Well, there might be some nefarious data processing going on behind the scenes.

It's like that old ACLU video where a guy calls up and orders a pepperoni pizza; have you guys seen this? Oh, search on pizza delivery ACLU when you get back to your desk. He did this great thing where it's a mocked up call and the guy is like, "Hi, I'd like to order a pepperoni pizza," and she's like, "Oh, I see that your cholesterol is quite high, and, oops, I see you're going to Hawaii, and woo, your weight is way too high." And so at the end of the day this poor schmuck gets like a tofu gluten-free sad pizza, you know. So it shows you like this, ha, why did they take my waist size when I ordered my Big Mac?

So proportionality literally and figuratively, I guess, is really important when you think about data. Who here knows who invented the light bulb, at least on paper? Not a trick question. Thank you, everyone knew the answer, but one voice. But you're absolutely right. Now who invented the off switch? I don't know either, and I've been asking that question actually for years.

One day I'm going to get curious enough and I'll actually find out. Technologists don't like to turn stuff off. People don't usually get vice presidency titles because they turned stuff off, which is a shame. Because the off switch is actually the key to the industrial revolution, not the on switch; that fucking thing burned everything down. Sorry, that was like BAM!

I went down to like number seven on the list, and look, he's got his bingo card out. Somebody actually took the time after a talk I gave once, with a handwritten note that said, "Curse words are the tool of the uninformed," or something. I was like, "Then you sit up here." Whatever. But it's true, right; the off switch, the control switch, the authentication switches are sexy little critters that actually let you innovate on top of the platform.

It was cool that you had indoor illumination, but if you couldn't control it and you had an unlimited heating source you also couldn't read at night, you couldn't have shift work; you couldn't have factories that went on and on and on. And yet we don't even know who invented that thing. It probably was [inaudible 19:33] itself, which will really take the wind out of my sails.

Audience: [inaudible 19:39] The gaslight, the on/off was the gas cock.

Michelle: Exactly.

Audience: Because they tried to figure out how to make it work, and unfortunately it killed hundreds of thousands of people in America until they figured out how to put a stop on the gas cock so you couldn't turn it past off back to on.

Michelle: Exactly. Now just for the record, I did not say gas cock.

Ryan: That's #ZURBsoapbox.

Michelle: Ryan Riddle. Ryan Riddle's like a super hero, it's delicious and I love it.

Ryan: Ironically, yes. I am a comic book lover. But speaking of . . . after gas cock I lost track of the questions. It's like, ah, there it goes. But to take it back a little bit to Facebook and Twitter and all that authentication, in your privacy-by-design talk, at least the one I saw didn't have as many curse words, but maybe I'm watching the wrong ones.

Michelle: It's okay, Ryan, it's all just us.

Ryan: But you mentioned that Facebook and Google are poster children . . .

Michelle: I'm going to get in so much trouble. They are huge customers, use them all the time, we love them so much.

Ryan: For some of the problems in privacy, but you followed that up by saying there's an even bigger opportunity. And I wanted to know if you could just elaborate a little bit on that, and what is it actually that they get so wrong or have gotten wrong?

Michelle: Yeah, I think there are a few things that they've gotten wrong. Now I don't work for either company, so I don't know what's going on in their business behind closed doors.

However, I've been around the valley a long time, and so I have an idea about what's going on. I think the interesting thing is first talking to a terrific guy who has passed away unfortunately. His name is Dr. [inaudible 21:32]. He was the one who invited the boys over for dinner when they were working on their PhDs-the Google boys that is.

And he invited Andy Bechtolsheim, who is a co-founder of Sun Microsystems and other companies in the valley. And he's kind of this human computer. If you've ever met Andy he doesn't have like a human interface; he's kind of like non-compatible. But he's very, very smart, and he's one of the most brilliant designers of all time.

So he just sat at the dinner as he does, and at the end of the dinner he handed Sergey Brin a check that said, "To the order of Google, Inc. for $100,000." And they actually took 30 days scrambling to find an attorney and incorporate so they could cash his check, and that was the very first money that Google ever got.

All that back-story is, Rajiv was there, it was at his house, and so I was talking to Rajiv back in the day about the opportunity of Google. And at the time I was a patent litigator and so they in my mind had two issues. One was intellectual property and the other was privacy. And they totally disagreed with me. They really viewed the world, and probably still do, that information truly wanted to be 'free' and it's really easy when you're a post-doc to feel that way. Right?

I mean, your pizza; you wander around like scarfing conference rooms for pizza, and your life is just different. You don't really own anything. They aren't billionaires; they aren't getting the world's governments knocking on their doors. And they're not the subjects of every data regulator on the planet, screaming at them for their mishandling of personal data.

But at the time the philosophy was an innocent one, I think, and I think it is out of innocence and naivete rather than evil. But I think that the thought that information wants to be free neglects three thousand years of history of people who actually want to own stuff, and who want to have control over their reputation. People have died over the thought that their daughter had stepped out before marriage, right?

So the reputation is very, very powerful. I think when we ignore history and we only look at machines we go very, very south. I think that's the fundamental problem with both of those companies. And I use them both extensively. It's ironic how many privacy people are on Facebook, shamelessly, like wow, can you believe, "Privacy, why can't anyone listen to us at work?" On Facebook, send, tra, la.

But I think they've kind of lost the mark. I think the system and the sharing they get; the protection, the controls, the management, the ownership, the sense of culture is not the same as when there were limited people on timesharing. If you said bad things, or did bad things, you were flamed, and you had no more access.

I mean, the punishment for violating privacy in the early days of computing was severe. You got no more time on a timeshare. See, he's nodding in the back. And you know exactly what I'm talking about. And viruses were things that everybody knew were the common enemy, and everyone got against them very quickly.

So we don't have those controls yet in this world, but I think what we're seeing is because it's a machine-centric design instead of a people-centric design, and it's an ad-network-centric design, because that's the business. We are not their customers; we are their inventory. And we must never forget that. I think that's the issue.

Now the lost opportunity is people actually are very, very attracted to being with other people in a very respectful, responsible way. So I don't think that these companies are bad; I just think that there's a new way of sharing. And I think the interesting thing is, and we have an online safety school program at McAfee and it's wonderful to go and actually talk to younger people from K through 12.

Younger people want to protect their privacy. They want to control their secrets and their messages and their brands. They want to have a weekend persona that's maybe a little bit different than their daytime persona. They want to have a grandma persona, and they're learning how to do that with the tools that they have now. It's a people-centric kind of organic happening that's happening.

So that is my optimism about those systems. I think the customers are starting, the true customers, not the inventory, are going to migrate off that platform as quickly as they can. If I could port my friends off of Facebook right now I really would; not because I dislike Facebook per se, but because I'd like to see a better platform that I understood a little bit better.

Ryan: What is it about Facebook that you don't understand?

Michelle: Have you read their privacy policy? Let's order some coffee and we'll all read through the controls together and I'll get you home by midnight. I'm so mean; I hope there's not Mark Zuckerberg hiding in the back with a machine gun. Ryan Riddle.

Ryan: All right, I'll take that one.

Michelle: Is this the way you thought this talk was going to go?

Ryan: No. I don't know; I've lost track of my questions.

Michelle: But these guys, that's why they came.

Ryan: I'm breaking a sweat now.

Michelle: This is every day. If you're sad, come work for us. We do this every day.

Ryan: But I like what you said there. We have to think of privacy as a people-centric thing. It's almost as if we are designing for people, and we're also designing their privacy at the same time.

Michelle: Exactly.

Ryan: And those are the things that we have to keep in mind. And can social sharing platforms actually get better at that?

Michelle: Absolutely. I think this is where the ZURBians come in, right, so this is where I think it's really exciting, me talking to designers and makers and artists. Actually we have 3,000 years of humans rolling around on the planet. We have a lot of data. We talk about big data. We have big data about how humans like to behave with each other and against each other and around each other. I think if you look at that data and you look at things like art and you look at things like pictures and music.

So I'll give you an example and if I say, da da, da da, da da, da da, what does that bring to mind? Yeah, even the people who are way too young to be sitting in that movie theater. You all know what that is, exactly. Why do we open contracts when there's no score going, warranty, warranty, warranty, warranty.

Or this is where our data [inaudible 27:55]. There should be a score, right? For the dumbest show with the most obvious jokes, there's a laugh track. Why aren't we doing that? Why aren't we explaining complex technologies using pictures and art, comic books? Our privacy policy at McAfee.com is a comic book. That's where they all have our ninja shirts on. Because all the research says no one reads privacy policies. How many people in this room have read one privacy policy this year? Okay, I'm very impressed.

Ryan: I just press "Agree" whenever I have to update iTunes.

Audience: That counts as reading.

Michelle: No, it does not. You know, where's the ejector seat for privacy [inaudible 28:32]? Yeah, if you wanted to use it as a [inaudible 28:36] it's a wonderful tool.

So our privacy policy in Word is 16 pages long. And I probably didn't include everything that all the regulators around the world want. So when you look at a new agreement, and it's not just a privacy policy, part of it is buttocks covering, because I've already gone to the F-bomb I've got to go back up now. It really is, it's required by the law; we have to cover the bootie. There is stuff you have to set.

Ryan: She's slowly making her way back down.

Michelle: Oh, bootie, booties like for a baby or my bootie call bootie, and I'm going like bootie.

Ryan: That is ZURB Soapbox.

Michelle: It's Ryan Riddle. Ryan Riddle talks bootie calls with Michelle Dennedy. My mother will be so proud. I was going somewhere with that. Oh, so we have all these studies that show part of it is the vehicle is a corporate protection mechanism, and that is a very serious goal. We really do want our businesses to be healthy and alive and not sued into oblivion.

However, those policies are really also for the consumer and the user, and I think it is our imperative to communicate better. And so we saw these studies that no one reads policies and so we thought, wow, there's the old adage of a picture's worth a thousand words. Well, my thing is already 16 pages long, and I haven't even gotten to half the content, so what if I put pictures in there; we could have thousands of ideas and thousands of words using a graphic novel. And that's how we got hooked up with ZURB and we have this partnership and they drew our little ninjas for us for our privacy policy.

The next one we hope will be animated with music, and we have a little [inaudible 30:13] video there. But I think that's where we need to go. I think the terms and conditions should not be a throwaway at the bottom of the page; they really are kind of why you have come to the party. And if they're not part of that central "we all understand what we're doing," then I think it becomes very difficult for a social network in particular to articulate all the ways that humans are going to interact. So I think we just need to move as users as well as industry people to design using what we know about this 3,000 years of humanity.

Ryan: Awesome.

Michelle: It's a little idea.

Ryan: It's a little idea.

Michelle: We'll get back to you next Wednesday when we're done.

Ryan: Yeah. But on that note, the 3,000 years of history, I want to open up to the audience questions. I want to thank you once again.

Michelle: [inaudible 31:05] stirred pot.

Ryan: Yeah, I love that you stirred the pot and I hope everyone tweeted it out. But I want to open it up to questions, and who would like to go first?

Audience: When you mentioned the graphic novel I realized that we use info graphics all the time to try to explain concepts to various people. And it sounds like that might be a good settling point, I'm sure. But I guess the thing is I then realized it takes the company a lot of effort to do that rather than just shove the privacy policy at the bottom, and it sounds like the best thing we can do is try to get the good companies to do that so that the bad companies are shamed into doing it, if nothing else.

Michelle: I think that's how a lot of stuff does happen, right? I think they're the same as these Bangladeshi factories, right. We need really marquee loud brands to step up and say, you know, you're going to have to pay $7 for this t-shirt because I'm not going to murder women in Bangladesh. And that's how this happens. So part of it was again the corporate side of me said, people don't realize that security is an ongoing transaction where we're constantly talking to your machine in the language of machine data.

We are constantly looking at things like the IP address, and it really came into focus when I went to Paris and I talked to a staffer of a data regulation body. And she said, "Oh, did you know, Michelle, there's a new piece of [inaudible 32:29]?" I was like, oh, new Spyra; this is like exciting, what will we do, what is it? The IP address. And I thought, oh my God, this is a woman who's going to be regulating us, and she really in her heart of hearts thought the IP address was a new piece of code that we had launched. I said, no, that is the Internet. Oh, and she had no idea.

And I thought through no fault of her own, she's an attorney by trade, and she knows the regulations very well, she knows nothing about the technology. And I thought, oh, gosh, we need to educate our regulators as badly as we need to educate our users. Right? And so it was both the self-protection model for us as well as our belief that we do need to lean forward as bigger companies and well established companies to do that. Mr. Fox.

Mr. Fox: [inaudible 33:26] privacy policy. There's the language that visual privacy improves. [Inaudible 33:39]

Michelle: And ZURB is quite good at this.

Ryan: Speaking of ZURB, we've got our [inaudible 33:51] instigator on.

Question: This is a simple one. What's the difference between a privacy clause and [inaudible 33:54]?

Michelle: In terms?

Question: In terms.

Michelle: Yeah. So in ours there's a huge difference; there's a massive difference. Sometimes privacy policies are inside the terms of use of the site agreement or the EULA terms for the licensing agreements. But the privacy policies are simply terms for our use of data that's about people.

Question: Okay, but do you think people understand the difference between that?

Michelle: No, I think there are attorneys who don't know the difference. It makes you want to drink in the morning.

Question: Which are you more scared of, businesses or people?

Michelle: In light of privacy terms?

Question: The documents themselves.

Michelle: I wish that businesses were scared about what they're saying about their own business. What typically happens is businesses do what they do. And then the amazing thing is when you go from private practice, when someone's paying you $500 an hour plus, you have a name and they'll use your middle name like my Catholic school catechism name, you know.

When you come and join a company suddenly it's like, "Bob, Ryan, and Legal." Who was in the meeting? Make sure Legal's there. Yeah. And they'll call you Legal, and they'll do what they do and they'll say, okay, put this in the terms. And then you go off as the lawyer and you try to be a part of the business, and often it's quite late. And I'm only half joking; I think a lot of people think that either lawyers have some sort of magical elixir that suddenly makes bad ideas turn legal, or that they can trick us by not telling us until the last second and we have to write some sort of magic terms and conditions.

But often I think that's the relationship, that the lawyers are deemed kind of this like bionic crust around the corp, and then the corp is doing whatever it's doing in the interim. Where terms and conditions are really, really good is when you get in there early enough and you walk through an idea and you say, what are all the things that can go right and let's look at that in terms of the positive side of the value. And then let's look at all the things that can go wrong. And that's when you balance the data value or the business process value versus the rest. And that's what you should be able to articulate before you launch a business.

Question: Most small businesses, they don't have money, you know, so they go, I'll just copy this privacy policy, and I'll bet that's 90%, right? Just because they go, oh, here's a legal guy, I'll just use that one. So are there any research people on the open source side who can figure out how to do this without necessarily ...

Michelle: There are. An attorney, though, and I'm going to recommend to you that you use an attorney, is money well spent. I really will. I know you guys do, I'm just saying in general; generally, you know. It is money well spent.

I've got a couple of vectors in my head, so let's go with the VC vector first. It is my belief that the VCs are getting more sophisticated about having like a CIO on loan, or a sales startup on loan. I think they're going to start to get better and better at getting governance people, security and privacy people on loan for very small startups. It depends on what your business is, but if your business is writing apps or you're creating a very data centric system, understanding what that data [inaudible 37:23 corpuses] goes beyond your legal risk. It's so core to your business.

I've seen so many businesses sell themselves as a widget, when what they really are is this amazing like cloud gateway data assessment tool, but they think they're a widget. It's because they haven't sat down and thought about what is the data corpus; let's think about data as cash. And there's no company, even if there are two people, that's too small to think about what it is that they do and why it is different in the marketplace. If they haven't done that thinking then they really don't deserve any funding. Right?

So there are policy generators, which is the other side of the brain that was thinking. There are privacy generators out there. I don't know of them off the top of my head. [Inaudible 38:06] and even worse, because I have to write a global policy. That's the other thing I think people really need to know and understand. If you've got .com after whatever it is you're doing you're on a global stage, so you really have to think very carefully what is your product. Is it focused toward a certain sensitive group? Is it a financially motivated thing that you hope to sell the banks? Is it something directed to kids? Is it something that's healthcare where there are specialty rules and regulations? It not so complicated that you can't figure it out with some of the resources that are there online.

Online there's a great site that's free called PrivacyAssociation.org. It's the International Association of Privacy Professionals, and they actually offer cheap training as well, so you can take like a day of training and get certified in privacy. But don't walk around with your certification like you're like a cop now, or you'll end up like Zimmerman. But you'll know the basic rules.

Ryan: Really. We have time for one more question. [Inaudible 39:02] in the back had his hand up first.

Michelle: Yeah, he had the gas cock [inaudible 39:09]. Get your hands off my gas cock.

Question: [inaudible 39:20] Have you and your colleagues, wherever you've been, envisioned an organization in which the members would be deliberately anonymous to the organization, and structurally what that could look like, particularly if it's an interaction maybe for a medical entity or medical information coming from personal apps or devices? We need some kind of a firewall between the organization and our numbered members so that we don't know those IP addresses where this data is coming from. And structurally, forget politically and socially, can you think of a way structurally where we could just identify them by voice prints maybe and biometrics, so we could deliberately not know who and where they are?

Michelle: That's a great question. I think even today given the platforms we have. I'm thinking of my terrific friend. She's got a company called Wickr, and do you use Wickr? I love it. So you as the user get to decide, the other person has to have Wickr as well, and you decide when that message will time out. So I'll send something to Ryan and I set the timeout to five days and it disappears. He has no control over that content; he can't forward it on. It's 256-bit encryption; there's photography, there's attachments, and it's a neat little app. Right now it's kind of a neat little app.

So you take a neat little app like that and you can have a handle on being relatively anonymous. And then you can use something like Tor or any of these other kinds of obfuscation, anonymous-type browsing stuff; that's the technical analysis stuff.
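The model Michelle describes (sender-set timeout, a named recipient, no control for the reader) boils down to a small data structure. Here is a toy Python sketch of just that control-stays-with-the-sender idea; it is not Wickr's actual protocol, which adds 256-bit end-to-end encryption on top:

    import time
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ExpiringMessage:
        body: str
        recipient: str
        expires_at: float  # epoch seconds, chosen by the SENDER

    def read(message: ExpiringMessage, reader: str) -> str:
        """Return the body only for the addressed reader, only before timeout."""
        if reader != message.recipient:
            raise PermissionError("only the addressed recipient can read this")
        if time.time() >= message.expires_at:
            raise TimeoutError("message timed out; the sender's policy, not the reader's")
        return message.body

    # Michelle's example: send to Ryan, make it disappear after five days.
    msg = ExpiringMessage("see you Friday", recipient="ryan",
                          expires_at=time.time() + 5 * 86400)
    print(read(msg, "ryan"))  # readable now; after five days, TimeoutError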

So you can think about the technology and how you can string it together, but I think where you're starting from is the right place which is imagine a world where we would want people to be anonymous because of X. And I guess the X question mark in my mind and your scenario is do we want these guys to just be able to freely access information that some source is putting up? Do you want them to be able to talk to each other in an anonymous fashion?

And I think that plays in a lot of different ways, in fact just yesterday I had a call from someone-oh, look, it's my Dad. Hi, Dad, I'm here. I'm not calling him back. We're writing a book with my Dad and he's driving us crazy.

Ryan: What's the book?

Michelle: Ryan Riddle. It's called Privacy Engineering, and you're going to love it.

Ryan: Well, if you said, Mom, you could have definitely tuned out.

Michelle: Yeah, it's called Privacy Engineering, and it's coming out hopefully at the end of this year, knock on wood, if we get it written.

But anyway, I got a call just yesterday or the day before and a similar type of scenario except that the X question was very senior executives don't have a lot of social networking ability and they are very closely examined. They are very public figures and everything that they say can have a billion-dollar consequence.

So to have a system like that for them, I think would be really valuable for governance overall. And so that was the question that was posited to me, "Can you just do that sort of thing for executives where they could say we're considering using a new SEC treatment; has anyone else done this, who's a good lawyer? Who sucks? Whatever. And then have this timeout capability where the question would only be up there unattributed for X amount of time."

There are all sorts of reasons to have private sharing, but I don't think everything goes into that bucket. And I think where we fall down in the kind of hardcore security, black hat communities is we have this belief that everyone wants encrypted email all the time. I only use my Wickr stuff for my very potty-mouth, obviously limited vocabulary. Right? I do my day mail, but I use my snarky funny stuff on my Wickr mail. It's not every day that I'm needing encrypted email, and I think that's where we have fallen down. You don't always need everything to be encrypted.

We were talking earlier about what a network would be like, but if it's not going to be protected to a very high level, you do need to have that thoughtfulness of going in with the business proposition in mind, and thinking through and scenario testing those things, because I think that's where some of these companies that are turning into massive companies, they really didn't scenario test what if a billion people were online at once. What if they were all tweeting about a revolution? What if they were all lying about that revolution?

They lied about Obama getting blown up one day and the stock market plummeted. That's a huge problem; that's a crime. And, to end on a really dark note, the information wars are coming, right? You don't have to throw a bomb anymore; you can upset a stock market, you can say that the food supply is poisoned, you can subvert traffic to a power plant, and all virtually. That's a pretty stark thought.

One of the other passions of mine is really getting geeky, getting into STEAM, which is STEM plus art. Getting people excited about being a cyber warrior whether you're a businessperson or a lawyer, or a taxidermist, or a gas forensics, gas-cock-knowing guy. So I think it's an exciting time.

Ryan: I should put that on a business card.

Michelle: Ask me about gas cocks. I love that story. It's such a simple solution; you just can't turn it past off back to on again.

Ryan: Do you have time to take one more question?

Michelle: Absolutely. I think I'm on a roll here.

Ryan: I know this young lady's raised her hand like three times already.

Question: So how do you compare Facebook with the [inaudible 45:00]?

Michelle: I don't even know Diaspora.

Question: It's [inaudible 45:04] that's all.

Michelle: I need to check it out. I don't know. I will know. I like it.

Ryan: Awesome. One more full question and that's our last question.

Question: This is related to the Facebook, and you mentioned about Facebook [inaudible 45:19] customer unrelated to that point.

Michelle: I'm really feeling bad; I don't work for Facebook, so I'm not part of any of their strategic planning or whatever. But the basic business model, and it's not a judgment on them; it really isn't. If you look at their 10-K, they are a publicly traded company now, they are run on advertising. So who do you need to please when your revenue source comes from advertising? That's their customer. And so it's not a judgment to say that I'm not their customer. I don't pay them anything.

If they changed their model, and I think there's an estimate for Google or Facebook that if we each paid $50 a year we could be ad-free. But they would then have to market to us and make sure that we kept paying that $50 annuity every single year for that free search. And suddenly it would be really easy to just switch off to other free searching. Right?

So the advertising model I don't say that as a point of judgment; I mean, you really do have to follow the money in many cases to figure out what is their business model and how are they going to have to behave to continue to survive, and they have to keep feeding that beast or come up with a new model for revenue.

Ryan: Very good.

Michelle: That's an even darker thought than cyber warfare.

Ryan: I was trying to end it on a little hope.

Michelle: Ryan Riddle.

Ryan: Great, well thank you very much, Michelle. I really appreciate you making the trip. Very good.
