Tuesday, April 28, 2020

Privacy in the Age of Coronavirus: A Conversation with Stephen Wicker

 

Steven Cherry Hi, this is Steven Cherry for TTI/Vanguard.

If we step away from the horror of the coronavirus—the overwhelming cases and new cases and deaths; the hospital scenes of corridors ringing with more attention-needing alarms than there are nurses and parking lots with refrigerated trucks; the EMTs forced to return to work even after testing positive; the warehouse workers and delivery drivers who don’t know if the next box they touch will be the one to give them the virus—if we step away from all of it, from enough distance, we can glimpse how the pandemic is holding up a mirror to the world, as each nation shows its essential character—its ability or inability to band together and head down the epic journey the virus is taking us on; its willingness to trade privacy for safety, its transparency at the level of government and the individual citizen, in a town or neighborhood, on the street, walking home from a shift as an essential worker, or just from the grocery store.

South Korea has been one of the most aggressive countries when it comes to contact tracing. When someone tests positive for coronavirus, the local district uses cellphone data, taken by the government directly from the carrier networks, to send out emergency text alerts informing people that there is a new covid-19 case in the area where they live. Names are withheld, but some districts publish the routes of confirmed patients, the public transport they took, and the medical institutions that are treating them.

The U.S. has a history, at least a recent history, of protecting people’s privacy, even, and especially, in medical contexts. The 1996 Health Insurance Portability and Accountability Act, or HIPAA as it’s called, is a case in point. And yet, even that law explicitly permits “reporting of identifiable data for public health surveillance.”

The U.S. coronavirus statistics are horrific; the South Korean ones enviable by comparison. And so perhaps it was inevitable that two of our biggest tech companies, Apple and Google, gatekeepers of our mobile operating systems, with a nudge and some research by one of our leading tech universities, MIT, have taken the first steps toward an app that uses the short-range network capabilities of our phones to do some of the same South Korea-style data-collection that can lead to contact tracing.

To be sure, there will likely be some key differences, but it nevertheless raises some of the same questions of surveillance and loss of privacy.

People with a Ph.D.-level understanding of networks and cellular systems who also research matters of privacy and security are rare and highly prized individuals, and I’m happy to have one of them as my guest today.

Stephen Wicker is a Professor of Electrical and Computer Engineering at Cornell University. For some years now, his research has focused on the interface between networking technology, law, and sociology, with a particular emphasis on how design choices and regulations can increase or diminish our privacy and speech rights. And of course, I’m happy to be speaking with a fellow Stephen, even if he spells his name with a ‘ph.’

Steve, welcome to the podcast.

Stephen Wicker Thank you very much. Pleasure to be here.

Steven Cherry First, maybe you can lay out the Apple–Google scheme as best we know it today, and we should note that it's barely a scheme. Nothing has been built yet, and it's not even clear that they would be doing the building of whatever system we end up with.

Stephen Wicker Well, that's exactly right. It's basically a paper design at this point, with a few guidelines. The basic idea is to use Bluetooth technology to record encounters with other people as you go about your everyday life, or at least as you're able to do so under current conditions. The phone uses Bluetooth to detect when another Bluetooth-capable device is nearby and to exchange keys, identifiers, so that if you or the other person comes down with the coronavirus, the fact that you were relatively close can be recorded, taken advantage of, and the person who is not yet ill can be notified.

Steven Cherry So the first level of privacy protection, then, is to translate phones into a sort of anonymous key. How does that work?

Stephen Wicker Exactly. So what happens is software will be used to develop anonymous keys that change fairly frequently throughout the day. This is Crypto 101. You don't want to use the same key for a long period of time. And so the identity of the phone associated with the keys can only be unwound after the fact.
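
To make the key rotation concrete, here is a minimal sketch of how short-lived identifiers might be derived from a secret key that never leaves the phone. The derivation below (an HMAC over a time-interval counter) and the key and interval sizes are illustrative assumptions, not the published Apple and Google specification.

```python
# Minimal sketch: rotating anonymous identifiers derived from a daily key.
# The HMAC-based derivation and the sizes used here are illustrative
# assumptions, not the actual Apple/Google cryptographic specification.
import hashlib
import hmac
import os
import time


def new_daily_key() -> bytes:
    """Generate a fresh random key for the day; it stays on the phone."""
    return os.urandom(16)


def rolling_identifier(daily_key: bytes, interval_minutes: int = 15) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth.

    The identifier changes every interval, so observers cannot link the
    broadcasts back to a single phone without the daily key.
    """
    interval_index = int(time.time() // (interval_minutes * 60))
    mac = hmac.new(daily_key, str(interval_index).encode(), hashlib.sha256)
    return mac.digest()[:16]


# Example: two calls in the same 15-minute window give the same identifier;
# a later window gives a different one.
key = new_daily_key()
print(rolling_identifier(key).hex())
```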

Steven Cherry This is all based on Bluetooth, as you said, which is a short-range network between devices. That's how our phones connect to wireless headphones and speakers, and how our computers work with wireless mice and keyboards. But it's also now being used in a lot of other places: department stores, grocery stores, and so forth. How does that extension of Bluetooth work?

Stephen Wicker Okay, so Bluetooth was initially developed to be a short-range communication technology like Wi-Fi, but at even shorter distances. It was supposed to be very low power, and it still is, capable of supporting earphones and those sorts of things. Well, that low-power, short-range capability turned into proximity detection. So when we go into a store, the fact that our Bluetooth device is detectable means we're close by. In fact, we could be looking at a particular exhibit or a particular set of ads that are set up in the store. And so Bluetooth becomes a vehicle, then, for identifying people who are paying attention, and then, of course, trying to close the transaction, getting them to buy whatever it is they're looking at.

Steven Cherry Yeah. And I guess stores could even use it to see how long people linger over a particular brand of cereal or something. There's an explicit consent in those cases, right? Certainly there's a consent when you pair up your phone with a wireless headset, for example. And there's even a consent between you and the store initially, right?

Stephen Wicker Well, actually, that's a little complicated. So when we try to, for example, set up our earbuds, we have to go through a process of connecting those buds to our phone, and it takes some effort. But what most people don't know is that the Bluetooth technology in their phones is continually advertising the phone's presence. And so by walking into a store, you may be engaging in some Bluetooth traffic that you're not aware of. Now, there's an implied consent in the sense that you've walked into the store, and there may even be a sign posted that says there's Bluetooth interaction underway. But most folks don't realize that Bluetooth is a very active technology. It advertises your presence. And simply by having it turned on in your phone, you're engaging in some transfer of information.

Steven Cherry Now, the Google–Apple app, if and when it exists, would have an explicit opt-in, I understand.

Stephen Wicker That's correct. Especially in its initial phase, you would have to actually opt in, acquire an app, and go through a certain set of steps in order to activate the process. But that's just step one. Step two will involve having this contact-tracking mechanism as part of the actual operating system. So even if we don't opt in, at some point it's going to be in your Apple OS or Google's Android operating system.

Steven Cherry Yeah. And so it's going to keep track of my movements, and by "it" I guess I mean the phone itself, and it's going to keep track of any of these anonymized key identities that I come into contact with. And then what happens?

Stephen Wicker OK. So there's two different things going on. If you use a cell phone, your location is being tracked, period. This is not something you can opt out of, except by turning off your phone. So let's suppose I have Verizon as my cellular service provider. Simply by using my phone, my location is being tracked. So that information is available. OK. Now, the Apple–Google contact tracker, at least in theory, will not obtain location information. So, for example, if I walked within six feet of you, there would be a record that we were six feet apart at some point in time, but we wouldn't know exactly where we were at that point in time, if that makes sense. Unfortunately, I think that level of anonymization can fail. They can figure out where we were. But at least as it's designed, the Apple–Google system will not collect location information.

Steven Cherry So it's just going to keep track of these sort of, my-Bluetooth-device-is-close-to-your-Bluetooth-device at a particular point in time.

Stephen Wicker Exactly. So if you want to think about the total collection of data by this system, you're going to have a list of contacts. I was in contact with you at a certain point in time. My family—a lot of the time. Et cetera. So it'll simply be a list of people or IDs, I should say, and associated times.
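As a rough illustration, the entire local record could be as small as the sketch below: opaque identifiers and timestamps, with no names and no locations attached. The field names are assumptions for illustration, not the actual data format.

```python
# Sketch of the kind of record such a phone might keep locally: just the
# anonymous identifier that was heard and when it was heard. No names, no
# locations. Field names here are illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Encounter:
    rolling_id: str   # anonymous identifier broadcast by the other phone
    timestamp: float  # when it was observed, in seconds since the epoch


contact_log: List[Encounter] = []


def record_encounter(rolling_id: str, timestamp: float) -> None:
    """Append one observed identifier and time to the local log."""
    contact_log.append(Encounter(rolling_id, timestamp))
```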

Steven Cherry And then, let's say I test positive. What happens then?

Stephen Wicker All right. So if you test positive, then you will tell your care providers that you had this particular app. The associated authorities will then use the information from the app to determine who you've been in contact with, and they will let those people know that you tested positive. So that's when anonymization ends. The world, or at least those you've been in contact with, knows that you've been sick.
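
In a decentralized variant of this design, the matching itself can happen on each phone: identifiers associated with confirmed cases are published, and every phone checks them against its own log. The sketch below assumes that flow; the exact division of labor between phones and health authorities had not been settled when this conversation was recorded.

```python
# Sketch of exposure matching in a decentralized variant: the app downloads
# identifiers reported for confirmed cases and compares them to its own log.
# The data layout and the flow are assumptions for illustration.
from typing import Iterable, List, Set, Tuple

Encounter = Tuple[str, float]  # (anonymous identifier, timestamp)


def find_exposures(local_log: Iterable[Encounter],
                   reported_ids: Set[str]) -> List[Encounter]:
    """Return locally recorded encounters whose identifiers were reported."""
    return [(rid, ts) for (rid, ts) in local_log if rid in reported_ids]


# Example usage: if any encounters match, the app would notify the user that
# they may have been exposed and suggest testing or self-quarantine.
matches = find_exposures([("id_abc", 1587400000.0)], {"id_abc"})
print(len(matches), "possible exposure(s)")
```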

Steven Cherry But it all starts when I report to the system, basically, that I've tested positive, and that's a choice that I can make.

Stephen Wicker That's right. Or at least that's the way it's been proposed.

Steven Cherry OK. And I think that's different from the Korean model, where the information just goes out as a matter of public safety record.

Stephen Wicker That's right. It's a more, how shall I put this, there's a lot more authority in the Korean model. You really don't have a choice. You can't opt out. Your movements will be tracked. And if you become ill, everyone you've come into contact with will be informed.

Steven Cherry And so how accurate is the Bluetooth data?

Stephen Wicker So the Bluetooth data is actually quite accurate because there are a couple of basic elements of physics involved. You're not going to be able to make contact through Bluetooth unless you're reasonably close. There are situations in which you might be 18 feet instead of six or something like that, but close enough. In other words, it's not the sort of thing where it will accidentally get someone from the other side of the city or something like that.
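
Proximity detection of this kind typically works by reading the received signal strength (RSSI) of the other phone's Bluetooth broadcast and converting it into a rough distance. The sketch below uses a standard log-distance path-loss model; the calibration constants are assumptions, and in practice they vary by device, body position, and environment, which is where errors like 18 feet instead of six come from.

```python
# Rough sketch: estimating distance from Bluetooth received signal strength
# (RSSI) with a log-distance path-loss model. The constants are illustrative
# assumptions; real systems calibrate per device and environment.

def estimate_distance_m(rssi_dbm: float,
                        power_at_1m_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Convert an RSSI reading (in dBm) to an approximate distance in meters."""
    return 10 ** ((power_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))


def likely_close_contact(rssi_dbm: float, threshold_m: float = 2.0) -> bool:
    """Treat anything within roughly two meters (about six feet) as close."""
    return estimate_distance_m(rssi_dbm) <= threshold_m


# Example: with these constants an RSSI of -65 dBm maps to about 2 meters.
print(round(estimate_distance_m(-65.0), 1), likely_close_contact(-65.0))
```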

Steven Cherry You alluded to this before: the data starts out anonymized, but it may not stay that way. I mean, it could be really hard, especially in less densely populated areas, to really protect somebody's identity when enough general information is known about them.

Stephen Wicker Well, you're touching on really the most important part, from the privacy advocate's point of view. This data is allegedly anonymized, but it would be very simple to de-anonymize it. Let's consider my own data. I spend a lot of time around one other person, my wife, who has an Apple phone. So given that I'm in almost continuous contact with this one person, and in contact with some others much more infrequently, they're going to know it's me. It's very easy to figure out.
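
The spouse example can be made concrete with a few lines of analysis: simply counting how often each "anonymous" identifier appears, and at what hours, exposes patterns (a household member, a coworker, a commute) that make re-identification straightforward. The encounter data below is invented for illustration.

```python
# Illustration of why "anonymous" contact data is easy to interpret: counting
# how often each identifier appears already suggests household members and
# coworkers. The encounter data here is invented for illustration.
from collections import Counter

encounters = [
    ("id_A", "07:30"), ("id_A", "12:10"), ("id_A", "19:00"), ("id_A", "22:15"),
    ("id_B", "09:05"), ("id_B", "17:40"),
    ("id_C", "13:20"),
]

counts = Counter(rid for rid, _ in encounters)
for rid, n in counts.most_common():
    print(rid, "seen", n, "times")
# The identifier seen most often, at morning and late-evening hours, is very
# likely a household member, even though no name was ever recorded.
```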

Steven Cherry I understand that this is happening in Korea. I mean, initially what happens is somebody tests positive, and messages go out to people in that area saying someone had tested positive. But it's not that hard in many cases for people to figure out who it is. And then there's an even further outing sometimes and even harassment. And in fact, there was a survey in Korea recently that found that people were almost more afraid of the social stigma than the disease itself.

Stephen Wicker Well, you know, that's right, at a number of levels. But first off, you know, simply knowing the route someone takes, when they take it, and so forth, that really narrows things down. You know, people going to work at a particular place at a particular time, or going to the market at a particular time. That's an element of de-anonymization that is, frankly, used with a lot of data sets to figure out who is who. It's not that hard. We do a lot of things in very individual ways, and so picking apart who is who is not that difficult. We have to assume that it can be done. There are many examples of supposedly anonymized datasets which have proved not to be anonymous.

And so we have to expect that that's going to happen with this data. Now, with regard to the stigma, clearly people are concerned. In fact, they're scared of this disease, and perhaps rightly so. And they may lose their affability, their love for their fellow man, because of this disease. And so we have to expect people not to act as they would in normal times. It's unfortunate, but it's true.

Steven Cherry And so it seems like there would even potentially be a problem of it backfiring, with people not seeking testing and treatment in the first place.

Stephen Wicker Well, exactly right. If I'm concerned about the stigma, and I think that, for example, I may have had the disease and didn't have symptoms, something like that, I may not want people to know. You know, they may overreact. They may not let me shop in their particular shops, whatever the case may be. I may be thrown out of my apartment complex.

Steven Cherry So people might not even engage with the app at all. As far as we know, Apple and Google are just going to create and publish an API. And I guess that's just some code that an actual app would hook into. How would that work? And who would eventually build the app?

Stephen Wicker OK, so as a first step, the API is the application programming interface. It basically allows people to write code without having to worry about all the details of, for example, how the Apple iOS works. In other words, I don't have to be an expert in communication theory in order to write an app that takes advantage of an iPhone. So very helpful. And that's standard business practice. It's been around for a long time. So who would write these apps? Anyone who's interested. Any computer scientist or even any decent coder who wants to contribute to this may come up with a program that can use the API.
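
As a thought experiment, an app built on such an API might look something like the sketch below. The class and method names here are invented stand-ins for whatever interface Apple and Google eventually publish; they are not the real API.

```python
# Hypothetical sketch of an app sitting on top of an exposure-notification
# API. Every name below (ExposureAPI, its methods) is an invented stand-in,
# not the actual interface Apple and Google will publish.

class ExposureAPI:
    """Stand-in for the operating-system interface the app would call."""

    def start_broadcasting_and_scanning(self) -> None:
        print("OS starts advertising rolling IDs and logging nearby IDs")

    def report_positive_test(self, verification_code: str) -> None:
        print("OS shares this phone's recent keys after verifying the code")

    def check_exposures(self) -> int:
        print("OS compares published keys of confirmed cases to the local log")
        return 0  # number of matching encounters


# A health-authority app would mostly orchestrate these calls and present
# results; the sensitive bookkeeping stays inside the operating system.
api = ExposureAPI()
api.start_broadcasting_and_scanning()
if api.check_exposures() > 0:
    print("Notify the user of a possible exposure")
```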

Steven Cherry Do you think it would end up being managed by healthcare companies or healthcare NGOs, or are there going to be a million apps out there? Will we just go to GitHub and grab the one we like? Or how is this going to work?

Stephen Wicker I would hope that it's going to be managed by the healthcare industry in some way, because otherwise we would have a lot of applications, and some of them would not be as good as others. In fact, some may even include malware. We have to be careful about that. There'd have to be some sort of registration and licensing process to ensure that if we put these apps on our phones, they're doing exactly what they say they do. But I would think there'd be a handful that would emerge, associated with the larger healthcare groups.

Steven Cherry And so how does this app fit into sort of the broader picture of contact tracing? I mean, let's say I'm at the receiving end and I find out that somebody I've been in contact with yesterday has tested positive today. What happens next?

Stephen Wicker OK. I would expect that that individual would have the opportunity to be tested. You'd have that chance, even though there aren't as many tests as there should be in this country; I would think that you would qualify for a test if someone you've been in contact with tested positive. A quarantine would be a logical next step. You know, you may not have symptoms yet, but you may in a couple of days. And then the basic things we've been asked to do recently, for example, if we've been traveling or something like that.

Steven Cherry In the 1890s, there was a large outbreak of tuberculosis that was fought in part by a system, new at the time, of notification and treatment directly by local health departments. And as far as I can see, this was like the classic tradeoff: you lost your privacy and you gained access to treatment. But there was this one twist to it. Doctors were allowed to request that health department inspectors not visit their patients; they, the doctors, would treat them themselves. So we ended up with a two-tier system in which the upper and middle classes, who had doctors, were essentially exempt from the stigma of public knowledge of their illness, while the working class and the poor, who only had access to treatment through public clinics, had to allow the authorities to surveil them and publicize their case in exchange for treatment. I say all this to ask: do you see any potential for class distinctions, or any other distinctions, to be reinforced here, or is there a chance that our electronic, mobile-based modern contact tracing will be democratic and egalitarian?

Stephen Wicker OK, so, you know, there's one basic issue, namely that the more wealthy folks are, the greater their access to health care. I mean, that's unfortunately a fact in this country. You've got more rapid access to physicians. You can text them, whatever the case may be, and get a response. And those with fewer resources just aren't able to do those things. Now, one of the things about code, about software, is that it can be more democratic, it can be more egalitarian, in the sense that we're all using the same platform, we all get the same performance. And this is the ideal. This was the hope for the Internet when it first emerged as a commercial system in the mid-'90s: that it would be this egalitarian platform. And the cellular handset has that capability as well, that potential. It's unfortunate that it's been taken over by those who wish to provide us with programming, but it doesn't have to be that way.

So let's focus now on the specifics of this issue. We can have a situation in which the basic contact tracing is highly democratic, in the sense that everyone's contacts are treated the same. Now, once you're seeking health care, you're back in that old system of the wealthier you are, potentially the better the health care you're going to get. But there's no reason why this contact tracing can't be highly democratic.

Steven Cherry So overall, are you more hopeful or fearful about this system if and when it comes about?

Stephen Wicker You know, I have mixed feelings, because, as you know, I'm aware, in fact, of your tuberculosis example from the 1890s. I want to go back to the 1850s. There's a classic example: an English physician named John Snow collected data from a cholera outbreak, and using location data, he was able to determine where the hotspot was and literally shut down a particular pump that was providing contaminated water. There are a lot of wonderful things you can do with this information. My concern is what's going to happen when we move on. What's going to happen when we're no longer in crisis mode? Are we going to shut down these capabilities, or is it going to become something that is part of everyday existence? 9/11 was a horrible thing. You know, it affected all of our lives. But a lot of the laws that were passed as emergency measures, providing certain surveillance capabilities, are things we're still dealing with. And I am concerned that if we develop technologies and build them into our handsets, we may never get them out when we're through with this particular crisis. So, you know, I feel very positive in the sense that it could have a big impact. I'm concerned that we won't be able to turn it off when the time comes.

Steven Cherry There are WWI-era laws that we're not too happy are still on the books.

Stephen Wicker Oh, my goodness, yes. There are First Amendment cases that go back to WWI where you just wouldn't believe it. Really? I would go to jail for saying these things? Yeah, well, we tend to react strongly when we are confronted by these events, tragedies, whatever the case may be. We don't do so well when it comes to going back to normal.

Steven Cherry In this case computer code could supersede legal code, though, right? I mean, it would be possible for Apple and Google to just say, well, we're done with the Coronavirus. We're turning off this capability now.

Stephen Wicker Well, Apple has been very good about asserting the privacy rights of its customers. There was a case in which the FBI wanted to open up an Apple phone that had been encrypted, and Apple fought against it, and I think rightly so. So there is a lot of power in these corporations. It's been said that code is law; I think it was Larry Lessig who said that first. But the basic issue here is that the software will do what the software does, and if it's turned off, it's turned off. And as we've seen, our Congress isn't always up to speed on what the latest technology is doing. So there is a lot of power in the hands of these companies. They can go the wrong way, though. We may have companies that choose to continue their practices regardless of what the law says, and that can be highly problematic.

Steven Cherry Well, Steve, there are still a lot of unknowns, even a lot of unknown unknowns, in the parlance of a long-gone secretary of defense. So if this thing ever comes off, we may have to have you back to hear how it's going, not just epidemiologically but in terms of privacy and security. Thanks for your time today.

Stephen Wicker My pleasure. Nice talking to you.

We've been speaking with network and privacy researcher Stephen Wicker about a new form of contact tracing, American-style, in the age of the Coronavirus and beyond.

For TTI/Vanguard, I'm Steven Cherry.

This interview was recorded 21 April 2020.

Music by Chad Crouch

Audio engineering by Gotham Podcast Studio, New York, N.Y.

We welcome your comments @ttivanguard and @techwiseconv

Note: Transcripts are created for the convenience of our readers and listeners. The authoritative record of TTI/Vanguard’s audio programming is the audio version.

Resources

Apple and Google Want to Turn Your Phone into a Covid-Tracking Machine

Updated Apr 13, 2020

https://www.vox.com/recode/2020/4/10/21216675/apple-google-covid-coronavirus-contact-tracing-app

Seoul’s Radical Experiment in Digital Contact Tracing

April 17, 2020

https://www.newyorker.com/news/news-desk/seouls-radical-experiment-in-digital-contact-tracing

Public Goods, Private Data: HIV and the History, Ethics, and Uses of Identifiable Public Health Information

Public Health Rep. 2007; 122(Suppl 1): 7–15.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1804110/

How Apple and Google’s Coronavirus-Tracking Technology Works

April 14, 2020

https://www.fastcompany.com/90490059/how-apple-and-googles-coronavirus-tracking-technology-works

Author: Steven Cherry

Director of TTI/Vanguard, “a unique forum for senior-level executives that links strategic technology planning to business success. In private conferences that are part classroom, part think-tank, and part laboratory, its members—corporate and government leaders, entrepreneurs, researchers, and academics—explore emerging and potentially disruptive technologies.”

Twenty years' experience as a technology journalist and editor, at the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Founded the award-winning podcast series Techwise Conversations, covering tech news, tech careers and education, and the engineering lifestyle. Teaches an intensive writing class as an adjunct instructor at NYU. Previously taught essay writing and creative writing at The College of New Rochelle.
