


Risk, Security, & Privacy
September 12–13, 2017
Washington, D.C.

Johns Hopkins University Applied Physics Laboratory
11100 Johns Hopkins Road
Laurel, Maryland 20723
September 14, 2017

BY August 18, 2017



conference overview
Risk and security can no longer be separated, if they ever could. In fact, at some companies cybersecurity is being moved out of its corporate silo into a broader basket of risk management. The argument has been made that Target had the right strategy in neglecting cybersecurity: the financial hit from its breach, the argument goes, still cost it less than the cost and friction of a proper cybersecurity program, which might not have been effective anyway. That view is too cynical for most organizations, but a realistic assessment that treats cybersecurity as just one element in a broad framework of costs, risks, and rewards is surely in order.

We’ll look at a variety of risks that need to be managed, focusing on those introduced by computer systems, mobile devices, IoT, and cloud computing. We’ll also consider the tensions and tradeoffs among security, efficiency, customer satisfaction, and privacy.

list of speakers

Julie Ancis says that as we automate more and more processes, we need to guard against the biases and other implicit associations that weaken the algorithms we construct, just as they impair human decision-making.

What’s next after 2-factor authentication? Andrew Bud of iProov will argue it’s machine-learning-based facial recognition.

Jason Hong, author of “Toward a Safe and Secure Internet of Things,” says there is only one point in time when the foundation is laid for how computation, communication, and sensing will be woven into our physical world, and that time is now.

Code at the level of the chip has the potential to be more secure than any other. Yet the potential for risk there is also greatest. Serge Leef of Mentor Graphics will discuss both.

John Nay’s startup, Skopos Labs, uses machine learning to predict the outcomes of votes on Congressional legislation.

Russ Warner will argue that the polygraph hasn’t evolved in 80 years and never worked well in the first place. His alternative? A lie detector that uses a big data analysis of eye movement.

We think AI will be a big part of the solution when it comes to malware. Vincent Weafer of Intel Security asks, “What about when the black hats discover machine learning?”

According to Arvind Narayanan, you’re not nearly as anonymous as you would like to be. Netflix, for example, went to great lengths to anonymize the 500,000-subscriber Netflix Prize dataset, yet Narayanan and a fellow researcher have shown that an adversary who knows only a little bit about an individual subscriber can easily identify that subscriber’s record in the dataset.
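The core linkage idea behind that kind of de-anonymization can be sketched in a few lines: score every anonymized record by how well it matches the adversary's auxiliary knowledge. This is only an illustrative toy, not Narayanan's actual algorithm (which also tolerates noisy ratings and fuzzy dates); all names and data below are invented.

```python
def match_score(record, aux):
    """Fraction of the adversary's known (movie, rating) pairs
    that appear in an anonymized record."""
    hits = sum(1 for movie, rating in aux if record.get(movie) == rating)
    return hits / len(aux)

def best_match(dataset, aux):
    """Return the record id whose ratings best fit the auxiliary info."""
    return max(dataset, key=lambda rid: match_score(dataset[rid], aux))

# Toy "anonymized" dataset: record id -> {movie: rating}
dataset = {
    "rec_1": {"Movie A": 5, "Movie B": 2, "Movie C": 4},
    "rec_2": {"Movie A": 3, "Movie B": 2, "Movie D": 1},
    "rec_3": {"Movie C": 4, "Movie E": 5},
}

# The adversary knows just two of the target's ratings.
aux = [("Movie A", 5), ("Movie C", 4)]
print(best_match(dataset, aux))  # prints "rec_1": the only record matching both
```

Even this crude scoring shows why a handful of data points can suffice: in a sparse, high-dimensional dataset, very few records match even two or three known ratings.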

Suzanne Barber is the Director of the Center for Identity at The University of Texas at Austin, which has developed the ID360 Scorecard, a tool that helps organizations assess the maturity and risks of their identity management and their ability to combat fraud.

Maciej Ceglowski compares data to nuclear waste, which comes in two flavors: extremely radioactive, concentrated material, and low-grade waste such as contaminated topsoil. In data, the analogues are especially sensitive financial and medical records on one hand, and bulky, low-grade but still dangerous data on the other, such as fitness-tracker logs that might reveal you're having an affair on your lunch hour.

Arwen P. Mohun, author of the 2013 book, Risk: Negotiating Safety in American Society, says that there have been three stages in the evolution of risk in American society, matching three stages of society itself: the pre-industrial, industrial/manufacturing, and consumer. By studying this progression, we can see the outline of a fourth understanding of risk, one that matches the digital era of cyber-insecurity.
