Below is a summary of a conversation with John Verdi, Vice President of Policy at the Future of Privacy Forum (FPF) and a panelist at the Data Done Right conference held at the U.S. Chamber of Commerce in Washington, D.C., on July 11, 2019. Mr. Verdi supervises FPF’s policy portfolio and has written extensively about the privacy policy landscape.

    Q1: In the op-ed you wrote for The Hill last year, you encourage people to get to know their privacy settings in order to avoid a “data use disappointment.” I have a two-part question. First, what advice would you give manufacturers of connected devices about designing the products they sell?

    JV: My recommendation would be to focus on building products and services in ways that are consistent with consumer expectations and/or delight consumers. The worst spot manufacturers can find themselves in is one where they surprise people in ways that feel negative and invasive. If they instead surprise people with features they love and benefit from, it often goes over really well. The problem comes when a feature is seen as benefiting not the consumer but others, whether the manufacturer itself or a related entity. That fails to strike the right balance, and that’s when backlash can happen.

    Q2: Second, do you think checking privacy settings is a reflex that consumers need to build up when they buy new products, much like the safety belt education campaign in the 1980s?   

    JV: It would be great if consumers did build up that reflex, but frankly it isn’t a realistic expectation. Given the number of products people use, the time it would take to go through that process for every single one just isn’t tenable. What we’ve seen work better is when the manufacturer conducts a privacy “check-up”: affirmatively reaching out to consumers and telling them what the default privacy and security settings are, rather than forcing them to select their privacy preferences up front. These check-ups are one of the most popular approaches we’ve seen.

    Q3: We know that data will continue to grow at an exponential rate and is constantly generated across every aspect of our lives and our environment. What tools do we have at our disposal to help manage the risk and maximize the rewards from an ever-expanding data landscape?  

    JV: The good news is that we have a number of privacy-enhancing technologies (PETs) at our disposal that can be used by any steward of personal data. For example, homomorphic encryption can be a highly effective tool, preserving the utility of data sets while safeguarding that data from breaches and privacy invasions. De-identification and the ability to make data pseudonymous can make it harder to associate that data with individuals. This is done through a procedure in which personally identifiable information fields within a data record are replaced by one or more artificial identifiers, or pseudonyms, so it’s harder to link the data back to a specific person. Many more technologies and frameworks are available to help organizations not only manage the data but also manage the risk of noncompliance with current and potential privacy regulations.
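
    As a rough illustration of the pseudonymization procedure Mr. Verdi describes, the sketch below replaces directly identifying fields in a record with artificial identifiers, here generated with a keyed hash (HMAC), one common way to produce pseudonyms. The field names, secret key, and record layout are illustrative assumptions, not a description of any particular organization’s implementation.

```python
# Minimal pseudonymization sketch: identifying fields are replaced by keyed,
# irreversible pseudonyms; all other fields pass through unchanged.
# Assumptions: "email" and "phone" are the identifying fields, and the key is
# stored separately from the pseudonymized data.
import hmac
import hashlib

PII_FIELDS = {"email", "phone"}                    # assumed identifying fields
SECRET_KEY = b"rotate-me-and-store-separately"     # kept apart from the data set

def pseudonymize(record: dict) -> dict:
    """Replace each identifying field with a keyed pseudonym."""
    out = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]   # artificial identifier (pseudonym)
        else:
            out[field] = value                     # non-identifying data is preserved
    return out

record = {"email": "jane@example.com", "phone": "202-555-0100", "plan": "premium"}
print(pseudonymize(record))
```

    Because the same input always maps to the same pseudonym, records can still be joined and analyzed, but re-identifying an individual requires access to the separately held key.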

    Q4: Most people aren’t aware that the Future of Privacy Forum offers master classes for regulators, policymakers, and staff seeking to better understand the data-driven technologies at the forefront of data protection law & policy. Why do you think it’s important for them to have this knowledge, in contrast to, let’s say, a background in foreign relations?

    JV: Data regulators are typically experts in law, policy, and government. Some are technologists, but that’s not the usual profile. There’s broad agreement that for these officials to appropriately regulate data, they should have a better understanding of not just policy and politics but also the technology itself, as well as how business works in different sectors. From machine learning and artificial intelligence to adtech and mobile technology, in order to make nuanced decisions, it’s important that enforcement agencies and others in government understand the underpinnings of how these technologies work and the economic forces that drive them. They don’t have to be engineers, but we help them become more conversant in technological capabilities as well as business models by providing a robust curriculum that’s authoritative, accurate, and neutral.
