• People-based marketing only works with data. The more our data use focuses on the recipient’s best interests, the more welcome our marketing efforts are.

    Laws governing personal data use for marketing and other purposes have not caught up with the increasingly data-driven world we live in. The European Union’s General Data Protection Regulation (GDPR) sets out to address this gap. In doing so, it is becoming the global standard that all companies handling European data must comply with come May 25, 2018.

    To learn more about GDPR, data policies in the U.S., and their effect on consumers, we spoke with Sheila Colclasure, Global Chief Data Ethics Officer for Acxiom and LiveRamp. Listen to the audio here or read the interview below.

    RampUp: Sheila Colclasure, thank you so much for sitting down with us and talking all things data ethics, data privacy, and GDPR, which is just one of those acronyms we’ve been hearing so much about in our field. I wanted to talk to you because we’re living in a time when consumers are hyper-aware of data collection. They know that when they click on something via Facebook or Google, something else is going to happen to them as a result of the action they took.

    So, how does this consumer perspective on data and marketing and the experiences they have across channels and platforms play into how laws like GDPR are drafted, debated, revised, and finally enacted?

    Sheila: As the world accelerates into the digital era, consumers and regulators are more aware of data capture and data use. In particular, the regulators are concerned about observational data because in the digital era, we have near full-time observation. We, as consumers and business people, are all generating data every moment we carry around our smart device or interact with a connected device. So that has driven some concern and the need to modernize data protection laws.  

    GDPR, in particular, is the buzzword of the day. It is white hot. GDPR stands for General Data Protection Regulation, and it is a modernization of European law. Historically, data protection in Europe has been governed primarily by country law, enacted pursuant to the European Union (EU) Data Protection Directive (95/46/EC), which required each member state to enact national data protection law consistent with the directive’s framework.

    The GDPR is part of something called the Digital Single Market Package, an acknowledgement that the world is becoming digital, and the market is becoming digital. For the digital market to work efficiently, it has to be interoperable, so the data protection law that governs the data that drives the digital market needs to be interoperable as well.  

    So, that’s what GDPR is intended to be — a pan-European standard to govern data collection and use in the digital era. That’s sort of part one of the answer.

    We are accelerating into a time where all of our human experiences will be driven by data. 

    The second thing is in the digital era, with all of this data collection, we are accelerating into a time where all of our human experiences will be driven by data. We are at an inflection point in human history where to harness all of the digital data needed to drive our digital experiences, we must use advanced algorithms. As the mass of data increases exponentially, we will see algorithmic technology become more sophisticated, make use of the data, learn from the data, and improve our human lives.  

    As you can imagine, there’s a certain amount of fear and certainly some mistrust. There’s mistrust by the regulators because some businesses have either been careless or inadvertent and haven’t handled or protected data well. We all receive the blowback from that — which means an erosion of trust in the commercial marketplace, specifically as it relates to data stewardship.  

    There is the consumer activist community and the civil libertarians who want to protect consumers and are the protective voice. They too have questions about the trustworthiness of business capture and use of data in the digital era.

    So, that is why we are seeing a tightening of laws, and certainly the GDPR is a big, massive body of law — and some of it is fairly disruptive to business as usual.  

    RampUp: We think of GDPR as something happening in Europe, but obviously it affects anybody who does any sort of business with European data involved. So, would you say that Europe is kind of a first mover in regard to having a new data policy that reflects the data-driven world that we live in?  

    And following on that, you’ve also spoken about the need for global data policies. How can these global data policies take shape within companies and brands or organizations to be able to comply with GDPR and potentially other types of similar laws coming in the future?

    Sheila: You’re exactly right. I don’t know if I would call GDPR a first mover, but it is certainly a well-thought-out, very fully developed body of law. Europe has been quite strident about this concept called adequacy.

    Because GDPR is well developed and comprehensive, and the European data protection authorities have been strident and emphatic about their conservative approach to data protection, it is setting the tone. The idea of adequacy is that data can’t flow from Europe into your country unless your country’s law is adequate, or rather “materially equivalent.” This is absolutely driving GDPR as a global standard.

    Now, I want to contrast that for just a moment with the United States. The United States is a somewhat different marketplace culture than Europe, and we have a different legal structure than Europe. Our legal structure around data is data-use driven. We don’t have one comprehensive national data protection law. What we have are sectoral data laws that regulate the different uses of data.

    We have the FCRA (Fair Credit Reporting Act), which regulates the credit world, insurance underwriting, and employment. We have the GLBA (Gramm-Leach-Bliley Act). We have HIPAA (Health Insurance Portability and Accountability Act). We have HITECH (Health Information Technology for Economic and Clinical Health Act). We have CAN-SPAM (Controlling the Assault of Non-Solicited Pornography and Marketing Act). And we have other laws that regulate the different uses of data.

    So, while in the United States we don’t have one national data protection law, we do have sectoral laws that regulate the use of data fairly aggressively. What we do in the United States, as a general rule, is outlaw harmful uses of data.

    In Europe, it is somewhat the inverse. In Europe, in general terms, the legal approach is to say what is permissible. This is an inverted approach to how data is regulated in the U.S.

    This differentiated legal approach is an important distinction, because what is and is not an acceptable use of data is really based on our cultural norms and our social values. An example that is frequently referenced is the history of data collection and use in Germany during the time leading up to World War II. This is a very different history of data collection and use than, say, that of the U.K. Germany’s history has informed its societal values and, correspondingly, a very thoughtful and careful regulatory approach to the use of data.

    Think of laws as a codification of these social norms, things that, as a society, we collectively believe are okay or not okay.

    If I were going to pick one thing that we all need to be crystal clear on in GDPR, it is the notion of “legal grounds to process data.” There are six legal grounds in GDPR, but as a matter of practical effect, the regulators have left us adtech players with only one legal basis: consent. Consent creates considerable friction and click-fatigue, disrupts innovation, and, quite candidly, is not the best protection for data subjects.

    What is the best protection for European citizens? The best protection is one that requires businesses to be accountable for the impact of data use on individuals. Accountability means five things: being committed, having a method to put the commitment into effect, monitoring to ensure the method is working, transparency and meaningful control for the individual, and standing ready to demonstrate that promises are being met. Businesses need to design their data-activated solutions in ways that ensure that there’s no hidden bias or discrimination and that the result of the use of data would be deemed fair by the data subject. This approach really addresses the data stewardship and trust issues and provides the individual with greater protection.  

    As we accelerate into the world of machine learning and artificial intelligence, operating against a unified ethical framework is vital. We should design our advanced algorithms in ways that are human-centered, prevent harm, and deliver outcomes that are beneficial. If we all operate ethically and design our data-driven solutions with an ethical method, then the outcomes will be good for all of us, for our children, and for the world.

    RampUp: So, it seems like everyone is thinking ahead to May of next year as a kind of deadline to comply with GDPR. It sounds like there’s quite a bit of thinking that companies and brands have to do for years to come regarding how they use data and how they’re going to comply with various laws. In the immediate term, many statistics suggest that a high percentage of companies will not be able to comply with GDPR as it’s written today come May of next year.

    What’s your take on this thought that companies just will not be ready and may have to suffer financial consequences?

    Sheila: I think that’s an inevitability. Remember, GDPR was finalized in April of 2016. The lawmakers gave businesses essentially two years to remediate, adjust their business practices, and get into compliance with the GDPR. We’ve had a two-year runway to get ready, and May 25, 2018, is the enforcement date. Of course, as we all know, the enforcement and consequences for failure under GDPR could be extraordinary.

    RampUp: Last question, and this relates more to consumers and the experience they will have with brands once GDPR is in effect. How do you think this will fundamentally change the way people interact with the brands they love, or may grow to love, moving forward?

    Sheila: Well, I hope that GDPR gives the entire consumer marketplace greater trust in business. We certainly need that. There have been countless incidents in the news that have called into question the trustworthiness of business. GDPR, I hope, will reinforce and shore up consumer trust in business.

    Really, brands want to delight their customers. They want to serve them, build lifetime relationships with them, and they want the customer to trust them. Hopefully, that’s what GDPR will achieve.
