
The Future of Data Privacy

by Lisa Wilding-Brown, Chief Research Officer, Innovate MR

February 1, 2020

Market researchers are, perhaps more than professionals in any other industry, at risk of getting lost in a chasm of buzzwords and jargon when it comes to defining the work that we do...

Some of these terms are borrowed from science; others introduce a new methodology and signal an evolved insight into the questions we seek to answer. And then there are words like “privacy” whose use — complicated by its inexplicit, culturally amorphous, and rapidly changing definition — can create distance between ourselves and our most valuable asset: the research participant. Just take a look at the average privacy policy and you will quickly discover what I’m getting at.

Last year, The New York Times published a noteworthy article on the oblique and incomprehensible nature of 150 prevalent privacy policies, characterizing them as a “disaster”: the vast majority were deemed unreadable, requiring a college reading level or beyond. The article also noted that many policies serve to protect companies, not consumers; they are written by lawyers, for lawyers. Google’s privacy policy, for example, has grown dramatically in complexity and length, from roughly a two-minute read in 1999 to a nearly 30-minute read in 2018. It is safe to say that our industry is no exception to this troubling status quo; our appetite for buzzy words (and their respective imposing methodologies) has made its way into our privacy policies, leaving consumers and researchers equally mystified.

Source: The New York Times examined 150 widely known privacy policies and evaluated them using the Lexile test developed by the education company MetaMetrics. The test measures a text’s complexity based on factors like sentence length and the difficulty of vocabulary. According to the most recent literacy survey conducted by the National Center for Education Statistics, over half of Americans may struggle to understand dense, lengthy texts. This means a significant chunk of the data collection economy is based on consenting to complicated documents that many Americans simply can’t comprehend.
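Lexile scores are proprietary, but the ingredients the Times describes, sentence length and vocabulary difficulty, also drive public readability formulas. As a rough, purely illustrative stand-in (not the Lexile algorithm or the Times’ methodology), here is a short Python sketch of the Flesch-Kincaid grade level applied to an invented policy excerpt:

```python
# Rough illustration of a readability formula in the same spirit as Lexile:
# the Flesch-Kincaid grade level combines average sentence length with average
# syllables per word. This is NOT the Lexile algorithm the Times used; the
# syllable counter is a crude heuristic and the excerpt below is invented.
import re

def count_syllables(word: str) -> int:
    # Approximate syllables as groups of consecutive vowels (minimum of one).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    total_syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return (0.39 * len(words) / len(sentences)
            + 11.8 * total_syllables / len(words)
            - 15.59)

policy_excerpt = (
    "We may share aggregated or de-identified information with affiliates, "
    "service providers, and other third parties for analytics purposes."
)
print(f"Approximate US grade level: {flesch_kincaid_grade(policy_excerpt):.1f}")
```

Even a crude formula like this makes the pattern easy to check against your own policy text.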

The Data Collection Economy

As more governments and regulatory agencies take swift action against companies that fail to protect consumer privacy, often by enacting confusing or complex laws, the market research industry is scrambling to meet new obligations related to data privacy and transparency. To understand the breadth and depth of new data regulations and their impact on our industry, we must first understand the wider ecosystem in which we operate. Data-hungry industries, including the tech giants, are making moves. Google recently announced its intention to phase out support for third-party cookies in Chrome, with the goal of enhancing consumer privacy via a series of API tools in what Google has baptized the “Privacy Sandbox.” In sharp contrast, Facebook announced a new first-party cookie that will aid publishers and advertisers in measuring and optimizing Facebook ads, allowing them to analyze browser data that would otherwise be lost to third-party cookie blocking.

It’s both interesting and alarming to see two tech goliaths take such different positions on privacy; it’s no wonder that many research companies see privacy as a burden or a hurdle to “overcome.” However, simply preparing to meet tech and legislative requirements as they are introduced is a short-sighted strategy that misses the opportunity to develop methodologies and operating procedures aligned with the ethics and values consumers demand. As we embark on a new decade, a stance of radical transparency is the only position that meets the demands of policy, gives consumers a fair opportunity to understand and protect their data privacy, and invites the public at large to truly embrace market research.

As a veteran member of WIRexec, I had the privilege of crossing paths with Kerry Edelstein a number of years ago. Not only did we grow up in the same hometown and start our research careers at the same company, but we soon discovered a shared passion for several issues. A kindred connection was born, and we have been threatening to collaborate ever since. This past summer, we decided now was the time; our shared commitment to data ethics and privacy was undeniable, and the industry needed to hear our voices. Soon after, InnovateMR teamed up with Kerry’s firm, Research Narrative, to conduct a groundbreaking study to better understand the participant perspective on data privacy. We had a lot of questions: What do consumers know about data privacy? What do they care about? How can researchers be trustworthy stewards of data? Our research also sought to address topics such as:

  • When it comes to “data privacy,” how differentiated is market research from other data-heavy industries like media, ad tech, digital analytics, and healthcare? Do Americans understand the difference?
  • What impact have widely covered data breaches had on Americans’ perceptions of privacy and the trustworthiness of different industry sectors?
  • What are users’ expectations of industry and corporate America when it comes to data privacy and remediation?
  • Will people pay for privacy? Are they interested in being paid to give up privacy?
  • Where do Americans stand on data privacy oversight and regulation?

Our Methodology

Our team collected 2,000 interviews from American consumers ages 18-79 in a 22-minute online survey that was balanced for age, ethnicity, and gender representation relative to the US census. Participants were recruited from InnovateMR’s double-opt-in panel, PointClub, and the sample was balanced for panel-composition considerations such as panel tenure (length of time in the panel), panel engagement (frequency of past participation), and recruitment source (how panelists were sourced to join the InnovateMR panel). Survey responses were screened for several fraud indicators using Innovate’s next-generation fraud mitigation technology.
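The balancing approach isn’t detailed beyond the census-relative targets above; one common way to align a completed sample with census margins is rim weighting (raking). The sketch below is a generic illustration under that assumption, with made-up targets and hypothetical column names, not the actual InnovateMR procedure:

```python
# Minimal sketch of rim weighting (raking): iteratively adjust respondent
# weights until the weighted sample matches target margins for each
# demographic dimension. Targets and column names are illustrative only,
# and every category present in the data must appear in the targets.
import pandas as pd

def rake(df: pd.DataFrame, targets: dict, max_iter: int = 50, tol: float = 1e-6) -> pd.Series:
    """df: one row per respondent; targets: {column: {category: proportion}}."""
    weights = pd.Series(1.0, index=df.index)
    for _ in range(max_iter):
        max_shift = 0.0
        for col, target in targets.items():
            # Current weighted share of each category on this dimension.
            current = weights.groupby(df[col]).sum() / weights.sum()
            # Scale each respondent's weight by target share / current share.
            factor = df[col].map(lambda c: target[c] / current[c])
            weights *= factor
            max_shift = max(max_shift, float((factor - 1.0).abs().max()))
        if max_shift < tol:  # stop once no dimension needs further adjustment
            break
    return weights / weights.mean()  # normalize so the average weight is 1

# Illustrative usage with made-up respondents and census-style margins.
respondents = pd.DataFrame({
    "gender":   ["female", "male", "male", "female", "male"],
    "age_band": ["18-34", "35-54", "55-79", "55-79", "18-34"],
})
targets = {
    "gender":   {"female": 0.51, "male": 0.49},
    "age_band": {"18-34": 0.30, "35-54": 0.35, "55-79": 0.35},
}
respondents["weight"] = rake(respondents, targets)
print(respondents)
```

In practice, quota sampling during fielding and weighting afterward are often combined; the point of the sketch is simply how “balanced relative to the US census” can be made concrete.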

What Did We Uncover? Transparency Is Needed!

The vast majority of participants indicated a desire for companies to be more transparent about how they use personal data. A 30-page privacy policy is simply not adequate! To add more complexity to the conversation: in 2019, IBM Security and the Ponemon Institute released the 2019 Cost of a Data Breach Report, based on in-depth interviews with more than 500 companies around the world that had experienced a data breach between July 2018 and April 2019. The analysis considered hundreds of cost factors, from legal, regulatory, and technical activities to loss of brand equity, customer turnover, and the drain on employee productivity. Based on the study’s findings, the average data breach costs a company in the United States 8.2 million dollars and takes an average of 245 days to fully identify and remedy. Further, notorious data breaches such as the Cambridge Analytica scandal have been widely publicized by the media. This combination of data security shortcomings, privacy violations, and dense privacy policies that lack transparency has created a culture of distrust when it comes to data privacy.

Trust Is Earned, Not Given

According to our research, 88% of Americans wish companies would be more transparent with how they use personal data. Furthermore, 70% of respondents indicated that they felt as though companies today do not want consumers to understand how their data is used. This statistic alone paints a gloomy outlook for researchers and reinforces our greatest fear: if consumers don’t trust companies with their data, do our surveys appeal to a sufficient cross-section of the populations we seek to represent?

The Perception of Industry

Our study revealed that the Social Media industry has a lot of trust to build: 60% of respondents consider Social Media “very” or “somewhat” unethical in its use of personal data, with Advertising taking silver at 54%. Market Research fared better at 29%; however, we are not the star athletes here: Academia (18%) and Healthcare (13%) achieved the lowest ratings on the unethical scale. While our industry should feel some relief that we don’t share the same status as Social Media and Advertising, 29% indicates there is significant room for improvement and a desire from consumers for us to behave in a more transparent manner. Interestingly, the GRBN, a not-for-profit MR industry association, presented a similar narrative in its 2018 Trust Survey, where respondents indicated a 27% trust rating for MR firms. It has been nearly 3 years since the GRBN collected this data, and sadly, our study suggests we have failed to move the needle.

Private Conversations Are Not So Private

Our data uncovered that consumers find themselves at a troubling intersection, where a lack of security, privacy, and transparency has created a very real and material lack of trust in emerging forms of data collection. When presented with several scenarios, consumers indicated a heightened sense of fear and concern that passive data collection activities fail to meet basic ethical standards; 73% of consumers expressed concern about companies recording and storing private conversations and messages.

Respect My Data Rights, or I’ll Support Regulation!

While we did see a lack of specific awareness around regulations such as GDPR and CCPA, the result of this distrust is that an overwhelming majority of consumers now seek regulation at both the state and federal levels. What is even more fascinating is that our findings were consistent across political affiliation, making this a truly bipartisan demand in an otherwise polarizing political environment. Although Americans can’t agree on many facets of public policy, consumers are squarely aligned on data regulation regardless of where they fall on the political spectrum.

More than three-quarters of Democratic, Independent, and Republican participants expressed support for the following regulations:

  • Inform consumers of personal data uses and allow an opt-out of those uses
  • No third-party data sharing without active opt-in
  • No sharing of data with government agencies without notification
  • Consumer access to all data collected about them

Collected Without My Permission

While lengthy privacy policies (in minuscule font) might technically capture consent, our study revealed that consumers don’t recall giving permission for many of the types of data collected. Over half of participants (54%) believe their web browsing, search history, and site visitation are being tracked without their permission, and nearly half (49%) believe the same of their purchase behaviors. Nearly 50% of Americans see companies following them with targeted ads and don’t recall providing consent.

Percentages reflect the share of respondents who believe this information is collected from them without permission.

While new technologies and emerging methodologies can help us get closer to the truth, we must not lose sight of the impact these approaches have on consumer privacy and on the perceptions we curate among research participants. Behavioral-tracking technologies, when bridged with self-reported data, can provide researchers with an exceptionally comprehensive view of the consumer; however, if the ethical and legal considerations are not respected, we run the risk of compromising the very thing we seek to instill: trust.

Take Action!

While we may feel overwhelmed, transparency can and should be regarded as an opportunity, not a burden. There are material changes MR companies can make to win over consumers:

  • Beyond the obligatory privacy policy, provide a concise, simple summary that distills the complex legal jargon. Participants should be clear on how your business collects, stores, shares, and destroys personal data.
  • As updates are made to your privacy policy, provide a summary of what has changed and give users the ability to review historic versions.
  • Be explicit: state the purpose of and need for the data, and explain how it benefits not only your company but the participant as well.
  • Ensure that an active opt-in is presented during each instance of personal data collection. It is unfair to expect that consent captured at panel registration gives companies free rein to collect additional personal information in perpetuity. Each business scenario is different, and the drivers for personal data collection vary from project to project; advise respondents every step of the way and reconfirm their consent (see the sketch following this list for one way such consent events might be recorded).
  • Clearly explain why you re-collect redundant information such as demographic data. It’s no secret that, in the age of programmatic sampling, many systems do not adequately map demographics, and as a result these variables are re-collected (often in the same user session). Previous research-on-research we have conducted indicates that this process feels unusual and disrespectful to participants. If you need to collect information that has been previously provided, explain why; otherwise, you run the risk of participants feeling annoyed and even suspicious of your intentions.
  • Ensure there is a sufficient feedback loop for survey inquiries, and find ways to share non-proprietary data with respondents. Giving back to your survey respondents has been shown to increase participation rates and offset attrition. Strive to make the participant experience transformational, not transactional.
  • Fully document your participant data flows and security mechanisms, and implement standard operating procedures designed to mitigate the risk of a data breach. Mandate employee and third-party compliance through ongoing training and e-signature agreements.
  • Socialize knowledge throughout the entire organization and develop processes to support continued education on this dynamic and complex topic.
  • Establish a regular cadence for revisiting privacy protocols and partner agreements, and ensure all internal stakeholders have a seat and a voice at the table.
  • Get involved! Don’t be intimidated by consumer privacy. Find ways to engage with your organization and the wider industry. Share your commitment to data ethics with others and use your personal influence to help your colleagues gain a deeper understanding.
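On the active opt-in and policy-versioning points above, one way to make them operational is an append-only consent log that ties every collection event to a purpose, a data category, and the exact policy version the participant saw. The following Python sketch is hypothetical, with illustrative field names rather than any actual InnovateMR schema:

```python
# Hypothetical sketch of a per-instance consent record: every personal-data
# collection event is traceable to an explicit opt-in, a stated purpose, and
# the exact privacy-policy version the participant saw. Field names are
# illustrative, not an actual InnovateMR schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentEvent:
    participant_id: str
    project_id: str          # consent is scoped to one project, not the whole panel
    purpose: str             # plain-language reason the data is being collected
    data_categories: tuple   # e.g. ("ad_exposure", "survey_responses")
    policy_version: str      # which privacy-policy revision was shown
    opted_in: bool           # must be an active opt-in, never a default
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def record_consent(log: list, event: ConsentEvent) -> None:
    """Append-only: prior events are never overwritten, so what a participant
    agreed to under each historic policy version stays reviewable."""
    log.append(event)

def may_collect(log: list, participant_id: str, project_id: str, category: str) -> bool:
    """Collection is allowed only if the latest relevant event is an active
    opt-in that explicitly covers this data category."""
    events = [e for e in log
              if e.participant_id == participant_id and e.project_id == project_id]
    latest = max(events, key=lambda e: e.timestamp, default=None)
    return bool(latest and latest.opted_in and category in latest.data_categories)

# Illustrative usage.
consent_log: list = []
record_consent(consent_log, ConsentEvent(
    participant_id="P-1001",
    project_id="PRJ-2020-017",
    purpose="Match survey answers to ad-exposure data for this study only",
    data_categories=("ad_exposure", "survey_responses"),
    policy_version="2020-02-01",
    opted_in=True,
))
print(may_collect(consent_log, "P-1001", "PRJ-2020-017", "browsing_history"))  # False
```

Because the log is append-only and keyed to a policy version, it also supports the earlier recommendation to let users review historic versions of what they agreed to.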

Interested in learning more about our Data Privacy study? Do you want to get involved in future dialogue around data privacy and consumer trust? We’d love to hear from you! Request an Executive Summary of our study by contacting us at Lisa@innovatemr.com or Kerry@researchnarrative.com.

About the Author

Lisa Wilding-Brown

Chief Research Officer, Innovate MR

Lisa Wilding-Brown has been in the industry since the inception of online research. Lisa’s specialties include sampling and survey design, panel development and management, and online and mobile recruitment, as well as full-service mixed-mode field management. As Chief Research Officer, Lisa is responsible for panel quality, best practices, and sampling methodologies, as well as spearheading the firm’s DIY platform, the InnovateMR Insights Platform. Prior to joining Innovate, Lisa was a member of the executive team at uSamp, where she developed the company’s global panel and led the firm’s mobile business division. Lisa got her start in market research at Harris Interactive, where she played an instrumental role in developing the company’s full-service online project management group. Wilding-Brown later joined the panel team, focusing her efforts on research-on-research, survey methodology, specialty panel development, and online user engagement for the Harris Poll Online.

Wilding-Brown serves on the boards of both the MMRA and UTA’s Master of Science in Marketing Research. Lisa has presented at conferences such as AAPOR, MRMW North America and Europe, and The Insights Association, and she has published research related to engagement, online survey design, online quality, and mobile best practices. Lisa chairs a committee focused on enhancing user experience for the Insights Association’s Online Sampling Forum, and she is an active contributor to the GRBN, Quirk’s, Greenbook, and WIRexec.
