Source: The New York Times examined 150 widely known privacy policies and evaluated them using the Lexile test developed by the education company MetaMetrics. The test measures a text’s complexity based on factors like sentence length and the difficulty of vocabulary. According to the most recent literacy survey conducted by the National Center for Education Statistics, over half of Americans may struggle to understand dense, lengthy texts. This means a significant chunk of the data collection economy is built on consent to complicated documents that many Americans simply can’t comprehend.
The Data Collection Economy
As governments and regulatory agencies enact sometimes confusing or complex laws and take swift action against companies that fail to protect consumer privacy, the market research industry is scrambling to meet new obligations around data privacy and transparency. To understand the breadth and depth of new data regulations and their impact on our industry, we must first understand the wider ecosystem we operate in. Data-hungry industries, including the tech giants, are making moves. Google recently announced its intention to phase out support for third-party cookies in Chrome, with the goal of enhancing consumer privacy through a series of API tools in what Google has dubbed the “Privacy Sandbox.” In sharp contrast, Facebook announced a new first-party cookie that will aid publishers and advertisers in measuring and optimizing Facebook ads, allowing publishers to analyze browser data blocked by third-party cookies.
It’s both interesting and alarming to see two tech goliaths take such different positions on privacy; it’s no wonder that many research companies see privacy as a burden or a hurdle to “overcome.” However, simply preparing to meet tech and legislative requirements as they are introduced is a short-sighted strategy, and it misses the opportunity to develop methodologies and operating procedures that align with the ethics and values consumers demand. As we embark on a new decade, a stance of radical transparency is the only position that meets the demands of policy, gives consumers a fair opportunity to understand and protect their data privacy, and invites the public at large to truly embrace market research.
As a veteran member of WIRexec, I had the privilege of crossing paths with Kerry Edelstein a number of years ago. Not only did we grow up in the same hometown and start our research careers at the same company, but we soon discovered a shared passion for several issues. A kindred connection was born, and we have been threatening to collaborate ever since. Over this past summer, we decided now was the time; our shared commitment to data ethics and privacy was undeniable, and the industry needed to hear our voices. Soon after, InnovateMR teamed up with Kerry’s firm, Research Narrative, to conduct a groundbreaking study to better understand the participant perspective on data privacy. We had a lot of questions: What do consumers know about data privacy? What do they care about? How can researchers be trustworthy stewards of data? Our research also sought to address topics such as:
- When it comes to “data privacy,” how differentiated is market research from other data-heavy industries like media, ad tech, digital analytics, and healthcare? Do Americans understand the difference?
- What impact have widely covered data breaches had on Americans’ perceptions of privacy and the trustworthiness of different industry sectors?
- What are users’ expectations of industry and corporate America, when it comes to data privacy and remediation?
- Will people pay for privacy? Are they interested in being paid to give up privacy?
- Where do Americans stand on data privacy oversight and regulation?
Our team collected 2,000 interviews from American consumers ages 18-79 in a 22-minute online survey that was balanced for age, ethnicity, and gender representation relative to the US Census. Participants were recruited from InnovateMR’s double-opt-in panel, PointClub, and the sample was balanced for psychographic considerations such as panel tenure (length of time in the panel), panel engagement (frequency of past participation), and recruitment source (how panelists were sourced to join the InnovateMR panel). Survey responses were screened for several fraud indicators using Innovate’s next-generation fraud-mitigation technology.
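To illustrate the kind of demographic balancing described above, here is a minimal post-stratification sketch: each respondent category receives a weight equal to its census target share divided by its observed sample share, so the weighted sample matches the census distribution. This is a generic illustration with hypothetical numbers, not InnovateMR’s actual balancing pipeline.

```python
from collections import Counter

def balance_weights(respondents, census_shares):
    """Compute post-stratification weights so weighted demographic
    shares match census targets.

    respondents: list of category labels (e.g. age bands)
    census_shares: dict mapping category -> target proportion
    """
    n = len(respondents)
    counts = Counter(respondents)
    # weight = target share / observed share for each category
    return {cat: census_shares[cat] / (counts[cat] / n) for cat in counts}

# Hypothetical example: the raw sample over-represents the 18-34 band
sample = ["18-34"] * 60 + ["35-54"] * 25 + ["55-79"] * 15
targets = {"18-34": 0.30, "35-54": 0.35, "55-79": 0.35}
weights = balance_weights(sample, targets)
# Over-represented groups receive weights below 1; under-represented
# groups receive weights above 1, pulling shares back toward census.
```

In practice, panel providers balance on several dimensions at once (e.g. via quota sampling or rim/raking weighting), but the principle is the same: correct the sample so it reflects the population it claims to represent.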
What Did We Uncover? Transparency is Needed!
Trust Is Earned, Not Given
According to our research, 88% of Americans wish companies would be more transparent with how they use personal data. Furthermore, 70% of respondents indicated that they felt as though companies today do not want consumers to understand how their data is used. This statistic alone paints a gloomy outlook for researchers and reinforces our greatest fear: if consumers don’t trust companies with their data, do our surveys appeal to a sufficient cross-section of the populations we seek to represent?
The Perception of Industry
Our study revealed that the Social Media industry has a lot of trust to build; Social Media is considered 60% “very” or “somewhat” unethical in its use of personal data, with Advertising taking silver at 54%. Market Research fared better at 29%; however, we are not the star athletes here: Academia (18%) and Healthcare (13%) achieved the lowest ratings on the unethical scale. While our industry should feel some relief that we don’t share the same status as Social Media and Advertising, 29% indicates there is significant room for improvement and a desire from consumers for us to behave in a more transparent manner. Interestingly, the GRBN, a not-for-profit MR industry association, presented a similar narrative in its 2018 Trust Survey, where respondents indicated a 27% trust rating among MR firms. It has been nearly three years since the GRBN collected this data, and sadly, our study suggests we have failed to move the needle.
Private Conversations Are Not So Private
Our data uncovered that consumers find themselves at a troubling intersection where a lack of security, privacy, and transparency has created a very real and material lack of trust in emerging forms of data collection. When presented with several scenarios, consumers indicated a heightened sense of fear and concern that passive data collection activities fail to meet basic ethical standards; 73% of consumers expressed concern about companies recording and storing private conversations and messages.
Respect My Data Rights, or I’ll Support Regulation!
While we did see a lack of specific awareness around regulations such as GDPR and CCPA, the result of this distrust is that an overwhelming majority of consumers now seek regulation at both the state and federal level. What is even more fascinating is that our findings were consistent across political affiliation, making this a truly bipartisan demand in an otherwise polarizing political environment. Although we can’t agree on many facets of public policy, when it comes to data regulation consumers are squarely aligned regardless of where they fall on the political spectrum.
More than three-quarters of Democratic, Independent, and Republican participants expressed support for the following regulations:
- Inform consumers of personal data uses and allow an opt-out of those uses
- No third-party data sharing without active opt-in
- No sharing of data with government agencies without notification
- Consumer access to all data collected about them
Collected Without My Permission
While lengthy privacy policies (in minuscule font) might technically capture consent, our study revealed that consumers don’t recall giving permission for many types of data collection. Over half of the participants surveyed believe their web browsing, search history, and site visitation (54%), along with purchase behaviors (49%), are being tracked without their permission. Nearly 50% of Americans see companies following them with targeted ads and don’t recall providing consent.
Percentages reflect the % of respondents who believe this information is collected from them, without permission.
While new technologies and emerging methodologies can help us get closer to the truth, we must not lose sight of the impact these approaches have on consumer privacy and the perceptions we cultivate among research participants. Behavioral-tracking technologies (when bridged with self-reported data) can provide researchers with an exceptionally comprehensive consumer outlook; however, if the ethical and legal considerations are not respected, we run the risk of compromising the very thing we seek to instill: trust.
While we may feel overwhelmed, transparency can and should be regarded as an opportunity, not a burden. There are material changes MR companies can employ to win over consumers:
- Be explicit – provide the purpose and need for this data and how it benefits not only your company but the participant as well.
- Ensure that an active opt-in is presented during each instance of personal data collection. It is unfair to assume that consent captured at panel registration gives companies free rein to collect additional personal information in perpetuity. Each business scenario is different, and the drivers for personal data collection vary from project to project; advise respondents every step of the way and reconfirm their consent.
- Clearly explain why you re-collect redundant information such as demographic data. It’s no secret that, in the age of programmatic sampling, many systems do not adequately map demographics and as such these variables are re-collected (and often in the same user session). Previous research-on-research we have conducted indicates that this process feels unusual and disrespectful to participants. If you need to collect information that has been previously provided, explain to participants why you need to do this, otherwise you run the risk of participants feeling annoyed and even suspicious of your intentions.
- Ensure there is a sufficient feedback loop for survey inquiries and find ways to share non-proprietary data with respondents. Giving back to your survey respondents has been shown to increase participation rates and offset attrition. Strive to make the participant experience transformational, not transactional.
- Fully document your participant data flow and security mechanisms, and implement standard operating procedures designed to mitigate the risk of a data breach. Mandate employee and third-party compliance through ongoing training and e-signature agreements.
- Socialize knowledge throughout the entire organization and develop processes to support continued education on this dynamic and complex topic.
- Establish a regular cadence for revisiting privacy protocols and partner agreements, and ensure all internal stakeholders have a seat and a voice at the table.
- Get involved! Don’t be intimidated by consumer privacy. Find ways to engage with your organization and the wider industry. Share your commitment to data ethics with others and use your personal influence to help your colleagues gain a deeper understanding.
Interested in learning more about our Data Privacy study? Do you want to get involved in future dialogue around data privacy and consumer trust? We’d love to hear from you! Request an Executive Summary of our study and contact us at Lisa@innovatemr.com or Kerry@researchnarrative.com