A Strategic Guide for Ethical Data Management
Data Privacy: An Overview
“Data privacy,” a term that often prompts unease due to its inherent complexity, essentially refers to an individual’s control over their data — its collection, use and dissemination.
Businesses acquire such data through form submissions, cookies and geolocation tools. Marketers and companies use collected information for advertisement or share it with other companies as revenue streams.
Data privacy puts you, as a consumer or website user, at the helm of your data’s security. It enables you to track where and when your data is distributed, traded, or utilized for advertising.
Transparent disclosures on websites, newsletters, contact forms and alert sign-ups aim to help users control their data privacy more effectively. However, not all websites are transparent about their data practices, and many users pay little attention to data privacy notices, which leads to escalating issues and potential penalties.
Importance of Data Privacy
The risk of falling prey to malicious threats escalates when your data privacy is jeopardized. For instance, if the wrong hands acquire confidential information like your social security number or birthdate, you could be a victim of identity theft.
Once marketers acquire your mobile number or email, they can target you with sometimes unwanted marketing campaigns.
That’s not to say that all marketing is unwanted — far from it. Some people search for products specifically to surface competing deals from ad campaigns, strategically abandon their shopping carts or use anonymous browser settings to their advantage. And people enjoy exposure to new or improved products and solutions to their problems.
Data Privacy: A Historical Perspective
The concept of data privacy, and apprehension about it, has been around far longer than commonly thought, predating computers and social platforms like Facebook and TikTok by centuries.
In 1775, Benjamin Franklin became the first Postmaster General, creating what we know today as the postal service. Early privacy measures included locking carriers’ saddlebags and laws against opening someone else’s mail. In 1789, the Fourth Amendment, prohibiting unreasonable searches and seizures and stipulating requirements for issuing warrants, was introduced. The invention of Morse code in the 1830s also brought an initial level of privacy, later enhanced by the use of ciphers to encrypt messages.
U.S. Data Privacy Laws Over Time
U.S. data privacy laws have come a long way since then.
There’s now a system of laws and multiple organizations that deal with specific personal data types.
You’ve probably heard of the Federal Trade Commission (FTC) — the nation’s primary enforcement agency for all things data privacy and security related.
There’s the Health Insurance Portability and Accountability Act (HIPAA), which covers health information.
The Family Educational Rights and Privacy Act (FERPA) protects the privacy of students’ education records, most notably school records.
The Children’s Online Privacy Protection Act (COPPA) protects the data of children under 13.
The Telephone Consumer Protection Act (TCPA) helps protect against telemarketing (we could use a little more help here, nudge, nudge).
State laws like the California Online Privacy Protection Act (CalOPPA) and the Delaware Online Privacy and Protection Act (DOPPA) also contribute to the landscape of data privacy regulations.
The landscape of organizations and laws contributing to safer privacy data handling and distribution is widespread, vast and evolving.
Modern Data Privacy Controversies Explained
Edward Snowden and the NSA (2013): Edward Snowden is a former National Security Agency (NSA) contractor. In 2013, Snowden leaked classified documents that highlighted potential NSA global surveillance programs. The leak became headline news and instigated conversations regarding data privacy for everyday people.
Snowden’s leaks revealed an indiscriminate, egregious collection of phone records, as well as the NSA’s access to user data held by Google, Facebook, Apple and Microsoft through a program called PRISM.
Data privacy became a polarizing national discussion regarding government surveillance, individual privacy rights and the balance between national security and personal privacy. Those debates continue today.
Facebook – Cambridge Analytica Scandal (2018): Cambridge Analytica, a political consulting firm, obtained the personal data of tens of millions of Facebook users without their consent, leading to the creation of psychological profiles to target voters with personalized political ads.
To gain access to the data, the firm used a quiz app called “thisisyourdigitallife.”
The quiz collected data from the people who downloaded it and answered its questions.
Moreover, the app scraped the data of those quiz takers’ Facebook friends.
At the time, Facebook’s policies did not prohibit this kind of data collection or scraping.
This scandal caused a massive public uproar regarding how personal data is shared and used by corporations like Facebook and others.
General Data Protection Regulation (GDPR) & California Consumer Privacy Act (CCPA): The GDPR, which took effect in 2018, sought to strengthen data protection for European consumers. GDPR was, and remains, controversial, as many publishers feel it’s too stringent and results in laborious website adaptations. If you process the personal data of people in the EU, GDPR applies to your business. For today’s online businesses, avoiding GDPR’s reach is challenging.
Similarly, California passed the CCPA, which resembles the GDPR and provides California residents with enhanced personal data rights. CCPA aims to provide consumers with more control over their data. For example, as a California resident, you can request that an online business delete your collected data.
Consumers are more concerned than ever regarding their personal data.
High-profile data breaches and headline-grabbing scandals have partly driven this concern.
The result is consumers changing privacy settings, refusing cookies and deleting apps. And that’s only the beginning.
But the tech industry is reacting to change.
Apple has taken a strong public stance on privacy by introducing new privacy features in its mobile and desktop operating systems. Apple’s changes have been controversial (do they really help the consumer?), and they’ve cost Facebook an estimated $10 billion.
Facebook and Google continue to use new technologies and methods for delivering personalized ads while preserving user privacy. But it’s proving a long, arduous road in a world where consumers don’t trust them.
Facebook is experimenting with technologies like Secure Multi-Party Computation (SMPC), Blind Signature and On-Device Learning.
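To give a flavor of one of these techniques, below is a toy sketch of the idea behind secure multi-party computation using simple additive secret sharing: each party splits a private number (say, a conversion count it observed) into random shares, the parties exchange shares, and only the combined total is ever reconstructed. This is a teaching sketch in TypeScript, not Facebook’s production approach, and the numbers are made up.

```typescript
// Toy additive secret sharing: three parties learn only the total of their
// private values; no party ever sees another party's raw input. Illustrative only.

// Split a private value into `n` random shares that sum back to the value.
function makeShares(value: number, n: number): number[] {
  const shares = Array.from({ length: n - 1 }, () => Math.floor(Math.random() * 1000) - 500);
  const last = value - shares.reduce((a, b) => a + b, 0);
  return [...shares, last];
}

// Each party's private input, e.g. conversion counts observed by different parties.
const privateInputs = [12, 7, 31];
const parties = privateInputs.length;

// Every party splits its input and distributes one share to each party.
const allShares = privateInputs.map((value) => makeShares(value, parties));

// Party i sums the i-th shares it received; a partial sum reveals nothing on its own.
const partialSums = Array.from({ length: parties }, (_, i) =>
  allShares.reduce((sum, shares) => sum + shares[i], 0)
);

// Only the combined total is revealed: 12 + 7 + 31 = 50.
const total = partialSums.reduce((a, b) => a + b, 0);
console.log(total);
```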
Google has proposed methods like Federated Learning of Cohorts (FLoC) and Google Topics as part of its Privacy Sandbox initiative.
These technologies group users by browsing habits and interests. Marketers can still target them, not personally but under the veil of a group.
The future of data privacy remains uncertain. Different stakeholders have competing visions, and anytime billions of dollars are involved, solutions take time and work.
What’s Next?
Google’s Privacy Sandbox: Introduced in August 2019, the Privacy Sandbox is Google’s initiative to improve privacy protections without sacrificing existing web capabilities. That doesn’t sound straightforward, and it isn’t, but the effort is as robust as you might expect.
The project includes five APIs that aim to replace third-party cookies:
- Trust Token API: Google’s alternative to repeated CAPTCHA-style verification. A site that already trusts a user can issue cryptographic tokens, which other sites can then check to confirm the user is genuine without identifying them.
- Aggregated Reporting API: This will allow behavioral metrics, such as engagement, reach, views and impressions, to be collected without tracking individual users.
- Conversion Measurement API: This API signals when user conversions occur without revealing personal information, similar to Apple’s SKAdNetwork.
- Retargeting API: Known as the TURTLEDOVE proposal, it allows ad networks to add users to segment groups in the browser based on certain actions.
- Topics API: This replaces the controversial Federated Learning of Cohorts (FLoC) and determines a user’s top interests based on browsing history, sharing that information with participating websites and advertisers without involving external servers.
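For a sense of what this looks like to a developer, here is a minimal sketch of reading topics from a page in a browser that supports the API (Chrome exposes it as document.browsingTopics()). The API is not yet part of standard TypeScript DOM typings, and the exact fields on the returned objects should be treated as assumptions.

```typescript
// Minimal sketch: reading the caller's observed topics in a supporting browser.
// document.browsingTopics() is Chrome's Topics API entry point; field names
// on the returned objects are assumptions for illustration.

interface BrowsingTopic {
  topic: number;      // numeric ID in the Topics taxonomy (assumed field name)
  version?: string;   // taxonomy/model version info (assumed field name)
}

async function readTopics(): Promise<BrowsingTopic[]> {
  const doc = document as Document & {
    browsingTopics?: () => Promise<BrowsingTopic[]>;
  };
  if (!doc.browsingTopics) {
    // The browser does not support the Topics API, or it is disabled.
    return [];
  }
  return doc.browsingTopics();
}

readTopics().then((topics) => {
  // An ad-tech script would typically forward these coarse IDs with its ad request.
  console.log("Observed topics:", topics.map((t) => t.topic));
});
```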
FLoC was a machine learning technique that allowed browsers to contribute to a shared central model without exchanging raw browsing data, categorizing users into “cohorts” based on browsing activity.
However, FLoC faced criticism and concerns over new potential privacy risks, leading to its replacement with the Topics API.
The Topics API determines interest categories based on a user’s browsing history in a given week. These topics are stored for three weeks and shared with participating websites and advertisers when the user visits.
The Topics API differs from FLoC in that it does not group users into cohorts, excludes potentially sensitive categories like gender and race, and provides increased user control.
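As a toy model of the behavior just described (weekly topics, a three-week retention window, and coarse sharing on a visit), the sketch below shows one way to think about it. It is not Chrome’s implementation, and details such as how a visiting site’s view is restricted are simplified.

```typescript
// Toy model of the described behavior (not Chrome's implementation):
// each week the browser keeps the user's top topics, retains three weeks
// of them, and a visited site receives one coarse topic per retained week.

type WeeklyTopics = string[]; // e.g. ["travel", "fitness"]

class TopicsHistory {
  private weeks: WeeklyTopics[] = []; // most recent week last

  // Called once per week with the topics inferred from that week's browsing.
  endWeek(topTopics: WeeklyTopics): void {
    this.weeks.push(topTopics);
    if (this.weeks.length > 3) this.weeks.shift(); // keep three weeks only
  }

  // What a participating site might observe on a visit: one topic per week.
  topicsForSite(): string[] {
    return this.weeks
      .filter((week) => week.length > 0)
      .map((week) => week[Math.floor(Math.random() * week.length)]);
  }
}

// Usage: after three weeks of browsing, a site sees at most three coarse topics.
const history = new TopicsHistory();
history.endWeek(["travel", "fitness"]);
history.endWeek(["cooking"]);
history.endWeek(["travel", "news"]);
console.log(history.topicsForSite()); // e.g. ["fitness", "cooking", "news"]
```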
Despite these changes, Google’s Privacy Sandbox is facing legal scrutiny.
The U.K.’s Competition and Markets Authority is investigating whether the Privacy Sandbox gives Google an unfair competitive advantage. This could lead to yet another antitrust claim against the company.
The evolution of technologies such as the Internet of Things (IoT) and facial recognition is likely to put privacy under further scrutiny. An Accenture study suggests that consumers are uncomfortable with data collection via microphones and voice assistants and have little trust that tech companies and advertisers will use facial recognition technology responsibly.
The focus on data ethics is rising, and brands and platforms that lose sight of these considerations risk losing consumer trust. Situations where ads have appeared next to questionable content have alarmed marketers, who tend to tread carefully on the issue.
Personal Information Management Systems (PIMS) help people gain more control over their personal data. PIMS, also known as personal data stores or vaults, allow individuals to control the collection, storage and sharing of their personal data. Done right, PIMS are a huge win for people looking to protect their data more easily.
PIMS facilitate compliance with privacy laws like GDPR and CCPA and can lead to greater efficiency, privacy control, reduced hacking incentives and less regulatory risk. PIMS fall into two categories: local storage models, where data is kept on users’ devices, and cloud-based models, where data is stored in one location or among various service providers.
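To make the local-storage model concrete, here is a small hypothetical sketch of a personal data vault: the data lives in a structure the user controls, and a requester receives only the fields covered by an explicit, unexpired consent grant. The class, field names and requesters are illustrative, not any particular PIMS product.

```typescript
// Hypothetical sketch of a local-model PIMS: data stays with the user,
// and nothing is released without a matching, unexpired consent grant.

interface ConsentGrant {
  requester: string; // e.g. "news-site.example"
  fields: string[];  // which attributes may be shared
  expires: Date;
}

class PersonalDataVault {
  private data = new Map<string, string>();
  private grants: ConsentGrant[] = [];

  store(field: string, value: string): void {
    this.data.set(field, value);
  }

  grantConsent(grant: ConsentGrant): void {
    this.grants.push(grant);
  }

  // Release only the fields the user has explicitly granted to this requester.
  request(requester: string, fields: string[]): Record<string, string> {
    const now = new Date();
    const released: Record<string, string> = {};
    for (const field of fields) {
      const allowed = this.grants.some(
        (g) => g.requester === requester && g.expires > now && g.fields.includes(field)
      );
      const value = this.data.get(field);
      if (allowed && value !== undefined) released[field] = value;
    }
    return released;
  }
}

// Usage: the user shares an email with one site, and only that field.
const vault = new PersonalDataVault();
vault.store("email", "user@example.com");
vault.store("birthdate", "1990-01-01");
vault.grantConsent({
  requester: "news-site.example",
  fields: ["email"],
  expires: new Date("2030-01-01"),
});
console.log(vault.request("news-site.example", ["email", "birthdate"])); // only { email: ... }
```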
Overall, the evolving data privacy landscape remains an ongoing public and political debate. The development of new technologies offers a hopeful path forward in a complex standoff that pits the financial interests of marketers and companies against people’s ethical expectations.
Strategies Marketers Can Adopt:
- Show the value of personalization: Organizations should balance protecting consumers’ data and using that data for personalized marketing. Emphasis should be placed on empowering consumers to control how their data is used, ensuring the relevance of communications, and maintaining transparency about data usage.
- Be clear and transparent about data privacy: The complexity of data privacy policies often discourages consumers from reading them. Organizations should strive to simplify these policies and clearly state how personal data will be used. Transparency can encourage consumers to share more personal information, provided they know it is stored safely.
- Emphasize data security & protection from breaches: Data breaches can significantly erode trust in a brand. Organizations should adopt robust data security practices, such as identifying and classifying sensitive data, limiting access to information, and regularly testing security systems.
- Embed data ethics into the organization: Beyond just complying with data regulations, companies should embed ethical considerations into their data practices. This includes respecting the individuals behind the data, eliminating bias and promoting accountability and transparency.
- Prepare for a cookie-less future: In light of increasing regulations and the phasing out of third-party cookies, organizations should focus on collecting and utilizing first-party data. They should also consider using data clean rooms and transitioning from data management platforms (DMPs) to customer data platforms (CDPs).
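As one small, hypothetical illustration of the first-party approach, the sketch below records an analytics event only if the visitor has granted consent, and sends it to the organization’s own endpoint rather than a third party. The consent flag, event shape and endpoint path are assumptions, not a specific vendor’s API.

```typescript
// Hypothetical sketch: consent-gated first-party event collection.
// The consent flag, event shape, and endpoint below are illustrative.

interface FirstPartyEvent {
  name: string; // e.g. "newsletter_signup"
  properties: Record<string, string>;
  timestamp: string;
}

// In a real site this flag would come from a consent-management banner.
function hasAnalyticsConsent(): boolean {
  return localStorage.getItem("analytics-consent") === "granted";
}

async function trackEvent(name: string, properties: Record<string, string>): Promise<void> {
  if (!hasAnalyticsConsent()) {
    // No consent: drop the event rather than collecting silently.
    return;
  }
  const event: FirstPartyEvent = { name, properties, timestamp: new Date().toISOString() };
  // Send to the organization's own (first-party) collection endpoint.
  await fetch("/api/analytics/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
}

// Usage: only fires if the visitor opted in.
trackEvent("newsletter_signup", { source: "footer-form" });
```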
In conclusion, while building consumer trust is challenging, particularly in the current digital landscape, it’s achievable through transparent, ethical and security-conscious practices. By doing so, organizations can maintain consumer trust and create a more meaningful, personalized customer experience.