
Hackable: The New Privacy Ethics

(a six-post series)

Privacy & Disclosure of Personal Data

As people spend more time online and in apps that collect massive amounts of information, government entities grapple with how to define and protect privacy through regulation. Deeming privacy waived by a click that allows access, e.g., by acknowledging cookies, seems unprincipled. But there is a give and take: thanks to an improving body of law, there are now ways to limit cookies to the “strictly necessary,” to reject the collection of additional trackable data, and to refuse commercial uses. Completely opting out may bar access to a website. The EU “cookie law” (the ePrivacy Directive) governs cookies and other trackers (anything that stores, accesses, or uses data from an individual’s device) using an informed-consent approach.
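To make the “strictly necessary” distinction concrete, here is a minimal sketch of how a consent banner might gate non-essential trackers. It is purely illustrative: the category names, cookie names, and tracker URL are invented for this example and are not drawn from any particular consent-management product.

```typescript
// Illustrative only: the category names, cookie names, and tracker URL below
// are hypothetical, not taken from any real consent-management platform.

type ConsentCategories = {
  strictlyNecessary: true;   // required for the site to function; cannot be refused
  analytics: boolean;        // optional tracking, off until the user opts in
  marketing: boolean;        // optional commercial use, off until the user opts in
};

// Default state before the user interacts with the banner: essentials only.
const defaultConsent: ConsentCategories = {
  strictlyNecessary: true,
  analytics: false,
  marketing: false,
};

// Set non-essential cookies and load trackers only after an explicit opt-in.
function applyConsent(consent: ConsentCategories): void {
  if (consent.analytics) {
    document.cookie = "analytics_id=placeholder; SameSite=Lax; Secure";
  }
  if (consent.marketing) {
    const script = document.createElement("script");
    script.src = "https://example.com/marketing-tracker.js"; // hypothetical URL
    document.head.appendChild(script);
  }
  // Strictly necessary cookies (e.g., a session ID) are set elsewhere,
  // since the site cannot function without them.
}
```

Under the informed-consent approach, refusing the optional categories should leave the site usable; in practice, as noted above, some sites simply bar access instead.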

The US does not have federal laws requiring consent for cookies. Consent fatigue sets in when people click through pop-ups reflexively, just to reach the information or webpage they want, with little thought about the privacy of their data. People continue to exchange privacy for access without understanding how much personal data they are divulging, who can use that data, and for what purposes. Cookies are one example of a permissions approach to data use, but so many types and pieces of data flow across devices that a permissions-based approach is naïve and unlikely to achieve ethics-based goals like fairness and privacy.

Photo by Elisa Ventur on Unsplash

Privacy and Unexpected Uses of Data

The ethical questions arise from why people click to continue, how much unexpected extra information is collected, and how that information can be paired with public records or other purchased data to create a personal profile. The profile predicts behavior, but it also lets wrongdoers blackmail people, pigeonhole them, or damage their reputations. With so much data collected and stored, hacking is more profitable, tempting, and pervasive, and ransomware attacks continue to bring victims’ operations to a halt. By selling data as well as using it as a marketing tool, companies also profit from data without compensating the people it describes, something I addressed in Barcode Me.

This post will address the distinction between constitutional privacy and confidentiality as an aspect of privacy. The next few posts will cover pediatric data, FBI concerns (crimes and cybersecurity), discrimination and bias, how corporations use data for a competitive edge, and the use or abuse of data in international politics.

Photo by Iñaki del Olmo on Unsplash

Two Laws Attempt to Protect Confidentiality: Are They Solving the Correct Problem?

The European Union General Data Protection Regulation (GDPR) took effect in 2018, the same year the California Consumer Privacy Act (CCPA) was enacted; the CCPA became effective in January 2020. (Other states are also tackling privacy law.) The CCPA was enacted to promote consumers’ ability to choose to protect their privacy and to resolve some issues surrounding the inadvertent waiver of privacy. Its purpose was not necessarily to keep data private but to give consumers more control over both privacy and use. The act governs companies that make money selling data and that collect significant amounts of data. (It does not apply to a host of small businesses.) The GDPR and the CCPA differ in that the CCPA allows consumers to opt out of certain data uses (although consent is not necessary for companies to collect and process the data), while the GDPR specifies the legal bases for processing data, preventing companies from harvesting data for other purposes without consent. The GDPR gives consumers a right to prior consent, while the CCPA merely gives them rights over specified uses. (Also note that the CCPA does not apply to data already available in the public sphere or to health data covered by other laws.)

In all jurisdictions, people can click to waive rights. Some argue that a primary flaw in both the GDPR and the CCPA is that the laws still allow data collection and processing. Once data is stored, it is vulnerable to hackers, and as cybersecurity improves, hackers adjust and grow more sophisticated.

The consumer data protection laws focus on data collection and use, and they intend to protect confidentiality, one aspect of privacy. Confidentiality matters for many reasons: protecting a bank account or credit card number that could be stolen, keeping embarrassing purchases, searches, or website visits private, or maintaining trust in the doctor-patient relationship. Meanwhile, TikTok, Facebook, Instagram, and Snapchat users waive their privacy regularly. People also use GPS features and can click “share” on social media after making a purchase on any number of shopping websites. It becomes difficult to argue that a person willing to divulge so much information has a legitimate interest in confidentiality. Yet people continue to be surprised when they realize how much data is collected and how it is used. A framework to address confidentiality must recognize the willingness, preference, and ability of people to waive confidentiality.

Two-Pronged Privacy

Privacy incorporates both a “right to be left alone” and confidentiality. Absent confidentiality, the right to be left alone may shrink, or persist only amid uncertainty, losing its value. Privacy is therefore broader than mere confidentiality.

While I do not support devaluing confidentiality, I assert that ethicists should recognize a two-pronged privacy. Many people want to make data privacy a human right, yet the degree to which people value confidentiality varies. Even those concerned with privacy willingly store data on computers, websites, and apps where it may be tracked, collected, used, abused, and hacked. Privacy as a human or constitutional right will stem from protection against government intrusion and protection of a “sphere of privacy,” rather than from confidentiality for its own intrinsic value or for practical reasons. The US Constitution and the United Nations value privacy because of the imperative to protect people from surveillance, not for the mere sake of confidentiality itself. The distinction between privacy from government intrusion and confidentiality is crucial.

I would go further and assert that the importance of confidentiality has more to do with surveillance and protection from government, corporate, or criminal wrongdoing than with mere secret-keeping.

Photo by Robin Jonathan Deutsch on Unsplash

Constitutional Privacy

Privacy protects actions from government infringement (constitutional privacy), and it also serves to keep some actions or data confidential (colloquial privacy). This section addresses whether privacy will remain meaningful if confidentiality is no longer part of it. That is, will information be adequately protected from intrusion regardless of the ability of hackers, governments, and marketing firms to access, analyze, and act on the data? Rationales for limits on government surveillance and the right to be “left alone” might weaken if the data in question is generally released by a click anyway. The CCPA and the GDPR both presuppose that data is private and, on their face, do more for confidentiality than for true protection from surveillance or intrusion.

The constitutional right to privacy is not explicit. It is found in the First, Third, Fourth, Fifth, Ninth, and Fourteenth Amendments and broadly applies to freedom of speech, freedom from unreasonable search and seizure, and decisions or actions generally expected to be private or personal in a liberal society. In general, there must be a “legitimate expectation of privacy,” as would be necessary in applying the Fourth Amendment prohibition against unreasonable searches and seizures to internet-driven data. For data stored on computers, typed into websites, and generally disclosed in some way, the expectation of confidentiality may be unrealistic, since it depends on methods generally unknown to the user (whether the website collecting the data uses edge or cloud computing, blockchain technology to confirm transactions, cybersecurity software, and so on).

In Sorrell v. IMS Health, the US Supreme Court struck down a Vermont statute prohibiting the sale or transfer of prescriber-identifying data without consent. The data in question was collected by pharmacies and sold to IMS, a data firm, then resold to marketing firms. The Court’s rationale was that because the government could access the data for certain uses pursuant to reporting requirements, it was no longer confidential; therefore it could be used for corporate purposes under the guise of free speech. There is something especially alarming about Sorrell: the Court held that the data was less worthy of protection because of required confidential reporting to a state government agency, not because it was voluntarily disclosed. Under that logic, all data subject to any reporting requirement or required for a purpose (even data with legal privacy protections, like data given to a health insurance company, online stores, or schools) becomes fair game for other uses. If it were never truly confidential, both government and companies would consider it unprotected.

Photo by Florian Olivo on Unsplash

Hacking has become so commonplace that we must wonder whether courts will eventually find that any data stored where it is hackable is no longer private. While the law rightly tries to punish hackers and corporations that leave data vulnerable, some laws make no attempt to protect data now deemed public that once seemed private. Social media and apps have changed the public perception of privacy, raising concerns, but the willingness to publicize seemingly private information has also grown. Socializing in person was traditionally private; socializing on social media is not. A lack of protection would apply to the confidentiality prong of privacy and would not itself allow government intrusion or use of the data. But the collection of data can lead governments to increase surveillance, a problem in a free society. For example, data from public cameras in New York City is used by law enforcement.

Examples of Using Data for Agenda Marketing

To apply that logic further: suppose voluntarily entered data, web-browsing and search data, or data accessed by a hacker indicated that someone used or googled a medical abortion pill or emergency contraception. A corporation with permission to track personal data may market certain goods and services to that person, but a group opposed to the legality of abortion might also market its agenda to that person. The person may have expected confidentiality and see either use as a breach. Privacy as freedom from government intrusion becomes murkier still: if authoritarianism were to take hold and abortion rights were repealed, retroactive punishment of individual “suspects” could ensue. The confidentiality aspect of consumer privacy is the precursor to the protection from government interference currently covered under the constitutional privacy umbrella.

There are many examples of seemingly harmless cookies producing large pools of deidentified data that nonetheless leave the population vulnerable to corporate marketing and government surveillance.

Privacy and Ethical Considerations

While a framework that emphasizes control over data may serve consumers well, ethical privacy policies must provide both protection of confidentiality and protection from government intrusion into private matters. Placing a two-pronged privacy within a responsible technology framework that includes all stakeholders and looks to the societal risks of an end to traditional privacy may inform new approaches that go beyond ethical frameworks of duties, rights, fairness, virtues, and utility. Transparency and consent alone will not cover the societal stakes.

It may be futile to stop the collection of data. So much is already in the hands of both government and corporations. Controlling its use could become the only practical approach. Data is not neutral. Nor is it used neutrally. Those using it can shape societal preferences affecting habits, health, employment, and political agendas. Such a responsibility calls for broad ethical considerations, not merely weighing the benefits of opting in or out of cookies.

Feature Photo by Kristina Flour on Unsplash
