The abrupt move to remote work and online living in 2020 put privacy concerns on hold. Businesses scrambled to avoid disruption through online connectivity, and people relied on technology to find normalcy in almost all facets of life, including social interaction, education, health care, entertainment, and shopping. In short, the need for business continuity in this new environment outweighed the need for security and, as a result, put privacy at risk.
The story is now well known. This global shift forced businesses to rely on technology that was not secure and individuals to trust apps that collected and used their data to carry out the “new” aspects of daily life. Sadly, this behavior has subjected individuals to alarming privacy- and data-exposure risks, while cybercriminals have found easier ways to monetize their crimes.
There is no doubt privacy will once again be a major issue in 2021. Let’s take a closer look at what happened in 2020 and what to watch for in 2021.
CCPA Celebrates a Birthday
January 2021 marks the one-year anniversary of the California Consumer Privacy Act (CCPA), a data-privacy regulation that applies to any for-profit business that does business in California, collects consumer data, and meets one of the following criteria:
1. Has an annual gross revenue in excess of $25 million.
2. Holds personal information of 50,000 or more California consumers, households, or devices.
3. Derives 50 percent or more of its annual revenue from selling consumers’ personal information.
What happened in 2020: In November 2020, California voters passed Proposition 24—now known as the California Privacy Rights Act of 2020 (CPRA)—which is a ballot measure that amended the CCPA. Businesses must be compliant with these changes by 2023. While the CCPA has only been in effect for one year, Proposition 24 evolves the law in several ways, some limiting and others expanding the breadth of the CCPA.
What to watch for in 2021: There is good news for companies. As a limiting factor, the proposition removes devices from the count of records held. Further, it raises the threshold so that only businesses that buy, sell, or share the data of 100,000 or more California individuals or households are subject to the rules (unless the company meets criterion one or three). Some companies that fall under the current threshold may no longer meet the amended one.
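The applicability test described above can be sketched as a simple check. This is a rough illustration for explanation only, not legal advice; the function name and parameters are hypothetical, and real applicability determinations involve many statutory nuances.

```python
# Illustrative sketch of the CPRA-amended applicability criteria.
# NOT legal advice; names and simplifications are hypothetical.

def cpra_likely_applies(annual_revenue_usd: float,
                        ca_consumers_or_households: int,
                        share_of_revenue_from_selling_data: float) -> bool:
    """Return True if any one of the three criteria is met."""
    return (
        annual_revenue_usd > 25_000_000                # criterion 1: revenue over $25M
        or ca_consumers_or_households >= 100_000       # criterion 2: raised from 50,000; devices no longer counted
        or share_of_revenue_from_selling_data >= 0.5   # criterion 3: half or more of revenue from selling data
    )

# A small firm under the raised threshold:
print(cpra_likely_applies(5_000_000, 60_000, 0.1))   # → False
```

Note that under the original CCPA, the same 60,000-record firm would have exceeded the 50,000-record threshold, which is why the amendment is welcome news for smaller businesses.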
It’s not all good news, though. Proposition 24 also expands the breadth of the CCPA. For example, the CPRA enables individual consumers to exert control over how every company uses their data and restricts commercial use of this data to terms explicitly agreed to by the consumer. This is unfortunate for companies, as many of them are already cash-strapped after suffering through 2020 and continue to feel the pain in 2021. Spending additional funds to permit consumers to have such control may put these companies out of business.
Organizations will, at a minimum, need to update policies and procedures to enable correct handling and deletion of personally identifiable information, ensure that use of any appropriately collected information is necessary and proportional, and ensure consumers can choose whether the information may be used for other purposes.
Many small to midsize organizations that do not already have a robust privacy regulatory compliance regimen in place may need to make substantial changes to be compliant come 2023. While the 53-page law was generally created to target large entities that are consuming large amounts of data, smaller businesses will arguably be the first to be fined, given that they are less likely to have policies and procedures in place or cannot afford to provide the new opt-out feature.
Further, once the CPRA is in effect, companies will face far larger penalties and will not be given a chance to “cure” an issue before facing suit. Violating a minor’s privacy rights, for example, could mean a financial penalty of up to three times the current fine, imposed immediately.
There is bad news for consumers, too. While the CCPA provided equal protection for consumers’ right to choose whether their data was used by implementing a non-discrimination clause, Proposition 24 seems to remove that protection. Under Proposition 24, companies are permitted to charge more to consumers who opt out of using their data, essentially enabling companies to put a monetary value on consumers’ data. This creates two more significant problems for the consumer: It forces consumers to opt out of every company’s data-collection practices, and, if consumers do opt out, they may now be charged more for doing so. This is both time consuming and costly to the consumer and takes advantage of those who cannot afford to opt out.
And there is more bad news for consumers and businesses alike: California intends to create a new privacy-administration organization, the California Privacy Protection Agency, to take over enforcement functions from the Department of Justice. Funding for this agency would come from penalties levied against companies, raising concerns that the agency would be incentivized to penalize.
While this agency will only have purview over companies that do business in California, there is always the possibility that this could set a precedent for other states or, eventually, a national standard. Until that time, companies will have to monitor and comply with the laws that each state sets in place.
General Data Protection Regulation
Interestingly enough, while it was once the “new kid on the block,” the General Data Protection Regulation (GDPR) seems to be old news these days. GDPR created one privacy law across the European Union (EU) and quickly became a global privacy standard. If you offer goods or services to, or collect data about, an EU citizen, then this law applies to you—irrespective of where your organization is based or whether you collect payment. GDPR dictates that organizations must receive permission before collecting any data and disclose why the data will be collected and how it will be used. Finally, should a breach occur, companies must report it within 72 hours.
What happened in 2020: Marriott received the second-highest UK Information Commissioner’s Office (ICO) fine of $128.2 million in 2020. The fine dates back to a data breach that happened between 2014 and 2018. During negotiations, the fine was reduced to $23.8 million. Was the change in the penalty because the ICO felt that Marriott acted promptly in notifying its customers, or is it reflective of the harm Marriott suffered from COVID-19-related loss of income? What is clear is that there is still no credible way to value a GDPR claim.
What to watch for in 2021: Expect in the near future that entities showing COVID-19 harm may incur fines but ultimately negotiate and settle for a lesser amount. Moving forward, however, anticipate that the ICO will cherry-pick cases to establish benchmarks that companies can reference to gauge where they might sit for a future fine. Here is where we will see what the ICO really intends as we pull out of the COVID-19 haze. It is unlikely that the ICO will continue to take COVID-19-related impacts into account over the long term for those failing to implement sufficient organizational and technical measures to prevent a cyber breach or mitigate its effects.
Illinois Biometric Information Privacy Act
The Illinois Biometric Information Privacy Act (BIPA) requires that a company acquire written consent from a person to collect and use her biometric data, and that the company must disclose what is being collected. Biometric data is defined as any physical characteristic that can uniquely identify a person, such as fingerprints, face, or iris. Of the three states that have stand-alone biometric privacy laws, Illinois’ is the most comprehensive. The law gives residents the right to sue companies for up to $5,000 per violation, which could add up to billions of dollars in payouts for tech giants that lose such class-action suits.
What happened in 2020: In January 2020, Facebook agreed to pay $550 million to settle a class-action lawsuit over its use of facial-recognition technology in Illinois, giving privacy groups a major victory that again raised questions about the social network’s data-mining practices. The case illustrates the protections that strong state laws may offer consumers.
What to watch in 2021: This suit acts as a landmark case in privacy legislation that has been evolving to match new technological innovations, and demonstrates the consumer-privacy protections that state laws can provide. It also coincides with raised concerns over new surveillance technology, such as facial recognition and the use of biometric data to identify unknown suspects. Many privacy-rights groups worry that these new technologies are an overreach of power and could end anonymity in public. How companies use biometric data is now under intense scrutiny.
The suit represents a loss for the tech companies that use biometric technology by establishing a precedent for future similar cases. It also helps settle the question of whether a person is actually “aggrieved” and allowed to seek damages under BIPA if they never incurred an actual injury or adverse effects from a BIPA violation.
In 2021, more biometric privacy laws will likely pop up or evolve on state and federal levels, which presents a challenge for the biometrics market that is projected to continue to grow in the coming years.
Contact-Tracing Apps
What happened in 2020: Tech companies quickly responded to COVID-19 with new contact-tracing apps fraught with potential privacy landmines. In an attempt to reduce the spread of the virus, many companies and governments in 2020 rushed to create apps to help track potential exposure to someone who tested positive for COVID-19. Some of these apps were built on Apple and Google’s API, which addresses privacy concerns by collecting only Bluetooth proximity data and functioning solely as an exposure-notification system. Others were built on alternative systems with weaker security and fewer protections in place. Some of those apps request access to people’s location, camera, microphone, photos, contacts, phone carrier, IP address, device name, and even Apple Music data, raising concerns about how well that data is protected and whether all of it is truly necessary for the apps to be effective. On top of that, the alternative apps used proximity-tracking methods considered less secure and less private than Apple and Google’s Bluetooth-based API.
What to watch for in 2021: The security of contact-tracing apps should and will be scrutinized, as millions of individuals would be at risk if a data breach occurred. Privacy laws specific to contact tracing are expected. Incidents in both the UK and Israel, in which contact-tracing data was shared with law enforcement, have shed light on the privacy issues these apps raise. It follows that legislation or restrictions may be passed in 2021 to protect users and their data.
Focus on Security
As the shock of the pandemic wears off and people begin to adjust and recover, hopefully there will be a renewed focus on privacy and security. By taking very simple steps, businesses large and small can help guard against cyber threats. Businesses should be encouraged to look to their cyber-insurance providers for guidance on security measures and support to help navigate the ever-changing privacy legal landscape.