As consumer demand for new artificial intelligence (“AI”) tools continues to grow, businesses must be prepared to build tools with “privacy by design” principles in mind, and to remain educated about privacy best practices and risk mitigation strategies when working with AI. The following areas provide the greatest opportunities to manage data privacy risks and…
Q&A about the new Oregon consumer personal data protection law.
Continue Reading A New Consumer Personal Data Protection Law in Oregon
Earlier this month, the Oregon state legislature introduced Senate Bill (SB) 619, “relating to protections for the personal data of consumers.” The bill has since been referred to the Senate Committee on Judiciary and the Joint Committee on Ways and Means. Of course, Oregon would not be the first state to enact general, or omnibus, privacy legislation; to date, five states (California, Virginia, Colorado, Connecticut, and Utah) have done so, with the first two operative as of today. Likewise, Oregon is not the only state to introduce new omnibus privacy legislation this month. The introduction of this bill (and other general state privacy legislation) remains significant because the prospect for omnibus federal privacy legislation (in the near term) fizzled out when the 117th Congress adjourned.
No bill exists in a vacuum. Structurally, SB 619 generally follows the Virginia Consumer Data Protection Act (VCDPA), as do the laws enacted by Colorado, Connecticut, and Utah.
SB 619 is only 17 pages long, not as slim as the VCDPA (8 pages), but not as bulky as the California Consumer Privacy Act (59 pages). Unlike the CCPA, SB 619 does not reference any implementing regulations, though such regulations could be added later.
As with any omnibus state privacy bill, the proposed legislation raises some key questions:…
To say that class action litigation regarding the use or collection of “biometric information” – such as fingerprints, face records, or voice records – is expensive would be a gross understatement. The damages sought, and sometimes recovered, in litigation under the Illinois Biometric Information Privacy Act and similar laws that impose statutory penalties can be…
If you manage a company that collects and otherwise processes personal data (which is just about every company, these days), you may need to protect your own pocketbook. As governments across the globe continue to enact and enforce data privacy, data protection, and cybersecurity laws, data becomes more readily available, and the volume of incidents…
It’s a great time to be a privacy attorney. On October 17, 2022, the California Privacy Protection Agency (CPPA) released the next draft of the regulations under the California Privacy Rights Act of 2020 (CPRA) as well as a document explaining the proposed modifications. Two days of public hearings were recently held on October 21-22…
In Illinois, the Biometric Information Privacy Act (“BIPA”) regulates the collection and use of “biometric information” such as fingerprints, facial images, and voice records. It imposes significant penalties and has generated a cottage industry of class action litigation: hundreds of cases have been filed and millions of dollars in liability have been assessed. It is also the best known and most heavily litigated of a slew of recently enacted, or soon-to-be-passed, state and local laws aimed at regulating biometric information.
Many Illinois defendants had hoped that their liability under BIPA could be limited because, they argued, a one-year statute of limitations should apply to BIPA claims. But, in a recently issued decision, Tims v. Black Horse Carriers, Inc., 2021 IL App (1st) 200563, the Illinois Court of Appeals rejected this position for a majority of BIPA claims. It held that a five-year statute of limitations applies to the most frequently cited sections of the statute.
Continue Reading Illinois Court of Appeals: Statute of Limitations for Most Biometric Privacy Claims Remains at Five Years
Is your business using or thinking of using facial recognition technology for activities in Portland, Oregon? Think again.
That’s the message to businesses operating in Portland in a new ordinance that broadly bans the use of facial recognition technology in the city, subject to certain exceptions. The ordinance, which took effect January 1, 2021, prohibits private businesses from using automated or semi-automated processes to identify an individual by comparing an image of a person captured through a camera with images of multiple individuals in a database. Because of the expansive language in the final version of the ordinance, routine business practices used to support or improve operations are no longer permitted. For example, retailers may have previously used software that compares surveillance video images of individuals as they enter a store with a cloud-based photo database to identify suspected shoplifters. The ordinance now prohibits use of this software.
The law also has teeth. It creates a private right of action, provides statutory damages of $1,000 per day for each violation, and allows for recovery of attorneys’ fees. Like other biometric privacy laws, this ordinance has the potential to trigger a wave of costly class action litigation and upend business operations. The ordinance creates significant risk for any use of facial recognition technology, and organizations should proceed with this awareness. It also raises numerous unanswered questions, as noted below. …
Continue Reading Portland’s New Facial Recognition Ban Increases Litigation Risk, Creates Uncertainty
Digital transformation, the process of leveraging technology, people, and processes to innovate, requires an “all-in, ongoing commitment to improvement.” But the main drivers of digital transformation – data and profits – don’t always mesh seamlessly.
As shown by recent class actions filed against Blackbaud and Morgan Stanley, and a settlement with the New York Attorney General by Dunkin’ Brands, digital transformation has numerous cybersecurity issues that present legal obligations and potential liability.
In May, Blackbaud, Inc., a company that provides cloud software services to thousands of non-profits including hospitals, suffered a ransomware attack. In July, it began informing its users of the attack, many of whom used Blackbaud to process personal and sensitive information.
On August 12, the first of many lawsuits was filed against Blackbaud. Among the allegations in the lawsuit, Blackbaud is accused of failing to properly monitor its computer network and systems, failing to implement policies to secure communications, and failing to train employees.
The five years prior to the attack are telling. In that timeframe, Blackbaud underwent a digital transformation that involved acquiring numerous other software platforms, including a predictive modeling platform and a software provider focused solely on corporate giving.
Since the ransomware attack, Blackbaud has published cybersecurity improvements that support adherence to industry standards for incident management, employee training, systems and network testing, risk assessments, application security, encryption, and end-user authentication.…
Continue Reading Digital Transformation – Cybersecurity Lessons from Recent Lawsuits
March 2020 will long be remembered as the month and year of en masse shutdowns. But the pandemic has done little if anything to slow new cybersecurity and data privacy laws. As highlighted below, regulations for one law have been submitted for approval (CA), another law has gone into effect (NY), and yet another has been proposed (CA).
California Consumer Privacy Act (“CCPA”) Regulations Submitted by State Attorney General
After nine months, a lot of public input, and three proposed drafts, the regulations for enforcement of the CCPA have been submitted for approval. The final text of the regulations demonstrates how granular enforcement could be. Here are a few examples:
- A business must provide at least two methods for consumers to send requests for deletion of their information.
- A service provider can retain, use, or disclose information in certain circumstances, such as to detect security incidents even after a deletion request.
- A business must confirm within 10 days that it has received a request to know what it has collected from consumers.
- A business must have a documented policy for verifying the identity of a person making a request related to their personal information.
As this recent article illustrates, many ransomware operators are now collecting information from victims before encrypting their data, and then threatening to release what they’ve collected – or actually releasing some of it – to increase the chance they’ll get paid. There have been many cases already where at least a portion of data has…