A New Consumer Data Protection Bill in Oregon: A Summary of SB 619

Earlier this month, the Oregon state legislature introduced Senate Bill (SB) 619, “relating to protections for the personal data of consumers.”  The bill has since been referred to the Senate Committee on Judiciary and the Joint Committee on Ways and Means.  Of course, Oregon would not be the first state to enact general, or omnibus, privacy legislation; to date, five states (California, Virginia, Colorado, Connecticut, and Utah) have done so, with the California and Virginia laws already operative.  Likewise, Oregon is not the only state to introduce new omnibus privacy legislation this month.  The introduction of this bill (and other general state privacy legislation) remains significant because the prospect of omnibus federal privacy legislation in the near term fizzled out when the 117th Congress adjourned.

No bill exists in a vacuum.  Structurally, SB 619 generally follows the Virginia Consumer Data Protection Act (VCDPA), as do the laws enacted by Colorado, Connecticut, and Utah. 

SB 619 is only 17 pages long, not as slim as the VCDPA (8 pages), but not as bulky as the California Consumer Privacy Act (59 pages).  Unlike the CCPA, SB 619 does not reference any implementing regulations; however, implementing regulations could be added.

As with any omnibus state privacy bill, the proposed legislation raises some key questions:

Whom does the bill protect?  Under SB 619, that would be Oregon consumers, i.e., residents acting in any capacity other than engaging in a commercial activity or performing duties as an employer or employee.  The VCDPA has the same limitation.  A similar limitation in the CCPA expired on January 1, 2023.

Who will be subject to SB 619, if enacted?  SB 619 applies to any person that conducts business in Oregon, or that provides products or services to Oregon residents, and that, during a calendar year, controls or processes the personal data of either (a) at least 100,000 Oregon consumers, devices that do or can identify at least 100,000 Oregon consumers, or a combination of both; or (b) at least 25,000 Oregon consumers, with at least 25% of its gross revenue attributable to the sale of personal data.
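For illustration only, the either/or structure of this applicability test can be sketched in a few lines of Python.  The function name, inputs, and the treatment of the consumer/device “combination” are our own simplifications, not language from the bill.

```python
def sb619_may_apply(or_consumers: int, or_devices: int,
                    pct_revenue_from_data_sales: float) -> bool:
    """Rough, illustrative sketch of SB 619's thresholds; assumes the person
    already conducts business in Oregon or serves Oregon residents."""
    # Prong (a): personal data of 100,000+ consumers, devices that do or can
    # identify 100,000+ consumers, or a combination of both (read here as a
    # combined count reaching 100,000).
    prong_a = (or_consumers + or_devices) >= 100_000
    # Prong (b): 25,000+ consumers, with 25%+ of gross revenue attributable
    # to the sale of personal data.
    prong_b = or_consumers >= 25_000 and pct_revenue_from_data_sales >= 0.25
    return prong_a or prong_b
```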

What can a consumer request (and expect to receive) from the controller?  A controller is any person who, alone or together with others, determines the purposes and means of processing personal data.  A consumer may obtain from the controller confirmation of whether the controller is processing or has processed the consumer’s personal data, a list of the categories of personal data subject to processing, and a list of the specific third parties (as opposed to categories of third parties) to which the personal data has been disclosed.  In addition, a consumer may require a controller to correct inaccuracies in the personal data about the consumer, to delete personal data about the consumer, and to opt the consumer out of targeted advertising, the sale of personal data, and certain profiling.

Is there an opt-in requirement for sensitive personal data?  Yes. The VCDPA features the opt-in requirement as well, though SB 619 defines sensitive personal data differently.  The CCPA gives the consumer the right to limit the use or disclosure of sensitive personal information (to certain enumerated purposes). Colorado and Connecticut are also opt-in, while Utah is opt-out.

Is there a broad private right of action, available both on an individual and class-wide basis?  Yes, so long as the consumer suffers an ascertainable loss of money or property as a result of a controller’s violation.  For example, a failure to delete a consumer’s personal data could give rise to individual or class-wide claims, provided that there is an ascertainable loss.  The CCPA features a limited private right of action (on an individual and class-wide basis) for data breaches.  In contrast, the Virginia, Colorado, Connecticut, and Utah laws are enforceable only by the Attorney General of each state. 

Is there potential personal liability for a director, member, officer, employee, or agent of a controller in violation of SB 619?  Yes.  This is not found in any of the five existing omnibus state privacy laws.

Is there an ability to cure violations?  Maybe.  The Attorney General will notify the controller if the purported violation can be cured, and if curable, the controller will have 30 days to do so. 

Last, but not least, when would SB 619, if enacted, become operative?  There are three operative dates to note: July 1, 2024 (general, including the serving of an investigative demand by the Attorney General), January 1, 2025 (the date the Attorney General may first bring suit, subject to the possible 30-day cure period), and January 1, 2026 (private right of action).  In Oregon, the effective date is the date a bill becomes law; typically, a bill takes effect on the first day of the year following passage.  One or more operative dates can be used to delay the operation of one or more parts of a bill if administrative preparation is required.

Bills change, and SB 619 may be no exception.  Please stay tuned for updates.

Class Action Suits Targeting Biometric Information Continue to Seek Large Payouts

To say that class action litigation regarding the use or collection of “biometric information” – such as fingerprints, face records, or voice records – is expensive would be a gross understatement.  The damages sought, and sometimes recovered, in litigation under the Illinois Biometric Information Privacy Act and similar laws that impose statutory penalties can be truly shocking.  And, if a recent complaint under a Portland, Oregon city ordinance and a jury verdict in Illinois are indicative, there is no immediate sign that this trend will reverse.

First Ever Lawsuit Under Portland Facial Recognition Ordinance Seeks $10 Million

On December 1, 2022, plaintiffs Brian Norby and Jacqueline May filed the first ever lawsuit alleging a violation of the City of Portland, Oregon’s ban on the use of facial recognition technologies, Code of the City of Portland, Oregon, Chapter 34.10 (“Facial Recognition Ordinance”).  The Facial Recognition Ordinance became effective on January 1, 2021, and broadly bans the use of facial recognition technologies by private entities in places of public accommodation.  For an analysis of this ordinance, and how to comply with it, see our previous article here.

Norby and May allege that on January 5, 2021, days after the ordinance became effective, each visited the same convenience store owned by Jacksons Food Store (“Jacksons”) and each was subjected to the use of facial recognition technology in violation of a “statutorily protected right to privacy.”  Norby and May do not allege any other specific damages, and they allege that Jacksons used this technology in just three of its Portland, Oregon stores.  Nevertheless, Norby and May seek certification of a class action of all individuals “exposed to Facial Recognition Technologies when they visited Jacksons store location in Portland, Oregon.”  They then claim statutory damages of $1,000 per day for each violation of the Facial Recognition Ordinance.  In all, they estimate these amounts to be $10,000,000, and they also seek to recover attorneys’ fees and expenses.  
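Because the ordinance’s statutory damages accrue per person, per day, aggregate exposure scales quickly.  The following is a purely hypothetical sketch; the class size and exposure period are illustrative only, and the complaint does not break down how the $10,000,000 estimate was reached.

```python
# Hypothetical illustration of per-day statutory damages under the ordinance.
STATUTORY_DAMAGES_PER_DAY = 1_000   # $1,000 per day for each violation

class_members = 2_000               # illustrative assumption, not from the complaint
days_of_exposure = 5                # illustrative assumption, not from the complaint
exposure = class_members * days_of_exposure * STATUTORY_DAMAGES_PER_DAY
print(f"${exposure:,}")             # $10,000,000
```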

First Ever Class Action Jury Verdict under BIPA Results in $228 Million Award

If the damages in the suit under the Portland ordinance seem implausible, an Illinois jury and court, ruling on a suit under the Illinois Biometric Information Privacy Act (“BIPA”), suggest otherwise.  The BIPA regulates the collection and use of biometric information, and, in many ways, serves as a template for the Portland Facial Recognition Ordinance and other similar laws.  A discussion of this statute is available here.  Many suits have been filed under the BIPA, but it wasn’t until earlier this year that the first such case, Rogers v. BNSF Railway Co., went to a jury trial.

Rogers was first filed in April 2019 in Illinois state court and was removed to federal court in the Northern District of Illinois.  In the complaint, the plaintiff sought to represent a class of more than 44,000 truck drivers who visited facilities owned by BNSF, a national railyard owner and operator.  The suit alleged that BNSF required drivers who visited its railyards to provide biometric identifiers in the form of fingerprints and hand geometry to access BNSF’s facilities and violated BIPA by (i) failing to inform class members that their biometric identifiers or information were being collected or stored, (ii) failing to inform class members of the specific purpose and length of term for which the biometric identifiers or information were being collected, and (iii) failing to obtain written consent prior to collection.  BNSF noted that a third party, not BNSF, collected the information and that it had no knowledge of the collection.  The jury disagreed.  After a five-day trial and just one hour of deliberation, the jury found that BNSF recklessly or intentionally violated BIPA more than 45,000 times.  The court then applied BIPA’s damages provision and entered a judgment of $228 million, not including attorneys’ fees.  BNSF has stated it will appeal, and the parties have exchanged settlement proposals to resolve the matter.
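For context, the arithmetic behind the award is simple: BIPA provides liquidated damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation, and $228 million divided by $5,000 implies roughly 45,600 reckless or intentional violations.  A minimal sketch of that math follows; the violation count is inferred from the award rather than quoted from the opinion.

```python
# Hedged sketch of BIPA's liquidated-damages arithmetic (740 ILCS 14/20).
NEGLIGENT_PER_VIOLATION = 1_000   # $1,000 per negligent violation
RECKLESS_PER_VIOLATION = 5_000    # $5,000 per intentional or reckless violation

violations = 45_600               # inferred: $228,000,000 / $5,000
award = violations * RECKLESS_PER_VIOLATION
print(f"Judgment (excluding fees): ${award:,}")  # $228,000,000
```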

Any Recourse?

A recent case from the Ninth Circuit suggests that courts might narrow disproportionate awards of statutory damages in some cases.  In Wakefield v. ViSalus, a case brought under the Telephone Consumer Protection Act, the court held that an aggregate award that is “wholly disproportionate” and “obviously unreasonable” in relation to the goals of a statute and the conduct the statute prohibits can violate due process.  In these extreme cases, a damages award can, and should, be reduced.  But such a holding has not yet been applied to a case under a biometric statute, and its broader impact is unclear.

The best recourse is, therefore, proactive: any business that uses or collects biometric information should make sure that it understands and complies with the requirements of all relevant laws.  The consequences of failing to take these preventative measures could be significant.

Executives Personally Sued for Data Privacy Incidents

If you manage a company that collects and otherwise processes personal data (which is just about every company these days), you may need to protect your own pocketbook.  As governments across the globe continue to enact and enforce data privacy, data protection, and cybersecurity laws, as data becomes more readily available, and as the volume of incidents grows, individual executives and board members are being named personally in lawsuits for breaches of their fiduciary duties.

Individual executives and board members, thus, should address and manage any personal risk by, for example, (1) ensuring adequate insurance coverage, (2) remaining up to date on standards and protocols for corporate data security, and (3) creating clear data-privacy and protection roles and responsibilities in corporate governance documents.

The Current State of Corporate Data Protection

With the enactment and enforcement of data-protection laws such as the General Data Protection Regulation (GDPR) in Europe and data-privacy laws such as the California Consumer Privacy Act (CCPA), as well as various cyberbreach-response statutes, the legal risks related to personal data have ballooned.  Many data-related lawsuits are filed as putative class actions, which further increases the potential liability.  A quick search on Westlaw reveals over 2,000 cases related to data breaches, the oldest dating only to 2004.  And there is no sign that the number of incidents, or of the laws enacted to help prevent them, will decrease in the future.

History of Corporate Liability for Data

Other than the common law tort for invasion of privacy, most liability related to the collection and other processing of personal data is created by statute.  Typically, under those statutes, the company, alone, is directly liable if there is a mishandling of personal information resulting in damages.  Recently, however, plaintiffs also have started naming individual company executives and board members in lawsuits for their role in any mishandling of personal data, in an attempt to impose direct personal liability.

Courts are willing to impose personal liability on executives and board members under certain circumstances.

Although cases naming executives and board members personally for data incidents are somewhat new, courts have provided some guidance on how such claims might succeed.  For example, the Court of Chancery of Delaware recently dismissed one of these lawsuits and, in so doing, provided a roadmap for what a successful suit might look like.  In that case, Construction Industry Laborers Pension Fund v. Bingle, the plaintiffs sued various executives and board members of a software technology provider.  The plaintiffs sought to impose personal liability on those individuals for alleged breaches of fiduciary duties to the company.  Specifically, the plaintiffs alleged that the defendants had failed to “adequately oversee the risk to cybersecurity of criminal attack.”  The company’s governance documents included express references to both cyber and data security, all the way up to the board level.

Despite acknowledging that the executives and board members had duties to the company related to personal data, the Court refused to impose personal liability on those individuals.  That is because, the Court explained, the plaintiffs also needed to establish that the individual defendants (1) intentionally acted with a purpose other than that of advancing the best interests of the company; (2) acted with intent to violate positive law; or (3) intentionally failed to act in the face of a known duty, demonstrating a conscious disregard for their duties.

But because the plaintiffs in that case failed to establish any of those elements, the court dismissed the case.  Thus, if a plaintiff can prove that an executive acted in bad faith, by establishing one of these three factors, courts may be willing to impose individual liability on executives and board members for personal data incidents and violations.

Takeaways

To avoid or reduce the risk that individual executives and board members will be named personally and held liable in a personal-data lawsuit, companies should take the following general steps:

  1. Review and monitor all applicable data and cybersecurity laws to help ensure that the company is not affirmatively violating them.
  2. Review and revise corporate governance documents to help ensure proper oversight and monitoring of personal-data risks.
  3. Ensure proper cyber-risk training for all management and board members; ideally, have at least one board member with cybersecurity expertise.
  4. Review D&O (Directors & Officers) and other insurance policies to ensure that, to the extent possible, all executives and directors are indemnified for actions applicable to personal-data and cybersecurity incidents.

Taking these steps will help ensure that the company and its executives and directors provide sufficient data protections under the law, while also helping to protect those officers and directors from being exposed to individual and personal liability.

The Current State of General State Privacy Laws

It’s a great time to be a privacy attorney.  On October 17, 2022, the California Privacy Protection Agency (CPPA) released the next draft of the regulations under the California Privacy Rights Act of 2020 (CPRA) as well as a document explaining the proposed modifications.  Two days of public hearings were recently held on October 21-22, 2022.  Given the rather extensive proposed changes, it seems unlikely that these will be the final regulations.  The current draft of the regulations is 72 pages long.   Most of the CPRA provisions become effective as of January 1, 2023.  While CPRA enforcement does not begin until July 1, 2023, and then on a prospective basis, there is enough of a difference between the California Consumer Privacy Act of 2018, as amended (CCPA) and the CPRA (which amends the CCPA) to warrant the review of current processes, operations, and policies.  In addition, the 30-day cure period available under the CCPA disappears under the CPRA.  In short, there is some work to do, collectively.  In the meantime, and until June 30, 2023, the CCPA (including the existing regulations) is still enforceable.  Deep breath.

That Was Then

When I co-taught Comparative Privacy Law at a San Francisco Bay Area law school in Spring 2020, the landscape seemed much simpler.  On the European side, we had the General Data Protection Regulation (GDPR), some opinions (guidance) from the European Data Protection Board (EDPB) and many more from its predecessor, the Article 29 Working Party, and an ocean of case law.  The Weltimmo decision (C-230/14) was and remains one of my favorites.  Not only does it shed light on the concept of an establishment in a given country (Hungary), but it also teaches readers that many problems can be avoided by simply being responsive to, and not upsetting, customers.  On the US side, in terms of general state privacy laws during that time period, it was the CCPA.

This is Now

When I co-taught the course in Spring 2022, I focused on the US side, and in particular the CCPA, CPRA, the Virginia Consumer Data Protection Act (VCDPA), and the Colorado Privacy Act (CPA).  Each state in the union has its own data breach notification law; we touched on these generally.  We reviewed FTC settlements.  We touched on federal privacy laws, which are predominantly sectoral.  On March 24, 2022, the Utah Consumer Privacy Act (UCPA) was signed, becoming our nation’s fourth general state privacy law.  As an instructor, I could not resist presenting this new law to my students, whose heads were likely still spinning from the other privacy laws that I was teaching.  To my credit, I had the good sense not to include the UCPA on the final exam, which featured, of course, consumers in California, Colorado, and Virginia.  Public Act No. 22-15, entitled An Act Concerning Personal Data Privacy and Online Monitoring (CTDPA), was signed by the governor of Connecticut on May 10, 2022.  Luckily for my students, the semester was over, and a future cohort of students would need to show proficiency in understanding the metes and bounds of this new law.

Is a comprehensive federal privacy law in sight?  Maybe.  H.R. 8152 (the American Data Privacy and Protection Act, or ADPPA) was introduced on June 21, 2022, referred to the House Committee on Energy and Commerce, and advanced out of committee to the full House of Representatives by a 53-2 vote.  Since then, it appears to have stalled.  Under the current draft, the CPPA would have the authority to enforce the ADPPA.  Further, California Civil Code Section 1798.150 (the private right of action for data breaches under the CCPA, as amended by the CPRA) would not be preempted.

In the meantime, the VCDPA becomes effective on January 1, 2023, the CPA and CTDPA become effective on July 1, 2023, and the UCPA becomes effective on December 31, 2023.  Holistically and structurally, there are quite a few similarities among the VCDPA, CPA, UCPA, and CTDPA, with the VCDPA as the progenitor, although one should not assume that compliance with one ensures compliance with the others.  For example, all four use GDPR concepts and terms such as data controller (equivalent to a business under the CCPA/CPRA), data processor (equivalent to a service provider under the CCPA/CPRA), and so on.  As intimated, important differences exist among these laws.  For example, the UCPA applies to controllers and processors with at least $25 million in annual revenue that either (a) control or process the personal data of at least 100,000 consumers or (b) derive over 50% of their revenue from the sale of personal data and control or process the personal data of at least 25,000 consumers.  In contrast, the VCDPA applies the second part of the test, but not the first; there is no $25 million annual revenue threshold (see the sketch after this paragraph).  Further, while both the VCDPA and the UCPA define sensitive personal data (SPD), the UCPA requires notice and the right to opt out, while the VCDPA requires consent.  The VCDPA requires a data protection assessment for high-risk processing; the UCPA does not.  The VCDPA gives the consumer the right to correct inaccuracies; the UCPA does not.  Notably, it was not until the CPRA that California consumers were given this right.  Both laws are initially unfunded, with funding to come from enforcement actions.  Under the UCPA, once the balance in the “Consumer Privacy Account” exceeds $4 million, the balance is transferred to the general fund.  Neither has a private right of action; enforcement authority is vested solely in each state’s Attorney General.
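To make the UCPA/VCDPA applicability comparison concrete, here is a minimal, illustrative sketch of the two tests as paraphrased above.  The function names and inputs are ours, and the statutes’ actual definitions contain nuances this does not capture.

```python
def ucpa_applies(annual_revenue: float, consumers: int,
                 pct_revenue_from_data_sales: float) -> bool:
    # UCPA (as paraphrased above): $25M annual revenue gate,
    # plus one of two data-volume prongs.
    if annual_revenue < 25_000_000:
        return False
    return (consumers >= 100_000 or
            (pct_revenue_from_data_sales > 0.50 and consumers >= 25_000))

def vcdpa_applies(consumers: int, pct_revenue_from_data_sales: float) -> bool:
    # VCDPA (as paraphrased above): same data-volume prongs,
    # but no annual revenue gate.
    return (consumers >= 100_000 or
            (pct_revenue_from_data_sales > 0.50 and consumers >= 25_000))

# Example: a controller with $10M revenue and 150,000 consumers would fall
# outside the UCPA (revenue gate) but inside the VCDPA.
print(ucpa_applies(10_000_000, 150_000, 0.0))   # False
print(vcdpa_applies(150_000, 0.0))              # True
```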

What to do? 

Detailed charts (and re-reading each law a few times) help.  More helpful, however, is to view these laws holistically, preferably in the context of a comprehensive privacy compliance program.  Certainly, companies that had to comply with the GDPR were better positioned to comply with the CCPA, and companies that have to comply with the CCPA will be better positioned to comply with the CPRA and the VCDPA, CPA, CTDPA, and UCPA.  Each subsequent compliance project becomes a gap analysis followed by an implementation phase.  To that end, the focus should be on compliance building blocks that are generally required, or at least helpful, for compliance with any modern data privacy law.  These include records of processing activities (ROPAs), procedures for managing data subject requests (DSRs), procedures for managing data incidents, data processing agreements with suppliers, a process to vet suppliers for information security robustness and issues, a process to conduct data privacy impact assessments (DPIAs), internal policies, external notices, training, and so on.  Once the basic processes and documents are in place, adjustments happen in accordance with a crisp project plan covering objectives and detailing the individual tasks needed to accomplish them.  The process is iterative and, theoretically, less painful for each new general privacy law, until there is a comprehensive federal privacy law, of course.  Good luck!

Illinois Appellate Court: Statute of Limitations for Most Biometric Privacy Claims Remains at Five Years

In Illinois, the Biometric Information Privacy Act (“BIPA”) regulates the collection and use of “biometric information” such as fingerprints, facial images, and voice records.  It imposes significant penalties and has generated a cottage industry of class action litigation: hundreds of cases have been filed and millions of dollars in liability have been assessed.  It is also the best known and most heavily litigated of a slew of newly enacted, or soon to be passed, state and local laws aimed at regulating biometric information.

Many Illinois defendants had hoped that their liability under BIPA could be limited because, they argued, a one-year statute of limitations should apply to BIPA claims.  But in a recently issued decision, Tims v. Black Horse Carriers, Inc., 2021 IL App (1st) 200563, the Illinois Appellate Court rejected this position for the majority of BIPA claims.  It held that a five-year statute of limitations applies to the most frequently cited sections of the statute.

The Only Bi-Partisan Show in D.C.: The U.S. Supreme Court Issues a Decisive Opinion Concerning TCPA Liability in Facebook, Inc. v. Duguid, et al.

In a widely anticipated ruling, the U.S. Supreme Court today held that the mere fact that a business uses calling technology with the capacity to store and dial multiple numbers (such as a cell phone) does not automatically subject that business to Telephone Consumer Protection Act (“TCPA”) liability for calls (and texts) to consumers who have not given consent.

Beyond clarifying other aspects of what constitutes a robocall, this ruling is likely to limit the number of class actions brought against businesses under the TCPA.  Still, for businesses required to comply with consumer protection laws, obtaining and retaining evidence of consumer consent for calls and texts remains the primary way to limit risk.  Where businesses use vendors to administer call campaigns, we recommend discussing with those vendors the impact this decision may have on campaign practices.  As always, contacting experienced counsel to investigate whether creative steps can be taken to incorporate aspects of today’s ruling into your business relationships is a wise step to better protect your business.

In an 8-0 opinion, with Justice Alito concurring in the judgment to make the result unanimous, the U.S. Supreme Court reversed and remanded the Ninth Circuit’s decision in Facebook, Inc. v. Duguid, et al., Slip Op. No. 19-511, 592 U. S. ___ (2021).  In the context of the consumer protections ensconced in the TCPA, the Ninth Circuit had held that any company maintaining a database that stored consumer phone numbers, and that could be programmed to automatically call the numbers stored therein, was an operator of an “automatic telephone dialing system” (“ATDS”).  Among other things, the TCPA prohibits unsolicited telemarketing and other calls and text messages from users of an ATDS.  The Ninth Circuit’s conclusion created a rift: the TCPA’s definition of what constitutes an ATDS is narrower than the Ninth Circuit’s interpretation.  As Facebook pointed out to the Supreme Court, the Ninth Circuit’s interpretation not only appeared to ignore the TCPA’s complete definition of what constitutes an ATDS, it also exposed ubiquitous forms of technology previously untouched by the TCPA to that liability.

Seattle & Portland Virtual Cybersecurity Summit Begins Tomorrow

Join me, Stoel Rives’ Chief Information Security Officer (and Global Privacy & Security Blog® author) Jon Washburn, for a panel discussion in which I will partner with top industry CISOs and CIOs to address the most pressing cybersecurity challenges of 2021. Register now for free for the Seattle & Portland Virtual Cybersecurity Summit, March 31 and April 1, 2021, for CPE credits, educational briefings, and three amazing keynote presentations. For more info or to register visit: https://www.dataconnectors.com/events/2021/march/seattle-portland/?=affCISODM.

Don’t let Cyber Insurance be Your Cybersecurity Plan

In a recent letter to insurers, the New York State Department of Financial Services (“NYDFS”) acknowledged the key role cyber insurance plays in managing and reducing cyber risk – while also warning insurers that they could be writing policies that have the “perverse effect of increasing cyber risk.” If a cyber insurance policy does not incentivize the insured to maintain a robust cyber security program, the insurer can end up bearing excessive risk when the customer leans on the policy as their business continuity plan.

You may be wondering “What does this have to do with my business? I don’t do any business in NY state.” However, your insurer might be subject to the NYDFS cybersecurity regulation (23 NYCRR 500) and, if so, likely received this letter.

According to NYDFS, every cyber insurer should have a formal strategy that incentivizes their insureds – through more appropriately priced plans – to “create a financial incentive to fill [cybersecurity] gaps to reduce premiums.” Below is our take on five of the key practices outlined in the NYDFS letter that have potential implications for insureds.

  • Manage and Eliminate Exposure to Silent Cyber Insurance Risk. Up to now, many organizations have leveraged clauses in standard policies to cover ransomware attacks, such as those covering general liability, theft, malpractice and errors. NYDFS advises that “insurers should eliminate silent risk by making clear in any policy that could be subject to a cyber claim whether that policy provides or excludes coverage for cyber-related losses.”  When you next renew your policy, read the fine print carefully to determine if there are any exemptions for cyber-related losses – even if you have a standalone cyber insurance policy. An insurer that was left ‘holding the bag’ for covering a ransomware attack under a policy that wasn’t priced to cover cyber losses is incentivized to update that policy language at the soonest opportunity.
  • Evaluate Systemic Risk. Here, insurers are being advised to “stress test” their coverage to ensure they would remain solvent while covering potentially “catastrophic” cyber events impacting multiple insureds. If you are a cloud or managed services provider and/or are part of other organizations’ supply chains, you should expect to receive more scrutiny from your insurer on the strength of your cyber security program.
  • Rigorously Measure Insured Risk. No surprises here, unless you haven’t been filling out detailed questionnaires about your cyber security program. Expect more scrutiny of your program, and possibly the involvement of auditors to validate your claims. Check your insurance policy to see if investing in a certification program – such as ISO 27001 or HITRUST – might improve your policy premium.
  • Educate Insureds and Insurance Providers. This practice states that “insurers should also incentivize the adoption of better cybersecurity measures by pricing policies based on the effectiveness of each insured’s cybersecurity program.” Take advantage of any educational opportunities your provider offers on cybersecurity best practices and improvements. They might be trying to tell you how you can lower risk – and your rates.
  • Require Notice to Law Enforcement. While this is a best practice, NYDFS is recommending this be more formally required in the policy language.  Involving law enforcement is important when responding to cyber incidents, especially when it comes to investigating the incident and attempting to recover funds. Make sure you involve legal counsel and have a plan for engaging law enforcement in the event of a breach.

Even if your insurer hasn’t received this guidance, they are certainly aware that cyber risk, and the cost of underwriting cyber insurance, continue to increase.  With the cyber insurance market estimated to exceed $20 billion by 2025, and the risk that intermediaries – including insurers – can be liable for ransom payments made to entities sanctioned by the Office of Foreign Assets Control, business leaders should expect that their insurers will be more closely scrutinizing their cyber security plans and controls. Rebuilding encrypted systems and restoring from backup, as opposed to paying ransoms, will need to be the first plan of action.

If your organization is still struggling with the decision whether to invest more in IT security and architecture improvements or continue to rely on insurance as your cyber security plan, the guidance in the NYDFS Cyber Insurance Risk Framework merits a closer look.

While cyber insurance can be essential to helping your organization recover from a data breach, it should not take the place of a strong cybersecurity program.  At a minimum, your cybersecurity program should include a Cybersecurity Plan, a Business Continuity and Disaster Recovery Plan, and an Incident Response Plan.  These plans should be tested, reviewed, and updated at least annually, preferably in conjunction with a penetration test and vulnerability assessment from a qualified third party.

If you have any questions or would like any additional information about the topics raised in this post, please contact Hunter Ferguson, Jeff Jones or Jon Washburn.

Portland’s New Facial Recognition Ban Increases Litigation Risk, Creates Uncertainty

Is your business using or thinking of using facial recognition technology for activities in Portland, Oregon? Think again.

That’s the message to businesses operating in Portland in a new ordinance that broadly bans the use of facial recognition technology in the city, subject to certain exceptions.  The ordinance, which took effect January 1, 2021, bars private businesses from using automated or semi-automated processes to identify an individual by comparing an image of a person captured through a camera with images of multiple individuals in a database.  Due to the expansive language in the final version of the ordinance, routine business practices used to support or improve operations are no longer permitted.  For example, retailers may have previously used software that compares surveillance video images of individuals as they enter a store with a cloud-based photo database to identify suspected shoplifters.  The ordinance now prohibits the use of this software.

The law also has teeth.  It creates a private right of action, provides statutory damages of $1,000 per day for each violation, and allows for the recovery of attorneys’ fees.  Similar to other biometric privacy laws, this ordinance has the potential to trigger a wave of costly class action litigation and upend business operations.  This ordinance creates significant risk with the use of facial recognition technology, and organizations should proceed with this awareness.  The law also raises numerous unanswered questions, as noted below.

Digital Transformation – Cybersecurity Lessons from Recent Lawsuits

Digital transformation,[1] the process of leveraging technology, people and processes to innovate, requires an “all-in, ongoing commitment to improvement.”[2] But the main drivers of digital transformation – data and profits – don’t always mesh seamlessly.

As shown by recent class actions filed against Blackbaud and Morgan Stanley, and a settlement with the New York Attorney General by Dunkin’ Brands, digital transformation has numerous cybersecurity issues that present legal obligations and potential liability.

Blackbaud

In May, Blackbaud, Inc., a company that provides cloud software services to thousands of nonprofits, including hospitals, suffered a ransomware attack.[3]  In July, it began informing its users of the attack, many of whom used Blackbaud to process personal and sensitive information.

On August 12, the first of many lawsuits was filed against Blackbaud.  Among the allegations in the lawsuit, Blackbaud is accused of failing to properly monitor its computer network and systems, failing to implement policies to secure communications, and failing to train employees.

The five years prior to the attack are telling.  In that timeframe, Blackbaud underwent a digital transformation that involved acquiring numerous other software platforms, including a predictive modeling platform and a software provider focused solely on corporate giving.

Since the ransomware attack, Blackbaud has published cybersecurity improvements that support adherence to industry standards for incident management, employee training, systems and network testing, risk assessments, application security, encryption, and end-user authentication.[4]
