The Florida House of Representatives Has Resoundingly Passed HB 969

The Florida House of Representatives has resoundingly passed HB 969, a comprehensive consumer data privacy bill similar to California’s CCPA and Virginia’s VCDPA. HB 969 would give Florida residents broad rights to access, delete, and correct their personal information, to opt in or opt out of certain processing, and to stop the sale or sharing of that information. It would require businesses to post privacy policies, maintain written security programs and have procedures in place to comply when data subjects exercise their rights.

While the bill has the types of exemptions that have by now become commonplace (HIPAA, GLB and the like), it also has a few unusual elements, including a broad definition of biometric data that lacks the typical carve-out for biometric information data subjects voluntarily submit for testing and screening.

The bill also includes a private right of action for residents. If passed, the Florida law would have the most extensive private right of action of any currently in-effect comprehensive privacy law, including Europe’s GDPR.

HB 969 has now moved to the Florida State Senate. The Florida State Senate is also currently considering a related, narrower consumer data privacy bill, SB 1734, which does not include a consumer’s private right of action.

Is Illinois Moving Away from its Strict BIPA Law?


By now, you’ve probably heard of the Illinois Biometric Information Privacy Act (“BIPA”), even if it was just a message you received to the tune of “Facebook users in Illinois may be entitled to payment if their face appeared in a picture on Facebook after June 7, 2011.”

The law, the first in the country purpose-built to regulate only biometric information, is among the strictest biometric laws in the world right now. Among other things, it requires that data subjects be provided with notice and deliver a signed written release (as opposed to the more prevalent electronic consent) before facial recognition, fingerprints or other biometric features can be collected and used. That was the crux of the Facebook case, where the photo-tagging feature we all hate to love and love to hate resulted in a $650M class action settlement.

But the Illinois statute is not without its critics.

BIPA remains the only state law that allows private individuals to bring a suit and recover up to $5,000 in statutory damages (and much more if actual damages are proved) without having suffered anything approaching the harm required under other state privacy law regimes. As a result, with more than 200 class actions filed, many have expressed concern that BIPA has become good business for class-action attorneys, but bad business for actual businesses, especially Illinois’ small business community.

In an attempt to strike a new balance, on March 9, 2021, the Illinois House Judiciary Committee advanced House Bill 559 (“HB 559”), which would amend BIPA.

HB 559’s key amendments do the following:

  • permit notice of biometric data collection to be made specifically to affected data subjects, rather than generally to the public;
  • allow electronic consent to be used instead of written releases;
  • create a one-year statute of limitations (currently, there is no BIPA-specific statute of limitations);
  • require a 30-day notice and cure period before private actions can be brought;
  • allow an otherwise offending party to prevent suit by private parties, whether as individuals or via a class action, if the noticed violation is cured and certain other conditions are met (including the provision of written assurances);
  • implicitly require that actual damages be shown insofar as it would do away with liquidated (aka “statutory”) damages;
  • permit recovery of those actual damages by private individuals for negligent violations;
  • consolidate and raise the standard for enhanced damages from intentional or reckless to solely “willful”;
  • impose the same implied actual-damages requirement for willful violations as for negligent violations, but allow recovery of double damages from willful violators; and
  • provide that BIPA will no longer apply to union employees who are covered by collective bargaining agreements.

We will continue to monitor the status of HB 559 and keep a close eye on the legal landscape if the Bill becomes law. In the meantime, it is always a good idea to review the current law and ensure that your company’s practices are aligned.

Sic Semper Privacy: Virginia Becomes Second State to Enact Consumer Privacy Law

Well, we all knew it was coming.

The longer the US continues without a comprehensive federal privacy law to rival Europe’s GDPR, the more the individual states are going to move to fill the void—and also make sure they’re not completely outdone by California where the curtain’s already dropping on CCPA and everyone’s getting ready for the second act of CPRA. It is, however, a bit surprising to see Virginia beat the State of Washington as the next in line. But sure enough, on March 2, 2021, Virginia Governor Ralph Northam signed the Virginia Consumer Data Protection Act (“VCDPA”) into law. Like California’s CCPA replacement, the CPRA, Virginia’s VCDPA will take effect on January 1, 2023.

As with most similar laws, VCDPA gives consumers new rights to access and control the personal data that businesses collect about them. For businesses, the Virginia law imposes new obligations that include:

  • obtaining data subject consent in certain circumstances
  • implementing a security program
  • restricting sales of personal data
  • conducting data impact assessments
  • using specialized contract terms with third parties

The popular media has been quick to run with all manner of comparisons between California’s current law, the CCPA, and the VCDPA. While there are certainly similarities in a few key areas, a close read of Virginia’s new law suggests that the VCDPA resembles the GDPR at least as much as it does the CCPA.

For instance, while the VCDPA, like its California analog, requires detailed notices to data subjects, creates various data subject access rights and restricts the sale of personal information, the Virginia law’s third-party oversight, impact assessment and security program obligations are considerably more extensive than what is currently required in California and much more similar to the GDPR.

Over the course of the next few weeks, we’ll break down all the major elements of VCDPA. Today, we begin at the beginning with the basics of who and what are covered.

Who does the VCDPA Protect?

Similar to California, Virginia’s new act states that it protects Virginia “consumers.” As with California, however, the word consumer is a bit of a misnomer. Understood colloquially and under many other legal regimes, a consumer is typically a purchaser/user of goods and services. That’s not at all the case under the Virginia law. Under the VCDPA, the word “consumer” actually means “a natural person who is a resident [of Virginia][ . . . ] in an individual or household context [to the exclusion of purely business/employment contexts].”

Who must comply with the VCDPA?

The VCDPA covers all “persons” who either conduct business in Virginia or, in a standard similar to the GDPR’s, “target” residents of Virginia if, in both cases, those persons control, process or sell certain prescribed volumes of personal data in the course of a calendar year.

Like CCPA, the Virginia law has certain exemptions to who is covered. These exemptions are, however, notably different from CCPA. Both the California and Virginia laws have HIPAA and GLB exemptions. In California, those exemptions apply only to the affected data itself, not the overall business. In Virginia, the HIPAA and GLB exemptions read more broadly: if your business is governed by HIPAA or GLB, it is entirely exempt from VCDPA.

What is Protected?

The VCDPA protects “personal data” using a fairly straightforward, and by now familiar, definition. To wit: “information that is linked or reasonably linkable to an identified or identifiable natural person.” The list of what’s excluded from that definition is somewhat extensive, spanning some 18 separate items that include:

  • business data;
  • employment data;
  • de-identified data;
  • publicly available data;
  • HIPAA data (note this is separate from and appears supplemental to the exemption for entities governed by HIPAA);
  • human research, public health, patient safety and related data; and
  • data governed by various additional U.S. federal laws, including FERPA, which is the educational equivalent to GLB.

Notably, the business and employment data exemptions in Virginia are baked right into the language of the statute itself. In California, those exemptions exist, for now anyway, only by virtue of special ancillary laws having only temporary effect.

How is it Protected?

In our next installment, we’ll review how the VCDPA seeks to protect personal data and where its most extensive obligations can be found. In the meantime, remember our refrain about applying the Pareto Principle to data security and privacy (discussed here among other places). If you take the following steps, your compliance program will be ready for most of whatever Virginia, Washington, Minnesota or any other jurisdiction requires:

  • adopt a risk-based technical and administrative data protection program;
  • take the time to actually implement that program (“saying” it is one thing, “doing it” is another);
  • tell your employees and customers what you’re doing with the data you collect about them and why;
  • give your employees and customers some degree of access to, and autonomy over, that data;
  • keep a close eye on third parties (including vendors) with whom you share that data; and
  • respond swiftly to, and be honest with those affected by, unauthorized use if it occurs.

The European Commission Released a Draft Adequacy Decision for the United Kingdom


In case you’ve been busy dodging novel viruses and winter storms, here’s a recap of why that’s important (be forewarned, we’re oversimplifying and condensing quite a bit for brevity).

Among other momentous things that occurred in 2016, the UK voted to leave the European Union in what has been dubbed “Brexit.” Brexit became effective on January 31, 2020, and thereafter EU law and the EU Court of Justice (“ECJ”) no longer took precedence over British law and courts. To help ease the impact of that abrupt change, the UK Parliament passed the European Union (Withdrawal) Act 2018, which retains relevant EU law as domestic UK law.

For privacy and data security law purposes, the Withdrawal Act and related regulations did two key things:

  • First, they “froze” the GDPR in its then-current EU form as of January 31, 2020. That frozen “EU GDPR” version then applied to data received or transferred from before Brexit took effect through December 31, 2020.
  • Second, from December 31, 2020 onward, they made the GDPR part of domestic UK law and renamed it the “UK GDPR.”

The UK GDPR isn’t quite an exact replica of the “frozen” EU GDPR. For instance, it changes the governing and binding interpretive bodies from the European Commission and the ECJ, respectively, to the UK Secretary of State and the UK courts. The replacement of the ECJ with the UK courts means the UK GDPR will inevitably continue to diverge from the EU GDPR over time, though we suspect that on big issues (like Schrems II, which we explain here) the UK courts will follow the Swiss model of hewing closely to the ECJ.

So what does any of this have to do with an adequacy decision by the EU, you ask? Good question.

Recall that under the GDPR, personal data can only be transferred out of the European Economic Area in one of two ways:

  • through an approved mechanism under GDPR Articles 46 or 49; or
  • if the European Commission has deemed the privacy laws of the destination country to be “adequate.”

Since Article 46 mechanisms have been relentlessly (and successfully) attacked by Schrems and his aligned groups for over four years now, and Article 49 is largely untested, adequacy is far and away the preferred basis for transfer. Adequacy decisions are, however, very hard to come by. Up until now, only about a dozen have been granted.

To be sure, for companies governed by the GDPR who regularly move personal data to the UK, the failure of the UK to receive its own adequacy decision would be pretty burdensome. It would mean that long-standing personal data transfer practices would need to be entirely revisited, contracts amended and all manner of other compliance and operational impacts dealt with.

If, on the other hand, the UK receives an adequacy decision, things pretty much remain status quo ante for the foreseeable future. So while there are a few hurdles left before it becomes official, the fact that the EU has issued a draft decision this soon after the magic date of December 31, 2020, is a very good sign.

Watch this space. We’ll keep you updated.

New York Introduces Its Own Version of Illinois’ BIPA

In 2008, Illinois passed the Biometric Information Privacy Act, leading to over one thousand class action complaints between 2015 and 2020 alone. On January 6, 2021, the New York state legislature introduced Assembly Bill 27, titled the New York Biometric Privacy Act (“BPA”), which is nearly a carbon copy of the Illinois law.

New York’s BPA proposes to prohibit private entities from capturing, collecting, or storing a person’s biometrics without first implementing a policy and obtaining the person’s written consent. The New York BPA would provide identical remedies to the Illinois version: a private right of action with the ability to recover $1,000 for each negligent violation and $5,000 for each intentional or reckless violation, along with reasonable attorneys’ fees and costs.

While New York’s BPA is only proposed, if the language of the bill remains unchanged, New York companies can expect a similarly heavy flow of litigation. Companies operating in New York that utilize anything resembling biometric data should consider immediate steps toward prospective compliance. Companies should be auditing their practices and developing written procedures so that, in the event New York’s BPA passes as written, exposure is limited from the outset. The language of the bill provides that the BPA shall take effect ninety (90) days after becoming law. We will continue to monitor the progress of the proposed legislation as it moves through the Assembly and the Senate.

California Legislative Update: Prop 24

Apparently there’s some stuff going on with a couple of guys named Joe and Don that’s got everyone distracted for some reason. The cool kids know, however, that the most important thing to happen last night was the passage of Prop 24 in California which means the CCPA is old news and the CPRA is the new game in town.

You read that right. Having just (mostly) figured out what the implementing regulations should be for CCPA, a massive new privacy law that’s only been in effect since January, California voters said, “Eh, know what? Let’s do it all over again.”

We’ll let you get back to clicking around about this Joe and Don thing, but here’s a quick run-down of what the new CPRA adds to the CCPA:

  • specific third-party oversight responsibilities, similar to GDPR;
  • requirements for annual audits and regular risk assessments for certain businesses;
  • requirements when doing “profiling” that are in line with the GDPR;
  • an entirely new enforcement authority, the California Privacy Protection Agency;
  • an expanded private right of action to cover breaches of account access credentials;
  • increased penalties for mishandling of children’s data;
  • a consumer right to correct data; and
  • more specific data retention disclosures.

We’ll have more in-depth analysis and thoughts on readiness programs to come in the near future.

California Legislative Update

Just a quick legislative update from everyone’s favorite US privacy jurisdiction, California. Governor Newsom:

Signed AB 1281 – That Act extends the B2B and HR data exemptions under CCPA for another year. This is very good news.

Vetoed AB 1138 – That Act would have given CA a state analog to COPPA and required, among other things, parental consent before kids under 13 use social media. In his veto message, found here, Newsom said he based his decision on the same grounds on which many of us lawyers and privacy professionals had been criticizing AB 1138: COPPA already robustly occupies the field, and the FTC has an excellent track record of enforcement. A state law analog would have added nothing more than regulatory burden and cost amidst the already challenging pandemic economy.

Proposed Bill to Establish Security Standards for IoT Devices Used by Government Officials Passes House


For many, being able to securely connect, access, and move data across multiple devices is an integral part of everyday life. Some of our nation’s lawmakers want to ensure that the internet-connected devices they use meet the same cybersecurity standards the public has come to expect in the private sector. Lawmakers got one step closer to making that a reality this week.

Earlier this week, the U.S. House of Representatives passed House Bill 1668, the Internet of Things (IoT) Cybersecurity Improvement Act, which seeks to establish security standards for federal purchases of internet-connected devices and for the private sector groups providing such devices.

Currently, there is no national standard to ensure the security of internet-connected devices purchased by the federal government. Under the proposed law, these internet-connected devices, which would include computers, mobile devices and other devices that have the ability to connect to the internet, would now have to comply with minimum security recommendations issued by the National Institute of Standards and Technology (NIST). The bill does not lay out what those standards should be; rather, it tasks the Office of Management and Budget with overseeing that adopted IoT cybersecurity standards are in line with minimum information security requirements.

Devices covered under the bill

The bill would cover more than just the computers and smartphones used by federal government officials. The legislation defines a covered device as a physical object that is capable of regular connection with the Internet, or with a network that is connected to the Internet, and that has computer processing capabilities for collecting, sending or receiving data. It would not include personal cell phones or personal computers. It also exempts devices that are necessary for “national security” or “research purposes.”

Obligations on the private sector under bill

The bill would require contractors and their subcontractors that provide covered devices to the federal government to notify government agencies of any security vulnerabilities. While security standards are being considered, private sector providers, contractors and subcontractors can look to International Organization for Standardization (ISO) standards 29147 and 30111 for guidance, since the bill’s drafters explicitly cited them in the Act. There is also a process for companies to challenge whether their devices are covered under the bill.

Cyberthreat on IoT

The Mirai botnet attack in 2016 served as the impetus for the bill’s sponsors. Recall that the Mirai attack left millions on the East Coast, among other locations, without access to many popular websites for a few hours in late October 2016. The attack hijacked unsecured internet-connected devices and used them to overwhelm the infrastructure behind popular websites such as Twitter, Netflix and the New York Times.

While the Mirai attack was felt primarily by users of internet-connected computers, for many, including the IoT Cybersecurity Improvement Act’s sponsors, it showed just how debilitating an effect a cyber attack can have on a heavily connected life, and the havoc attackers can wreak through unsecured internet-connectable devices and the lives that depend on their functionality. Internet-connected devices, or IoT devices, are devices that can be controlled or accessed using the internet, including everything from webcams and baby monitors to gaming consoles, exercise trackers and programmable door locks. According to some estimates, there will be close to 75 billion IoT-connected devices by 2025. The IoT Cybersecurity Improvement Act would work toward ensuring that the government’s IoT devices, and the sensitive national data they handle, are secure.

Up next for the bill

After passing the House unanimously, the IoT Cybersecurity Improvement Act now heads to the Senate floor.

Up next for you

Gordon & Rees will keep an eye on cutting-edge developments in this space. We can expect similar regulations in the private sector with various guiding authorities, such as NIST, providing similar recommendations.


The more things change, the more they stay the same. On July 16, 2020, the Court of Justice of the European Union (“CJEU”) issued its decision in the so-called “Schrems II” case. If you need some background on the case, you can find our original blog post on it here.

The two main takeaways of the Schrems II decision are:

  1. The CJEU invalidated the EU-US Privacy Shield framework.
  2. The CJEU reaffirmed the validity of standard contractual clauses (“SCCs”).

While the validity of SCCs was upheld, and they remain a viable transfer mechanism, the CJEU’s holding requires businesses utilizing SCCs to analyze whether the destination country provides an adequate level of data protection. Where the country doesn’t, the business must provide additional safeguards or suspend the transfer. Similarly, EU data protection authorities must suspend or prohibit a transfer of personal data to a third country if the authority has determined that the SCCs cannot be complied with in that country and data protection cannot be ensured.

Recall that the Privacy Shield worked together in a closely integrated manner with the GDPR. It was not a separate law or a substitute for GDPR compliance. More specifically, and to use a bit of regulatory jargon (which we’ll leave unexplained for now in the interest of brevity), the Privacy Shield had served as what is known as a “partial adequacy decision” under GDPR Article 45. In short, then, what the CJEU has done in Schrems II is take the Privacy Shield, a proven, centralized system for regulatory oversight and enforcement on both sides of the EEA-US data transfer equation, and replace it with a system of self-policing by transferors and ad hoc decision making by local EEA authorities.

That’s all likely to work out about as well as it did in 2015, when the EU-US Safe Harbor was invalidated in the Schrems I case. Back then, data transfers continued (and even increased) through a two-year period of ambiguity, confusion and almost complete non-enforcement until the Privacy Shield went into effect to fill the void left by the CJEU’s invalidation of the Safe Harbor.

So what does all this mean for US businesses who had relied on the Privacy Shield?  Not much over at least the next week or two, and likely longer.  Contracting counter-parties in the EEA, rather than regulators, will be the most likely source of pressure to adopt the SCCs.  The U.S. Department of Commerce, for instance, issued a statement in response to the Schrems II decision informing US businesses that it intends to continue to operate for the time being as if the Privacy Shield remains in effect and, as such, the CJEU decision does not relieve participating businesses of their Privacy Shield obligations. 

If US and EU negotiators can’t work together to fix this soon, companies will need to start looking at alternatives to the Privacy Shield such as SCCs, binding corporate rules or the derogations under GDPR Article 49. Regardless of what happens as a result of Schrems II, US businesses that remember and practice our recurring mantra about applying the Pareto Principle to their data security and privacy compliance obligations will get through this fine. So if you haven’t already:

  • adopt a risk-based technical and administrative data protection program;
  • take the time to actually implement that program (“saying” it is one thing, “doing it” is another);
  • tell your employees and customers what you’re doing with the data you collect about them and why;
  • give your employees and customers some degree of access to, and autonomy over, that data;
  • keep a close eye on third parties (including vendors) with whom you share that data; and
  • respond swiftly to, and be honest with those affected by, unauthorized use if it occurs.

Learn more and contact the Gordon & Rees Privacy, Data & Cybersecurity practice group here.

The Third Annual Review on the U.S.-EU Privacy Shield Notes the U.S. Is Doing Well, Are You?

On October 23, 2019, the European Commission published a report on its third annual review of the Privacy Shield. The results are generally positive with no immediate risk to the Privacy Shield’s existence (as a regulatory matter) for at least another year. While you can read the full report here, the following serves as a brief summary, which will be reviewed in more detail in the weeks to come.

Recall that the Privacy Shield works together in a closely integrated manner with the GDPR. It is not a separate law or a substitute for GDPR compliance. More specifically, and to use a bit of regulatory jargon (we’ll leave unexplained for now in the interest of brevity), the Privacy Shield serves as what is known as a “partial adequacy decision” falling under Article 45 of the GDPR.

Per the US-EU bilateral agreement that resulted in the Privacy Shield, it is subject to annual review by the relevant authority in the EU. A bad review would be an existential threat to the Privacy Shield. Thankfully, that did not happen. It is important to note that this report is, of course, unrelated to the Schrems II case (which we posted on here) and its anticipated follow-on cases, which are likely to challenge the Privacy Shield judicially.

Since there’s a lot of confusion, even amongst some practitioners, about what the Privacy Shield is and how it fits in with GDPR, we always feel it’s a good idea to give a reminder whenever we post on the Privacy Shield. So here goes:

Under the Privacy Shield, U.S.-based companies who self-certify can lawfully receive GDPR-governed personal data from companies based in the European Economic Area. Equally as important, Privacy Shield self-certification signals to the marketplace that your company has what we refer to at the end of this post as the “Pareto Principle” of data security and privacy: policies, procedures and programs in place that are not only required by the GDPR but are fairly universal across global regulatory regimes. As a result, Privacy Shield self-certification is definitely a plus, but the lack of it is not fatal to your company’s ability to receive personal data from the EEA. If you aren’t Privacy Shield self-certified, it just means you can’t rely on GDPR Article 45 to receive personal data.

Instead, you have to look to GDPR Article 46. That Article enumerates a handful of mechanisms that also can be used to lawfully receive EEA personal data transfers. They range from the so-called Standard Contractual Clauses (which are currently under attack in Schrems II) to a costly and complex mechanism called Binding Corporate Rules.

The key takeaway from today’s report is this: for the third year in a row, the Privacy Shield has proven its viability. Becoming Privacy Shield self-certified is worth considering if your business requires regular receipt of GDPR-governed data. It also has some independent value beyond EEA transfers insofar as it shows your company’s security and privacy practices have at least some minimum level of maturity. As we all know and preach, it is essential in today’s global privacy evolution to ensure the development, implementation and continued monitoring and improvement of sound data security and privacy policies and practices.

Should you have any questions before our more detailed post is published, please contact Rich Green for more information.