The Third Annual Review on the U.S.-EU Privacy Shield Notes the U.S. Is Doing Well, Are You?

On October 23, 2019, the European Commission published a report on its third annual review of the Privacy Shield. The results are generally positive with no immediate risk to the Privacy Shield’s existence (as a regulatory matter) for at least another year. While you can read the full report here, the following serves as a brief summary, which will be reviewed in more detail in the weeks to come.

Recall that the Privacy Shield works together in a closely integrated manner with the GDPR. It is not a separate law or a substitute for GDPR compliance. More specifically, and to use a bit of regulatory jargon (we’ll leave unexplained for now in the interest of brevity), the Privacy Shield serves as what is known as a “partial adequacy decision” falling under Article 45 of the GDPR.

Per the US-EU bilateral agreement that resulted in the Privacy Shield, it is subject to annual review by the relevant authority in the EU. A bad review would pose an existential threat to the Privacy Shield. Thankfully, that did not happen. It is important to note that this report is, of course, unrelated to the Schrems II case (which we posted on here) and its anticipated follow-on cases, which are likely to judicially challenge the Privacy Shield.

Since there’s a lot of confusion, even amongst some practitioners, about what the Privacy Shield is and how it fits in with GDPR, we always feel it’s a good idea to give a reminder whenever we post on the Privacy Shield. So here goes:

Under the Privacy Shield, U.S.-based companies that self-certify can lawfully receive GDPR-governed personal data from companies based in the European Economic Area. Equally important, Privacy Shield self-certification signals to the marketplace that your company has what we refer to at the end of this post as the “Pareto Principle” of data security and privacy policies – procedures and programs in place that are not only required by GDPR, but are fairly universal across global regulatory regimes. As a result, Privacy Shield self-certification is definitely a plus, but the lack of it is not fatal to your company’s ability to receive personal data from the EEA. If you aren’t Privacy Shield self-certified, it just means you can’t rely on GDPR Article 45 to receive personal data.

Instead, you have to look to GDPR Article 46. That Article enumerates a handful of mechanisms that also can be used to lawfully receive EEA personal data transfers. They range from the so-called Standard Contractual Clauses (which are currently under attack in Schrems II) to a costly and complex mechanism called Binding Corporate Rules.

The key takeaway from today’s report is this: For the third year in a row, Privacy Shield has proven its viability. Becoming Privacy Shield self-certified is worth considering if your business requires regular receipt of GDPR-governed data. It also has some independent value beyond EEA transfers insofar as it shows your company’s security and privacy practices have at least some minimum level of maturity. As we all know and preach, it is essential in today’s global privacy evolution to ensure the development, implementation and continued monitoring and improvement of sound data security and privacy policies and practices.

Should you have any questions before our more detailed post is published, please contact Rich Green for more information.

Seventh Circuit Limits FTC’s Monetary Restitution Powers

The ability of the Federal Trade Commission (“FTC”) to obtain monetary restitution for consumers just suffered a major blow in the Seventh Circuit Court of Appeals. That federal appellate court ruled that Section 13(b) of the FTC Act authorizes the FTC to obtain restraining orders and injunctions, but does not authorize the FTC to obtain equitable monetary relief for consumers, including but not limited to ex parte freezes of e-commerce merchants’ bank accounts. Prior to this decision, courts had implied from Section 13(b) of the FTC Act that the FTC could obtain monetary restitution relief for consumers.

In this case (FTC v. Credit Bureau Center), the FTC showed the court that an e-commerce credit bureau retailer deceived consumers into enrolling in its service by posting misleading statements about receiving a “free” credit report (when in fact it was not free), thereby leading consumers into purchasing a recurring monthly credit monitoring service. The federal district court held that this e-commerce retailer had violated the FTC Act and other consumer protection laws, entered summary judgment in favor of the FTC, and ordered the retailer to pay equitable monetary relief to consumers. On appeal, the Seventh Circuit Court of Appeals affirmed the FTC’s power to obtain restraining orders and injunctions, but specifically ruled that because Section 13(b) of the FTC Act does not state that the FTC can obtain monetary restitution for consumers, the FTC cannot do so under Section 13(b).

This is a huge decision because under its current practices, the FTC may no longer be able to rely on Section 13(b) of the FTC Act to obtain monetary restitution for consumers arising from false and misleading statements, and deceptive acts or practices, e.g., ROSCA violations and data breaches. This decision by the Seventh Circuit (jurisdiction over the federal district courts in Illinois, Indiana and Wisconsin) is the first federal court of appeals decision to limit the FTC’s ability to obtain monetary restitution for consumers under Section 13(b) of the FTC Act, creating a circuit split among the federal appellate courts.

Given the huge impact that this federal appellate opinion has on the FTC’s monetary restitution powers, it is foreseeable that this decision will be appealed to the US Supreme Court, which will ultimately determine the scope of the FTC’s powers under Section 13(b) of the FTC Act. If the Supreme Court agrees with the Seventh Circuit, the FTC’s ability to obtain monetary restitution under Section 13(b) will be severely curtailed.

In the interim, expect the FTC to seek monetary restitution for consumers under other provisions of the FTC Act (e.g., Sections 5(m)(1)(B) and 19) and other statutes that the FTC administers and enforces.

For guidance through the legal and regulatory compliance land mines of FTC Compliance, ROSCA and data breaches, do not hesitate to contact Mark Ishman, a member of Gordon Rees’ Advertising and E-Commerce and Privacy, Data & Cybersecurity Practice Groups.

How Many Schrems Does It Take to Stop a Data Transfer?

The so-called “Schrems II” case was heard earlier this week. It’s impossible to give this topic the treatment it deserves in a single blog post. So for now, here’s a quick FAQ:

What’s this case about?

Collecting personal data from the European Economic Area (aka, the “EEA”) and transferring it to other countries is restricted by law. It can be done, but companies have to use certain statutorily prescribed mechanisms. Those, more or less, have been the rules of the game since at least 1995, continuing through today under the new GDPR, which you’ve probably heard a lot about.

The prescribed mechanisms have varied over the years, but one constant has been what are known as “Standard Contractual Clauses” or “SCCs.” SCCs are a set of data protection contract terms that have been pre-approved by the EU data protection regulators. In the “old days” (by which we mean the mid- to late 1990s) they were called “model clauses.”

If each of the EEA- and US-based counterparties to a data transfer transaction agrees to be bound by the SCCs, then an otherwise prohibited transfer becomes permissible.

In simplest terms, the Schrems II case is trying to stop companies from being able to do that. The plaintiff’s claim is that the SCCs are not valid under EU law because they fail to provide adequate levels of protection for personal data.

Why do they call it Schrems II?

Schrems is the surname of an EU-qualified attorney and political and privacy activist. He and the ecosystem of activist organizations around him are serial plaintiffs. This is their second (and definitely not final) attack on EU-US data transfers.

Back under the old 1995 law, one way to conduct a permitted personal data transfer was to use the EU-US Safe Harbor Framework. If a company took a couple of (pretty minimal) steps and signed up with the US Department of Commerce to be part of the Safe Harbor, it could receive personal data from the EEA.

Spurred on by the intelligence agency surveillance scandals that occurred during the Obama administration, Schrems, then a law student, brought a series of cases trying to invalidate the EU-US Safe Harbor. After a few procedural losses and a bit of forum shopping, he finally succeeded in 2015. That case became instantly known as “Schrems I” because Schrems and his supporters were already preparing their challenge to the SCCs. And, again, that’s exactly what’s happening now under Schrems II.

Didn’t the EU-US Privacy Shield replace the Safe Harbor?

Yes. A detailed analysis of the Privacy Shield (and its all-important relationship to the GDPR) is beyond the scope of this post, so here’s the summary version:

The Privacy Shield is considered a “partial adequacy decision” under GDPR Article 45. As such, it allows companies to collect/transfer EEA personal data to the US as long as the US-based recipient company is Privacy Shield self-certified.

But this case isn’t about the Privacy Shield (at least not nominally—more on that in a minute) or even GDPR Article 45. As stated in the prior two FAQs, this case is about one of the other prescribed mechanisms, the long-standing SCCs which have been in existence for nearly 25 years and today fall under the aegis of GDPR Article 46.

That said, while we’re still waiting on our own confirmation, it’s being reported by reliable news sources that, in open court this past Tuesday, Schrems’ lawyers asked the court to also invalidate the EU-US Privacy Shield—despite not having actually pled or argued for it previously (in fact there is an entirely separate case for that) and despite the fact that it derives from a statutory mechanism (GDPR 45) that is separate and distinct from the SCCs (which, again, are GDPR 46).

What happens if the European Court of Justice invalidates the SCCs?

Déjà vu all over again. Things will very likely look pretty much the same as they did in 2015 when the Schrems I court invalidated the Safe Harbor. Which means there will be a long interregnum during which there will be less regulation, more unfettered transfers and lots of confusion.

You see, like the too-clever-by-half Wile E. Coyote character of Warner Brothers cartoon fame, in the first case that bears his name, Schrems thought he was going to dynamite, and thereby halt, EU-US data transfers by invalidating the Safe Harbor. But in the end, the only thing that went up in smoke was his goal of halting those transfers.

Invalidating the Safe Harbor didn’t stop transfers out of Europe to the US at all. Instead, the result in Schrems I, combined with the already looming specter of Schrems II, led companies to conclude that European law was, to put it colloquially, a hot, unenforceable mess.

EU regulators, already under-staffed, under-funded and overwhelmed, were more or less paralyzed after Schrems I. So responsible, law-abiding companies had to more or less make it up as they went along. Most did their best to self-regulate and relied on SCCs. Others, knowing Schrems II was imminent and SCCs thereby in doubt, used ad hoc data export/import contracts. Meanwhile, the less law-abiding were all too happy to flout the spirit of the law entirely and did pretty much whatever they wanted with impunity.

That same environment of confusion and virtual lawlessness, rather than Schrems’ goal of stopping or better protecting US transfers, will play out again if the Schrems II court invalidates SCCs. It’ll happen a thousand-fold if the Schrems II court decides, sua sponte, to invalidate the Privacy Shield too.

What can we do now to prepare?

For starters, keep reading this blog! In addition to that, remember our recurring mantra about applying the Pareto Principle to data security and privacy compliance.

Sure, it’s true that there are variations between laws and some laws have real quirks (CCPA anyone?!). But it’s even more true that just about every data security or privacy law (from HIPAA to the NY Cyber-reg to GDPR) has the following (or a very similar) set of building blocks at its foundation:

  • adopt a risk-based technical and administrative data protection program,
  • tell your employees and customers what you’re doing with the data you collect about them and why,
  • give your employees and customers some degree of access to and autonomy over that data,
  • keep a close eye on third parties (including vendors) with whom you share that data, and
  • respond swiftly to, and be honest with those affected by, unauthorized use if it occurs.

So put that foundation in place, and check on it periodically, and you’ll be well on your way to achieving 80% compliance no matter what the Schrems II court decides.

New Massachusetts Law Creates More Stringent Notification Requirements for Data Breach Incidents

While we’ve all been busy keeping an eye on California’s CCPA mess and the brewing federal privacy legislation, Massachusetts enacted some amendments to its already stringent consumer-protection oriented privacy laws. (See MGL c.93H)

As a result of the amendments, effective April 11, 2019, Massachusetts’ consumer protection and breach notification statutes will include the following new requirements:

  1. Consent to Access Credit Reports – Before getting hold of a consumer’s credit report for most non-credit purposes, third parties must obtain the consumer’s consent. In the process, they also need to disclose the reason they’re seeking access.
  2. Security Freezes – Consumer reporting agencies can no longer charge a fee to consumers to place, lift, or remove a security freeze on their credit reports.
  3. Credit Monitoring Services – Companies experiencing a security breach involving social security numbers must offer affected MA residents free credit monitoring services for at least 18 months (or 42 months if the company is a consumer reporting agency). Additionally, companies that experience a security breach must file a report with the Attorney General and Department of Consumer Affairs and Business Regulation certifying their credit monitoring services comply with state law.
  4. No Waiver – Individuals affected by breaches can no longer be required to waive their private right of action as a condition to getting credit monitoring services.
  5. Breach Notice Obligations – Notice to the Attorney General and Department of Consumer Affairs and Business Regulation must include additional information such as the person responsible for the breach (if known), the type of personal information compromised, and whether the entity has a written information security program in place. Notice to consumers must include the name of the parent or affiliated corporation if the entity that experienced the breach is owned by another entity.
  6. No Delay in Notice to Residents – Notice to residents cannot be delayed on the grounds that the total number of residents affected has not been ascertained. If and when additional information is obtained, additional notice must be provided as soon as practicable and without unreasonable delay.

It’s not clear how these requirements will work in practice, but for those whose business activities expose them to Massachusetts law, existing incident response and management policies should be revisited by the end of March to make sure they reflect these new obligations.

Google Faces European Consumer Group Complaints Alleging GDPR Violations for Improper Collection of Location Tracking Data

On Tuesday, November 27th, consumer groups filed complaints with the data protection authorities in seven European countries, accusing Google of improperly collecting location tracking data in violation of the new General Data Protection Regulation (“GDPR”). The complaints, filed in the Czech Republic, Greece, the Netherlands, Norway, Poland, Slovenia, and Sweden, cite a study by the Norwegian Consumer Council that reviews the various methods used to track consumers’ locations when they use Google services on their smartphones.

Consumer groups claim that Google has been using a variety of techniques to “push or trick” users into allowing themselves to be tracked when they use Google services. These techniques include “withholding or hiding information, deceptive design practices, and bundling of services.” Complainants specifically allege that tracking is accomplished through the “Location History” and “Web & App Activity” features built into Google accounts, and these issues are particularly pronounced on mobile devices that use the Android platform.

The complainants go on to allege that Google does not have a valid legal basis for processing users’ location data and is processing personal information in violation of GDPR. Assuming Google will attempt to rely on consumer consent, complainants argue consent from Google users is inadequate because users are not given sufficient information to make informed choices, default settings are hidden, and users are repeatedly nudged to turn on features that track location.

In response to a request for comment, a Google spokesperson said: “Location History is turned off by default, and you can edit, delete, or pause it at any time. If it’s on, it helps improve services like predicted traffic on your commute.

“If you pause it, we make clear that—depending on your individual phone and app settings—we might still collect and use location data to improve your Google experience.”

These recent complaints are significant for several reasons. First, GDPR only recently took effect on May 25, 2018. Enforcement to date has been limited and there is little legal precedent that can be relied on to ascertain a potential outcome for these complaints. It is difficult to predict how the data protection authorities in these seven countries will respond.

Second, penalties for violations of GDPR are prohibitive. Current regulations provide for fines of up to 4% of global annual revenue, so Google, and its parent company Alphabet, could face fines in the billions of dollars.

Third, Google is facing lawsuits in United States federal court over the same location tracking data. The plaintiffs in those suits allege Google continued to track users’ locations through their phone, even after location tracking features were disabled. Google has filed a motion to dismiss, which will be heard in January 2019. It is unclear what impact, if any, these new complaints in the European Union will have on ongoing US litigation, and vice versa.

Three Key Requirements Imposed by Colorado’s New Consumer Data Privacy Statute

Be careful what you ask for (and maintain) about Colorado residents…especially if you don’t have the proper data security policies in place. On September 1, 2018, Colorado’s new privacy law, HB 18-1128, goes into effect, imposing new requirements on any business or government entity that maintains, owns, or licenses personal identifying information about Colorado residents.

The new law imposes three key requirements on businesses subject to the rule:

  1. Reasonable security procedures and practices must be implemented that are proportionate to the nature of the personal identifying information maintained and the nature and size of the business’s operations.
  2. Written policies for the destruction and proper disposal of paper and electronic documents containing personal identifying information must be developed.
  3. Breach notification procedures must be followed, including adhering to a 30-day time period by which notification must be completed.

Businesses that do not already have written data disposal and security policies should act quickly to ensure that they are compliant with the nuances of the new law.

Colorado’s breach notification requirement imposes a more aggressive requirement for notifying affected residents than requirements under the Health Insurance Portability and Accountability Act (HIPAA) and virtually any other U.S. state. A business must provide written notification with certain information to affected residents in the most expedient time possible and without unreasonable delay, but not later than 30 days after the point in time when there is sufficient evidence to conclude that a security breach has occurred. For breaches believed to have affected 500 or more residents, businesses must notify the Colorado Attorney General; for breaches affecting 1,000 or more residents, they must also notify certain consumer reporting agencies.

Reflective of the shift towards providing consumers with more control over their personal information, the bill is codified under the Colorado Consumer Protection Act (CCPA) and potentially creates a private right of action against businesses that misuse a resident’s information. CCPA causes of action oftentimes include assertion of a right to treble damages and reasonable attorneys’ fees. Additionally, the Colorado Attorney General may bring civil, or in some cases criminal, actions for violation of the law.

The frequently unforgiving nature of civil monetary penalties imposed by the HHS Office of Civil Rights (OCR) for HIPAA violations should be cautionary. But, not only is there great risk of exposure for unprepared or noncompliant businesses facing enforcement by state and federal regulatory agencies, now more than ever, individual or class action liability seems to be on the horizon. Last, but not least, businesses never envision themselves as “the ones” making headlines about their data breaches…until it happens…and happens quickly.

What if I already comply with other state or federal privacy laws?

The new law indicates that businesses already regulated by other state or federal law are deemed in compliance if they adhere to that regulator’s procedures for the protection and disposal of personal identifying information. If the business operates in interstate, international and/or online commerce involving Colorado residents, however, a thorough review of policies and procedures is recommended to ensure that the various applicable laws are reconciled.

Recommendations:

Businesses subject to the privacy law should take the following steps, at a minimum, to ensure that they are prepared to comply.

  1. Entities should know and map the flow of data both internally and outside of their business, whether in paper or electronic format. Inventories of hardware and other portable electronic devices where electronic media is stored should be routinely maintained.
  2. Employees must be routinely trained in policies. Handbooks should be updated, and businesses should assess whether to require nondisclosure and confidentiality agreements. Appropriate protocols for the destruction and disposal of personal identifying information must be implemented for current and departing employees.
  3. Third-party service vendors should be identified and communicated with regularly to obtain assurances of compliance. Contractual documents should memorialize vendors’ obligations.
  4. Businesses, including HIPAA covered entities, should rework their data breach policies and ensure that third-party vendor agreements or business associate agreements reflect Colorado’s more stringent breach notification timeline of 30 days.

Conclusion:

There is no uniform mechanism for determining how best to implement the necessary measures. Legal counsel specializing in data privacy and security law are instrumental resources when ensuring that adequate measures are taken to navigate compliance with state and federal laws, especially in today’s rapidly changing environment.

Trial and Error: VPN Continues to Disappoint

The last time I wrote, I said I would be trying Nord VPN to see how well it worked to allow me to access bank and office email when traveling. Today, I’ll tell you why I gave up using it. This may tell you more about me, however, than about Nord VPN. My primary reason for using a VPN was to be able to access bank sites from hotel rooms. (I’d hate to think the stock market fell and I couldn’t sweat the details that evening!)

I found it too difficult to use such sites after I logged in. Many times, my fix was to turn the VPN on to log in, then turn it off to download transactions into my financial software. Some banks regard the use of a VPN as a red flag for fraud, particularly if you appear to be logging in from a foreign country.

(I haven’t found that myself).

I looked on the internet to see what I could do and was disheartened by the complexity of it all.

Maybe I am spoiled by the ease of using an iPhone but I was hoping this would work without having to troubleshoot settings.

Bottom line: VPNs do not appear to be a ready and easy way to safely use unprotected Wi-Fi connections. Your cellular phone connection is safe.

(I sure hope so.)

If you can’t use your laptop via cellular, you can use your phone to change your password, use your laptop on the unsecured network, then use your phone to change the password back.

(Or am I missing some other problem?)

California’s Mini-GDPR? The Newly-Enacted California Consumer Privacy Act of 2018

On June 28, 2018, California passed the so-called California Consumer Privacy Act of 2018 (“CCPA”), changing the landscape of privacy laws and compliance for many years to come. The new law gives Californians more control over the information businesses collect on them, and imposes new requirements and prohibitions on businesses. Non-compliance with and violations of the CCPA will also expose businesses to penalties and, because the CCPA provides for a private right of action, the risk of private law suits.

Effective Date:

The new law (full text available here) goes into effect on January 1, 2020.

Potential Liability:

The CCPA is similar to Europe’s General Data Protection Regulation (“GDPR”), which went into effect on May 25, 2018. Much like under the GDPR, the cost of noncompliance can be staggering. The CCPA provides for statutory damages of $750 per consumer per incident (e.g., $750,000 for an incident involving 1,000 consumers) or actual damages, whichever is greater.

As for penalties assessed against businesses, the highest amount is $7,500 per violation, notwithstanding penalties under California’s Unfair Business Practices Act. While at first the penalties and damages under the CCPA may seem minimal, they can add up to enormous amounts, depending on the number of violations, number of consumers, and the amount of actual damages.

What is “Personal Information”?

The CCPA derives from the California Constitution’s inalienable right of privacy. The Legislature reasoned that Californians’ ability “to control the use, including the sale, of their personal information” is fundamental to protecting their right of privacy. For purposes of the CCPA, “personal information” is defined as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household” such as name, internet protocol (IP) address, email address, postal address, driver’s license number, social security number, and passport information. Publicly available information (i.e., information lawfully made available by federal, state, or local government records) is expressly excluded from the CCPA’s definition of “personal information.”

What “Businesses” Are Covered?

The CCPA broadly applies to “businesses” that operate for profit and (1) have an annual gross revenue of more than $25 million, (2) buy, receive, sell, or share for commercial purposes the personal information of 50,000 or more consumers, households, or devices, or (3) derive 50% or more of their annual revenue from selling consumers’ personal information. The CCPA also applies to entities that share common branding with a qualifying “business” and that control or are controlled by that business.

Summary of Consumer Rights, and Business Requirements and Prohibitions:

The following pairs the CCPA’s most important consumer rights with the corresponding business requirements and prohibitions:

  • Consumers may request that a business disclose: (a) the categories and specific pieces of personal information that it collects about the consumers; (b) the categories of sources from which that information is collected; (c) the business purposes for collecting or selling the information; and (d) the categories of third parties with which the information is shared. Correspondingly, businesses are required to make disclosures about the information they collect and the purposes for which it is used.
  • Consumers may request that a business selling consumers’ personal information, or disclosing it for business purposes, disclose (a) the categories of information it collects, and (b) the categories of information and the identity of the third parties to which the information was sold or disclosed. Correspondingly, businesses are required to provide this information in response to a verifiable consumer request.
  • Consumers may opt out of the sale of personal information by a business. Correspondingly, businesses are prohibited from discriminating against a consumer for exercising this right, including by charging the consumer who opts out a different price or providing the consumer a different quality of goods or services, except if the difference is reasonably related to the value provided by the consumer’s data. However, businesses may offer financial incentives for the collection of personal information.
  • Businesses are prohibited from selling the personal information of a consumer under the age of 16, unless affirmatively authorized (known as “the right to opt in”).

The CCPA is considered one of the toughest data privacy laws in the United States and will dramatically impact how businesses handle data. A more detailed analysis of the CCPA, and how it may impact our clients will be published shortly. To be included on our distribution list, please contact Susan Orona. In the meantime, to get more information about the CCPA, including assistance on updating your processes to comply in advance of the January 1, 2020, effective date, please contact Andy Castricone, Craig Mariam or Christina Vander Werf.

What’s ‘Hot’ in GDPR this Week

Here’s a quick Friday afternoon post on five noteworthy developments the first week after GDPR go-live:

  • Surprising no one, Google figured out a way to monetize the GDPR through compliant ads https://tinyurl.com/y9xzxrhs
  • And just as unsurprisingly, Max Schrems figured out a way to monetize Google (and others) by suing for billions under the GDPR https://tinyurl.com/yazdrbg4
  • Japan took one step closer to getting an adequacy decision; we all knew this would progress post-GDPR, but what’s surprising is how fast (keep an eye on the PPC rulemaking) https://tinyurl.com/JapanEUadequacy.
  • Both the US Department of Commerce and the US Chamber of Commerce are picking fights with the European Commission over GDPR’s extra-territoriality and unintended consequences, among other things https://tinyurl.com/yb7kw8xl and https://tinyurl.com/y8vxqeg4
  • But one US Senator apparently thinks Commerce and the Chamber are getting it wrong and introduced a resolution to prove it https://tinyurl.com/y9xawr9c

GDPR Go-Live: The End of the Beginning

Today is May 25th. Unless you’ve been living in a cave without a hotspot for the last year, you know that means today is the go-live date for Europe’s new General Data Protection Regulation or “GDPR.” With its controversial extraterritorial reach, the GDPR has been causing much commotion around the world and along with that commotion, a whole lot of breathless hyperbole in the popular and professional trade media.

We haven’t done much writing on any of it in this space because, well, we’ve been busy doing GDPR preparedness work for our clients. And lots of it! (Article 28 anyone?) But the occasion of the go-live date has given us a brief respite, so here’s a quick rundown on what’s going to happen now that the law is finally in effect, and what to do if you’ve, well, done nothing so far.

What’s going to happen on May 26th?

We can say for certain that the sun is going to rise, the earth is going to rotate on its axis and life will go on. There’s been so much myth and hype about the GDPR it seems worth pointing all that out. More importantly, it’s also worth pointing out you’re not going to wake up tomorrow morning with the equivalent of a subpoena from an EU member-state data protection regulator in your mailbox. To date, more than half the EU member states have not adopted implementing legislation for the GDPR (which doesn’t affect its validity, but does raise questions about enforcement), and in a recent survey of most of the relevant regulators, about two-thirds said they won’t be ready to start enforcement activities any time soon. Among those regulators who do feel ready, most have stated publicly that there will be few fines in 2018 unless something is very wrong.

So come dawn tomorrow, things will feel an awful lot like any other Saturday. If your company’s been doing its GDPR homework for the last years/months that will be especially true. If not, then keep reading….

We haven’t done anything to prepare. Now what?

You have some work to do, and soon. That said, we’re calling for clients newly discovering GDPR to act with thoughtful urgency, not panic.

The first thing you’ll need to do is determine whether you’re subject to the GDPR. There are two ways that can happen: direct and indirect. If your company meets the requirements for being a data “controller” or “joint controller” that is “established” in the EU, it is directly covered. If it does not meet those requirements, but does meet the requirements of a data “processor,” it will be indirectly covered.

How Do We Know If We’re A “Controller” Who’s “Established” In the EU?

The language of the GDPR can make this a difficult question to answer, particularly with regard to the “established” element. There are, however, a few obvious tests. For instance, if you answer “yes” to any of the following questions you are likely directly covered:

  • do you have a physical presence in the EU?
  • do you have employees or paid contractors in the EU?
  • do you sell products that are designed to meet EU market requirements (220 volt products are a simple example)?
  • are any of your sales and marketing activities purposefully directed at the EU market? Some examples of being purposefully directed include if you:
    • have distributors/resellers in the EU
    • accept Euros or member state currency
    • translated your website, brochures, product manuals or other collateral or documentation into member state languages
  • do you monitor the behavior of customers based in the EU?
    Some examples of what it means to “monitor behavior” include:

    • use of technologies to track EU website users
    • using predictive analytics to anticipate buying patterns
    • operating affinity or loyalty programs in the EU
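
For readers who like to think in checklists, the screening questions above can be sketched as a trivial "any yes means likely covered" check. This is purely an illustrative aid (the question wording and function names are our own hypothetical shorthand), not legal advice or an authoritative coverage test:

```python
# Hypothetical sketch of the direct-coverage screening questions above.
# Answering "yes" (True) to any question suggests likely direct coverage.
# Question wording and names are illustrative only; this is not legal advice.

DIRECT_COVERAGE_QUESTIONS = [
    "Do you have a physical presence in the EU?",
    "Do you have employees or paid contractors in the EU?",
    "Do you sell products designed to meet EU market requirements?",
    "Are sales/marketing activities purposefully directed at the EU market?",
    "Do you monitor the behavior of customers based in the EU?",
]

def likely_directly_covered(answers):
    """Return True if any screening question is answered 'yes'.

    `answers` maps each question string to a boolean.
    """
    return any(answers.get(q, False) for q in DIRECT_COVERAGE_QUESTIONS)

# Example: a company whose only EU touchpoint is behavioral monitoring.
answers = {q: False for q in DIRECT_COVERAGE_QUESTIONS}
answers["Do you monitor the behavior of customers based in the EU?"] = True
print(likely_directly_covered(answers))  # True: one "yes" is enough
```

Note the disjunctive structure: unlike the processor test later in this post, a single “yes” here is enough to suggest direct coverage.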

It Looks Like We Are a Controller Established in the EU. Are we in Trouble?

Based on what the regulators are saying about enforcement, as long as well-planned steps are taken to immediately start a compliance program, your company will probably be ok in the very near-term. Below is a brief, simplified list of what you’ll need to accomplish for GDPR compliance:

  • identify and assess risks by personal data types
  • identify who you share personal data with and where it’s stored
  • determine which of the six lawful bases under GDPR Article 6 you are relying on to possess personal data
  • update public privacy policies and internal adverse event policies and procedures, especially regarding response and notification
  • be able to respond to requests from people whose personal data you hold (such as providing copies or erasing their data)
  • review/amend your vendor agreements and remediate any gaps between existing terms and those GDPR requires

We are not Directly Covered. How do we Determine if we are Indirectly Covered?

This analysis is a bit easier than the direct coverage analysis, but there are still many variations and nuances. The simplest case of indirect coverage is where you collect (via the phone, internet, etc.) personal data (which, be forewarned, is very broadly defined under the GDPR) from your customers’ employees, clients, etc. You will also be indirectly covered as a processor if all of the following are true:

  • your customers collect, for themselves or for their own upstream customers, personal data from employees, consumers or others in the EU,

then

  • send all or part of that personal data (again, broadly defined) to your company no matter where it’s located including in the United States,

and

  • you “process” it on behalf of that customer, noting that “processing” is also very broadly defined to include recording, organizing, structuring, storing, transmitting, adapting and the like.
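
Unlike the direct-coverage questions, this is a conjunctive test: all three conditions must hold. A minimal sketch (parameter names are our own hypothetical shorthand, not GDPR terms of art, and this is not legal advice):

```python
# Hypothetical sketch of the three-part processor test above.
# Indirect (processor) coverage requires ALL three conditions to be true,
# in contrast to the any-one-"yes" direct-coverage screen.

def likely_indirectly_covered(customer_collects_eu_data: bool,
                              data_sent_to_you: bool,
                              you_process_on_their_behalf: bool) -> bool:
    """Return True only if every condition in the conjunctive test holds."""
    return (customer_collects_eu_data
            and data_sent_to_you
            and you_process_on_their_behalf)

# A customer collects EU personal data and sends it to you, but you never
# "process" it on their behalf: the conjunctive test fails.
print(likely_indirectly_covered(True, True, False))  # False
```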

We Are Indirectly Covered, What Do We Need To Do?

As with companies who newly discover they are directly covered, if you’re indirectly covered it’s time for thoughtful urgency, but not panic. As an indirectly covered entity, your company’s GDPR obligations will come in the form of so-called “flow-downs” from the obligations that directly covered entities have with respect to their vendors, agents, and sometimes even their affiliates, known under GDPR as “processors.”

Directly covered entities do have a small degree of latitude in determining which obligations to flow-down and how to do so, based on the nature and types of work you do for them. At a minimum, however, a directly covered entity will require you to enter into a written contract, or if you already have one, add an addendum, under which the directly covered entity “instructs” you in what elements of their personal data you can process and the scope of your authorization to do so.

You also should expect directly covered entities to impose most of the following obligations on you (at least some of which you may be able to satisfy if you are ISO 27001 certified or receive unqualified SOC 2, Type 2 reports):

  • not subcontract without their consent
  • obtain confidentiality commitments from employees who are directly involved with the “processing” for that covered entity
  • implement data security safeguards to protect their personal data (which may include encryption)
  • assist them in meeting their own GDPR obligations to provide data subjects with access to their data and the right to have it deleted

Some processors choose to be proactive and send their own form of Data Protection Agreement or GDPR policy statement to their customers. This can be a viable strategy, but should be assessed on a case-by-case basis.