Seventh Circuit Applies Spokeo and Requires Actual Injury to Establish Article III Standing in FACTA Case

On December 13, 2016, the Seventh Circuit Court of Appeals became the first post-Spokeo circuit court to address the issue of Article III standing in a putative class action brought for an alleged violation of the Fair and Accurate Credit Transactions Act (“FACTA” or “the Act”), 15 U.S.C. § 1681c(g), which is itself an amendment to the Fair Credit Reporting Act (“FCRA”). Generally, FACTA prohibits a vendor or retailer who accepts a credit or debit card as a means of payment from printing more than the last five (5) digits of the card number or the expiration date upon any receipt provided to the cardholder at the point of the sale or transaction. 15 U.S.C. § 1681c(g)(1). Willful violations of the Act could subject a defendant to any actual damages sustained by the consumer, or statutory damages of not less than $100.00 and not more than $1,000.00. 15 U.S.C. § 1681n(a). The Act also provides for the potential recovery of punitive damages along with reasonable attorney’s fees and costs. Id. Thus, per the plain language of the statute, actual damages are not necessarily a precondition for a FACTA suit. Aggregated statutory damages in a class claim, as one might imagine, could prove ruinous for a defendant.
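To make the truncation requirement concrete, the sketch below shows one way a point-of-sale system might mask card data before printing a receipt. It is only an illustration of the rule described above, not language from the statute or any court; the function names and receipt format are hypothetical.

```python
def mask_card_for_receipt(card_number: str) -> str:
    """Mask a card number for a printed receipt, keeping at most the last five digits."""
    digits = [c for c in card_number if c.isdigit()]
    visible = "".join(digits[-5:])  # show no more than the last five digits
    return "*" * (len(digits) - len(visible)) + visible


def format_receipt_card_line(card_number: str) -> str:
    # The expiration date is deliberately left off the receipt entirely,
    # since FACTA also bars printing the card's expiration date.
    return f"CARD: {mask_card_for_receipt(card_number)}"


print(format_receipt_card_line("4111 1111 1111 1111"))  # CARD: ***********11111
```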

In Spokeo, Inc. v. Robins, the United States Supreme Court held that a plaintiff could not establish Article III standing by relying solely on a “bare procedural violation” divorced from any real-world harm, because “Article III standing requires a concrete injury even in the context of a statutory violation.” In the months since Spokeo was decided, however, district court decisions as to whether a plaintiff has standing to bring actions premised upon statutory violations alone have been far from consistent.

In Meyers v. Nicolet Restaurant of De Pere, the Seventh Circuit dismissed a plaintiff’s putative class claim for lack of Article III standing because he sought only the damages statutorily provided for under FACTA. More specifically, Mr. Meyers alleged that, after dining at Nicolet Restaurant of De Pere, he was given a receipt that did not truncate the expiration date of his credit card. He subsequently filed suit on behalf of all customers who had similarly been provided with receipts that did not comply with FACTA’s requirements. Although Mr. Meyers conceded that he sought only statutory damages, he argued that he had standing because, in enacting FACTA, Congress granted him the legal right to receive a receipt that truncated his credit card’s expiration date. The Seventh Circuit disagreed, finding it significant that Mr. Meyers discovered the violation immediately and that no one else ever saw the noncompliant receipt. The court found it difficult to imagine how the presence of the expiration date could have increased the risk that Mr. Meyers’s identity would be compromised, and accordingly held that, without a showing of injury apart from the failure to truncate a credit card’s expiration date, the injury-in-fact requirement of Article III could not be satisfied.

While district courts continue to interpret Spokeo in cases implicating various “no-injury” consumer and privacy statutes, this decision gives defendants additional grounds on which to move for dismissal. Conversely, plaintiffs are sure to use it as a roadmap for creatively tailoring pleadings to establish an injury in fact.

The Seventh Circuit’s opinion in Meyers v. Nicolet Restaurant of De Pere can be found here.

Arizona Voter Registration Database Hacked via Email Designed to Look Like It Came from an Employee

In this contentious election year, foreign hackers have taken a keen interest in the U.S. electoral system. Perhaps most memorable was this summer’s high-profile assault on Democratic National Committee computers, which exposed a number of unsavory emails and forced DNC Chairwoman Debbie Wasserman Schultz to step down. But state voter registration databases have also become popular targets for hackers looking to disrupt confidence in this year’s elections; over two dozen states have seen some form of cyberattack on their election systems this year. An apparent hacking attempt in June 2016 caused Arizona’s voter registration system to shut down for almost a week while state and federal officials investigated the source of the hack. The FBI later attributed the breach to Russian hackers.

Speaking at the Cambridge Cyber Summit this month, Arizona Secretary of State Michele Reagan revealed that the malware was traced to a highly sophisticated email designed to look like it came from an employee. Hackers used the email to obtain the username and password of a single election official, giving them access to Arizona’s entire voter registration database, which houses the personal information of more than four million Arizona residents. According to Secretary Reagan, election officials have taken several steps to protect Arizona’s election system from additional cyberattacks, including requiring employees to adopt new, stronger passwords and multifactor authentication. Although Secretary Reagan has been adamant that hackers did not gain access to any mechanism for tallying votes, the mere possibility that election results could be compromised may be enough to cast doubt on this election, which some (including one major party candidate) have already alleged is “rigged.” This latest revelation from Arizona officials serves as yet another example of the importance of creating a culture of data security in the workplace and training employees, in all industries, to recognize the signs of fraudulent emails.

See Secretary of State Reagan’s complete interview here.

Failure to Update Business Associate Agreement Leads to Health System’s Settlement with OCR

A hospital’s breach notification to the Department of Health and Human Services, Office for Civil Rights (“OCR”), led to a Resolution Agreement, a $400,000 payment, and a Corrective Action Plan for an east coast health system. On September 23, 2016, OCR issued a press release advising that Women & Infants Hospital of Rhode Island (“WIH”), a member of Care New England Health System (“CNE”), notified OCR of a reportable breach in November of 2012, stemming from its discovery that unencrypted backup tapes containing electronic Protected Health Information (“PHI”) were missing from two of its facilities. CNE provides centralized corporate support, including technical support and information security for WIH’s information systems, to the covered entities under its common ownership and control, acting as their business associate. Although WIH had a business associate agreement (“BAA”) in place with CNE, it dated from March of 2005 and had not been updated following implementation and enforcement of the HIPAA Omnibus Final Rule.

OCR’s investigation of WIH’s HIPAA compliance program, triggered by the report of the missing tapes, uncovered the outdated BAA. WIH updated its BAA on August 28, 2015, as a result of OCR’s investigation. OCR then determined that from September 23, 2014, the date enforcement of the Final Rule began, until August 28, 2015, WIH impermissibly disclosed the PHI of at least 14,004 individuals to its business associate when WIH provided CNE with access to PHI without obtaining satisfactory assurances, in the form of a written business associate agreement, that CNE would appropriately safeguard the PHI. The settlement was reached without any admission of liability by CNE or WIH.

The settlement is a jolt to many covered entities and their business associates for a number of reasons. The key take-aways are:

(1) OCR’s action suggests that a well-worded BAA, in which the business associate agrees to abide by the specifications required by the Privacy and Security Rules, is sufficient to satisfy the covered entity’s obligation to obtain “satisfactory assurances” that the business associate will appropriately safeguard the PHI, meaning the often lengthy and burdensome security questionnaires or audits business associates are asked to complete may be unnecessary and not required.

(2) Documentation of intent and action, including policies, procedures and BAAs, is extremely important in establishing HIPAA compliance. Here, the fact that the mistake occurred, i.e., that tapes went missing, is being treated as the result of the absence of a written agreement, justifying the enforcement action, when in reality it is likely, or at least conceivable, that human error, inadvertence or lack of attention was the root cause and the loss could have occurred even if an updated BAA had been in place and followed.

(3) Policies, procedures and continuous training and retraining of the workforce handling PHI are imperative to a successful HIPAA compliance program, and remain on the radar of any OCR investigation.

A copy of the Resolution Agreement and Corrective Action Plan may be found on the OCR website at http://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/wih.
OCR’s sample BAA may be found at http://www.hhs.gov/hipaa/for-professionals/covered-entities/sample-business-associate-agreement-provisions/index.html.

Privacy of Nonparty Patients

The public has a right to every man’s evidence, unless that evidence is protected by a constitutional, common-law, or statutory privilege. How should this doctrine apply where a litigant seeks discovery of the identity of a nonparty patient who may have been a witness to negligence or malpractice? At what point is the right to evidence trumped by a patient’s right to privacy? When addressing such questions, courts distinguish the situation where disclosure of a nonparty patient’s identity would reveal nothing more than the fact that the person was a patient from the situation where such disclosure would reveal the nature of the person’s ailment or treatment.

Thus, an Arizona court allowed discovery of the identity of a hospitalized patient who may have witnessed events relevant to a malpractice claim brought on behalf of his hospital roommate. The court allowed such discovery on the basis that revealing that a person was a patient in a particular hospital room on a particular day would not reveal anything of importance about the nature of his ailments or treatment.1 Along similar lines, a New York court allowed discovery of the identities of nonparty patients in an emergency room because, given the wide range of services and medical conditions treated in an emergency room, disclosure of their identities would not violate their right to keep their personal health information confidential.2

In contrast, a New York court did not allow discovery of the identities of patients in a cardiac rehabilitation center who may have witnessed an injury that was the subject of a lawsuit.3 The court refused such discovery because it necessarily would have revealed the nature of their ailment: it would have revealed “that they were undergoing treatment for cardiac-related conditions.” One might expect a court following this reasoning to bar discovery of a nonparty patient’s identity if disclosure would reveal that the patient was receiving treatment in a particular part of a hospital (such as cancer radiation) or was hospitalized in a facility that provides a particular kind of care (such as a cancer or orthopedic specialty hospital).
_______________________________________________________________________
1 Carondelet Health Network v. Miller, 221 Ariz. 614, 212 P.3d 952 (App. 2009).
2 Rabinowitz v. St. John’s Episcopal Hospital, 24 A.D.3d 530, 808 N.Y.S.2d 280, 282 (2005).
3 Gunn v. Sound Shore Med. Ctr., 5 A.D.3d 435, 772 N.Y.S.2d 714, 715 (2004).

OCR Provides Further Clarification on Charging Flat Rate for Copies of PHI

The Office for Civil Rights (OCR) at the Department of Health and Human Services recently provided further clarification about the amount that an individual may be charged for a copy of his or her protected health information (PHI). After releasing guidance earlier this year about individuals’ rights under HIPAA to access and obtain a copy of their health information, OCR responded to questions it received about that guidance. In a new frequently asked question, OCR clarifies that $6.50 is not the maximum amount that can be charged to provide individuals with a copy of their PHI. Rather, OCR states that charging a flat fee of $6.50 is an option available to covered entities (or business associates acting on their behalf) that do not want to calculate the allowable fees for providing individuals with copies of their PHI as provided by the Privacy Rule.
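By way of rough illustration only, the sketch below contrasts a calculated cost-based fee with the optional flat fee. It reflects our reading of the guidance (labor for copying, supplies, and postage being the cost categories the Privacy Rule permits), but the helper function, variable names, and sample figures are hypothetical.

```python
FLAT_FEE = 6.50  # optional flat fee for electronic copies of electronically maintained PHI


def cost_based_fee(labor_minutes: float, hourly_rate: float,
                   supplies: float = 0.0, postage: float = 0.0) -> float:
    """A reasonable, cost-based fee: labor for copying plus supplies and postage."""
    return round(labor_minutes / 60 * hourly_rate + supplies + postage, 2)


# An entity that does not want to calculate per-request (or average) costs
# may instead charge the flat $6.50 fee for an electronic copy.
calculated = cost_based_fee(labor_minutes=20, hourly_rate=18.00)  # -> 6.0
print(f"Cost-based fee: ${calculated:.2f}; flat-fee option: ${FLAT_FEE:.2f}")
```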

Arizona Anesthesia Group Notifies 882,590 Patients of Data Breach

Valley Anesthesiology and Pain Consultants (“VAPC”), a physician group of more than 200 anesthesiologists and pain management specialists with several locations near Phoenix, Arizona, began notifying patients on August 11, 2016, of a potential data breach involving protected health information (“PHI”), despite the fact that its retained forensic consultant found no evidence that the information on the computer system was accessed. However, the consultant was unable to definitively rule out access after its investigation, and it did confirm that an individual gained entry to a system containing PHI. The physician group elected to take the proactive route of notifying affected individuals. The forensic firm was apparently called in shortly after VAPC learned, on June 13, 2016, that a third party may have gained unauthorized access on March 30, 2016, to VAPC’s computer system, which contained records of 882,590 current and former patients, employees and providers.

On its website, VAPC says it values its relationship with patients and therefore decided to mail notification letters. Law enforcement was also advised, and a dedicated call center has been set up to answer patients’ questions. Patients have been advised to review the statements they receive from their health insurer and to advise the insurer of any unusual activity. The computer system accessed is believed to have contained patient names, limited clinical information, names of health insurers, insurance identification numbers, and in some instances, Social Security numbers (“SSNs”). No patient financial information was included in the computer systems. For providers, the information included credentialing information such as names, dates of birth, SSNs, professional license numbers, DEA (Drug Enforcement Administration) and NPI (National Provider Identifier) numbers, as well as bank account information and potentially other financial information. The employee records on the system included names, dates of birth, addresses, SSNs, bank account information and financial information. Individuals whose SSN or Medicare number was exposed are being offered credit monitoring and identity theft protection services.

The circumstances of the incident illustrate the quandary created by the presumption that an incident is a reportable breach if the entity cannot prove there was no access to the information, as well as the interplay between the HIPAA Security Rule and the Privacy Rule. Here, it was apparently established that the system’s security was breached, but it was unclear whether protected health information was actually accessed once the unauthorized individual was in the system.

More information is available on VAPC’s website: https://valley.md/securityupdate.

EU-US Privacy Shield – How to Opt In and Self-Certify

The Privacy Shield provides a means to transfer EU personal data in accordance with certain EU data privacy principles.

As of August 1, 2016, US companies may self-certify as a means of complying with EU data protection laws when transferring EU personal data from the EU to the US. (For background information on the EU-US Privacy Shield, see our March 2016 article.)

Companies should consider self-certifying to the Privacy Shield if they wish to minimize their exposure to liability on many fronts, e.g., regulatory compliance with the EU Data Protection Directive and with federal and state laws, and to reduce the risk of data breach and regulatory compliance litigation. Additionally, by operating in accordance with these data privacy principles, companies will build goodwill with their consumers and business partners.

Pre-Certification Assessment/Audit

Prior to self-certifying, companies need to conduct a self-assessment/audit to determine whether their current business practices meet the minimum standards set forth in the Privacy Shield framework. There will likely be some work involved for most companies to self-certify to the Privacy Shield, but it is certainly manageable when proper resources are allocated to the self-certification requirements.

Although not an exhaustive list of all pre-certification logistical requirements, the following steps are required to self-certify to the Privacy Shield.

First, companies will need to assess their external and internal privacy policies, as well as their EU personal data collection, processing, storage and transfer procedures. Each policy and procedure will need to comply with the 7 Privacy Shield Principles and, as applicable, the 16 Supplemental Privacy Shield Principles. A summary of these principles is available from the US Department of Commerce.

Second, once this assessment/audit is complete, companies will likely need to update their privacy policies and procedures and their contracts with business partners. Companies that self-certify to the Privacy Shield by September 30, 2016, will be given a 9-month grace period to update their contracts with their business partners.

Third, the Privacy Shield requires companies to implement specific complaint and dispute policies and procedures, which include replying promptly to all complaints, identifying a point-of-contact person or officer for complaints, and providing an independent recourse resolution mechanism to EU consumers.

Fourth, companies are required to notify the public that they are self-certifying to the Privacy Shield. This includes publishing the Privacy Shield logo and the required self-certification language on their websites, and appointing a person who is responsible for ongoing compliance.

Self-Certifying to the Privacy Shield

Once companies complete their pre-certification assessment/audit, they will be ready to certify to the Privacy Shield.

Self-certification to the Privacy Shield requires companies to submit a written application/certification to the US Department of Commerce. There is also a required fee to self-certify to the Privacy Shield. See the July 22, 2016 Federal Register notice, Cost Recovery Fee Schedule for the EU-U.S. Privacy Shield Framework.

Post Certification

After self-certifying to the Privacy Shield, companies must walk the walk. This requires a coordinated effort to comply with their Privacy Policy and maintain good standing on the Privacy Shield list of self-certifying companies.

Additionally, companies must self-certify each year with the US Department of Commerce, which means self-certifying to the Privacy Shield is a constant, ongoing process.

For guidance through the legal and regulatory compliance land mines of self-certifying, do not hesitate to contact Mark Ishman, a member of Gordon Rees’ Privacy & Data Security Practice Group.

Ransomware: Preparing for the Storm That’s A Brewin’

On July 11, 2016, the Office for Civil Rights (“OCR”) published guidelines on ransomware attack prevention and recovery, including the role HIPAA plays in helping covered entities and business associates prevent and recover from such attacks, and how HIPAA breach notification processes should be managed in response to a ransomware attack. According to the OCR report, there have been 4,000 daily ransomware attacks since early 2016, up 300% from 2015. Earlier this week, a healthcare IT security consultant told me that the chatter he hears suggests hackers are working on stronger, more aggressive, more deadly hacks to unleash, and he fears a hacking storm a brewin’. Time to get serious and batten down the hatches, folks!

The OCR report describes what a ransomware attack is, and explains that maintaining strict HIPAA Security Rule compliance can help prevent the introduction of malware, including ransomware. Some of the required security measures discussed include:

  • Implementing a security management process, which includes conducting a risk analysis and taking steps to mitigate or remediate identified threats and vulnerabilities;
  • Implementing processes to guard against and detect malicious software;
  • Training users on malicious software protection; and
  • Implementing access controls.

Ransomware gets into your system, denies you access to your data (usually through encryption), and then directs you to pay a ransom to the hacker in order to receive a decryption key. For this reason, maintaining frequent backups and ensuring the ability to recover data from backups is crucial to surviving a ransomware attack. HIPAA compliance helps protect entities because the Security Rule requires covered entities and business associates to implement a data backup plan as part of an overall contingency plan, which includes periodic testing of the plan to be sure it works.
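As one hedged illustration of what periodically testing a backup plan might look like in practice (our sketch, not part of the OCR guidance), a routine along the following lines could confirm that each backup file still matches the checksum recorded when it was created, before the backup is ever needed for recovery; the manifest format and file paths are hypothetical.

```python
import hashlib
import json
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_backups(manifest_path: Path) -> list[str]:
    """Compare each backup file against the checksum recorded when it was made.

    The manifest is assumed to be a JSON map of {"relative/path": "expected sha256"}.
    Returns a list of problems; an empty list means the backup set verified cleanly.
    """
    manifest = json.loads(manifest_path.read_text())
    problems = []
    for rel_path, expected in manifest.items():
        backup_file = manifest_path.parent / rel_path
        if not backup_file.exists():
            problems.append(f"missing: {rel_path}")
        elif sha256_of(backup_file) != expected:
            problems.append(f"checksum mismatch (possible corruption or tampering): {rel_path}")
    return problems


if __name__ == "__main__":
    issues = verify_backups(Path("/backups/manifest.json"))  # hypothetical location
    print("backup set OK" if not issues else "\n".join(issues))
```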

The presence of ransomware, or any malware, is considered a security incident and triggers the need to initiate security incident response and reporting procedures. Based upon an analysis of the investigation results, breach notification may be required. Additionally, if there is an impermissible disclosure of protected health information (“PHI”) in violation of the Privacy Rule, there is a presumed breach which may trigger notification. Whether the presence of ransomware constitutes a breach under the HIPAA Rules is thus fact-specific. However, unless the entity demonstrates there is a “…low probability that the PHI has been compromised,” a breach of PHI is presumed to have occurred and the entity must comply with the applicable breach notification provisions.

Further information and a copy of the OCR report can be found here.

For Now, Emails Stored on Foreign Servers Are Immune to Warrant Searches

On July 14, 2016, the Second Circuit Court of Appeals ruled in the potentially groundbreaking Microsoft v. United States case that the federal government cannot compel companies to turn over emails stored on servers located outside the United States. In today’s border-shrinking digital world, the Second Circuit’s ruling raises a slew of questions (that will no doubt be litigated extensively in the coming years) and more than a few concerns.

In December 2013, the United States government sought to execute a search warrant pursuant to Section 2703(a) of the Stored Communications Act (the “SCA”) to seize the contents of an email account of a suspected participant in a narcotics ring, which was stored on Microsoft’s servers in Ireland. Microsoft refused to turn over the extraterritorial emails and was held in contempt for failing to comply with the warrant.

Initially, the Southern District of New York ruled that Section 2703 of the SCA applies extraterritorially, and ordered Microsoft to release the sought-after emails. On appeal, however, the Second Circuit held that Section 2703 of the SCA “does not authorize courts to issue and enforce against U.S.‐based service providers warrants for the seizure of customer e‐mail content that is stored exclusively on foreign servers.”

In reversing the district court, even after noting the presumption against extraterritoriality, the Second Circuit relied heavily upon the fact that the SCA, passed in 1986, was drafted when computers were in their infancy, servers storing communications abroad did not exist, and very few lawmakers were familiar with the concept of the Internet. The Second Circuit also found persuasive that the SCA’s warrant provision, which allows the government to require disclosure of electronically stored communications, is, like any other search warrant and unlike a subpoena, restricted by the Fourth Amendment to domestic applications only.

In the concurrence to the Microsoft opinion, the Second Circuit acknowledges that the SCA does not protect emails and other information stored on domestic servers. In fact, the Court notes, nothing prevents private companies from transferring electronically stored communications stored on foreign servers to American-based servers with the click of a button, which would give the federal government the opportunity to execute a properly obtained search warrant lawfully.

At minimum, this case signals to Congress the urgent need to update outdated statutes like the SCA that have been rendered obsolete by decades of warp-speed technological breakthroughs and advancement. In 1986, the concepts of cloud storage, extraterritorial servers and high-speed internet were the stuff of science fiction novels. Today, such technology is used by virtually every business and by a large percentage of the world’s population. The Second Circuit has signaled to Congress that the time to weigh privacy interests against the government’s legitimate need for evidence is now.

‘The Dark Overlord’ Places Healthcare Databases on Dark Web

Once again, news reports teach us that the time to have a robust data privacy and security program in place and continually monitored was yesterday!

On June 26, DataBreaches.net reported that 655,000 patient records from three different healthcare databases were up for sale on the dark net. According to reports on DeepDotWeb, at least one of the hacked entities was using SRS EHR v.9 patient management software. DeepDotWeb also reports that the hacker communicated with the site over an encrypted Jabber conversation and included images from the largest database hack taken from the hacker’s internal network access. The seller/hacker asked the website to add a note to the breached companies: “Next time an adversary comes to you and offers you an opportunity to cover this up and make it go away for a small fee to prevent the leak, take the offer. There is a lot more to come.”

Shortly thereafter, a fourth stolen database, reportedly containing the records of 9.3 million individuals from a health insurer, went up for sale. The hacker taking credit for all of the thefts refers to himself as “The Dark Overlord.” He claims to have contacted the entities to warn them about the vulnerabilities of their systems and offered, for an undisclosed amount, to fix or reveal the problems, which the healthcare organizations declined. In other words, the hacker offered the stolen data back to its owners for an extorted ransom. When the demand was not paid, the hacker moved on to Plan B: selling the data on the dark web. The hacker offered the data from the four hacked healthcare organizations for prices ranging from $96,000 to $490,000 in bitcoin.

In the past week, two of The Dark Overlord’s targets, Athens Orthopedic Clinic in Georgia and a Missouri group of clinics owned by Dr. Scott Van Ness, have been identified. The hacker accessed the electronic medical records of both targets using the credentials of a third-party vendor. Personal information of current and former patients was breached, including names, addresses, Social Security numbers, dates of birth and telephone numbers, and in some cases diagnoses and partial medical histories. Athens Orthopedic Clinic is advising its current and past patients to place a fraud alert on their credit reports with the major credit bureaus. This notice, however, is alleged to have materialized only after the events of last weekend, when 500 patient records from Athens Orthopedic Clinic appeared on Pastebin, with a note to its CEO to “pay the [expletive omitted] up.”

Notably, according to reports on DataBreaches.net, both entities have acknowledged that the attacker likely gained access through an unnamed third-party contractor (presumably the EMR vendor). DataBreaches.net claims, however, that neither entity mentioned the ransom demands or that patient data had been dumped in public and was still up for sale on the dark net. Athens Orthopedic Clinic apparently did work to get the information removed from Pastebin, but the other group’s data was still posted as of July 16.

Several lessons, or at least questions, should be on the mind of any healthcare organization learning of these events. First, is your own data secure against such attacks, or are you vulnerable as well? How safe is your EMR system? How closely do you audit and monitor the third-party vendors you contract with? Second, what will your response be if you receive a post-breach ransom demand? Every organization should have at least a working framework for analyzing that question before it finds itself on the receiving end of such a demand.