Standing Reconsidered: Fero v. Excellus Health Plan

An interesting decision from Judge Elizabeth Wolford of the Western District of New York has revived a data breach claim against Excellus Health Plan.  The court had previously dismissed claims by plaintiffs who did not allege any actual misuse of their personal data for lack of standing.  Plaintiffs moved for reconsideration based on the Second Circuit’s subsequent decision in Whalen v. Michaels Stores, Inc., 689 F. App’x 89 (2d Cir. 2017).  In Whalen, the Second Circuit upheld the dismissal of a data breach claim for lack of standing where the plaintiff failed to allege any actual misuse of her stolen credit card information, but suggested that a claim might proceed if actual misuse of the data could be shown.  Even though the Second Circuit’s ruling in Whalen was an unpublished summary order, Judge Wolford read it to suggest that the Second Circuit would likely find standing in a data breach claim if actual data misuse could be shown.  Fero, at 14.  With their motion for reconsideration, the plaintiffs submitted evidence that their PII was available on the Dark Web.  Plaintiffs also presented an expert report concerning the hacking methods through which plaintiffs’ data was exfiltrated from Excellus, offering the following conclusion:  “it is my opinion to a reasonable degree of scientific certainty that PII and PHI maintained on the Excellus network was targeted, collected, exfiltrated, and put up for sale o[n] DarkNet by the attacker for the purpose of, among other things, allowing criminals to purchase the PII and PHI to commit identity theft.”  Fero, at 17.  Based on this evidence and its reading of the Whalen summary order, the court granted the motion for reconsideration and denied the defendant’s motion to dismiss for lack of standing.

I think Judge Wolford overread the Whalen summary order, and I’m not sure what the plaintiffs’ evidence shows in relation to standing.  I would agree with plaintiffs’ expert that the reason the PII appears on the Dark Web is that someone is trying to sell it.  This does not prove, however, that it was actually sold, or that, even if it was sold, the sale caused a legally compensable harm to the plaintiffs.  This last issue is going to become central to the standing issue in data breach cases.  Is the disclosure or sale of a person’s PII in itself a legally cognizable harm?  This would suggest an individual has a property-like right in his or her PII, or that there is some kind of compensable dignitary or emotional harm for mere disclosure of PII, which is not the typical framework for American privacy law.  Even in cases where it is clear that PII has been improperly used, such as through fraudulent credit card charges, some courts have found no standing if the issuing bank reimburses the cardholder.  As Dark Web search reports become routine forms of evidence in these cases, courts will need to grapple more directly with the question whether there is some kind of inherent harm in the disclosure or sale of PII.

[google-drive-embed url="https://drive.google.com/file/d/1zS5wFihU83YBG4VPgNEfhKj0T3BjEYAE/preview?usp=drivesdk" title="ferooriginal.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

CFAA Beacon Bill

Representatives Tom Graves (R-GA) and Kyrsten Sinema (D-AZ) have introduced a bill to amend the Computer Fraud and Abuse Act. The bill, titled the “Active Cyber Defense Certainty Act,” would allow the defensive use of “beaconing” technology (see H.R. 4036).  A “beacon” is a program that causes traffic to leave a network at regular intervals.  Beacons are frequently employed by malware to signal to the malware’s proprietor that a network has been compromised so that it can be accessed or connected to a botnet.  H.R. 4036 would allow the defensive use of malware to beacon information that could identify an attacker.  It would immunize this activity from criminal CFAA culpability, but not from civil liability.  It would require notification to the FBI National Cyber Investigative Joint Task Force before using a defensive beacon and would establish a voluntary pilot program through which the FBI would approve specific tools prior to notification.
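For readers unfamiliar with the mechanism, the core of a beacon is simple.  Here is a minimal sketch in Python of what a defensive beacon might look like; the collector host, port, and payload fields are hypothetical illustrations, not anything specified in H.R. 4036:

```python
# Minimal sketch of a "beacon": a loop that emits a small identifying
# payload to a collection server at fixed intervals.  The payload fields
# below are illustrative assumptions, not a prescribed format.
import json
import socket
import time

def build_payload(tag):
    """Package identifying information the defender wants phoned home."""
    return json.dumps({
        "tag": tag,                    # marker tied to the planted file
        "host": socket.gethostname(),  # machine the stolen data landed on
        "ts": time.time(),             # when the beacon fired
    }).encode()

def beacon(collector_host, collector_port, tag, interval=60, count=3):
    """Send `count` UDP datagrams to the collector, one per interval."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for _ in range(count):
            sock.sendto(build_payload(tag), (collector_host, collector_port))
            time.sleep(interval)
    finally:
        sock.close()
```

The same pattern, pointed at an attacker’s command-and-control server rather than a defender’s collector, is what malware uses; the bill’s novelty is legal, not technical.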

There is some good reason for relaxing the CFAA in defensive contexts, but the requirement of prior FBI authorization seems highly problematic to me.  Essentially, this amounts to the cyber-deputization of private entities, which raises privacy and oversight concerns.

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xUU5xVWZ3SFdxSU0/preview?usp=drivesdk" title="hr4036.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

 


Microsoft and the Law of the Cloud: to the Supreme Court

Last year I wrote about Microsoft’s Stored Communications Act litigation.  The dispute has now worked its way up to the Supreme Court.  Andrew Keane Woods offers a good primer on the case on the Lawfare Blog.  I generally agree with Andrew’s take:  (1) the extraterritoriality issues do not seem to raise major sovereignty concerns; and (2) it is not really a “privacy” case.  It’s also interesting, as Andrew notes, that Silicon Valley seems uncertain about how to approach this dispute.  But here’s where I might go a bit further than Andrew:  the extraterritoriality issues do not raise major sovereignty concerns unless you think the cloud is really something different.  The Supreme Court continues to make Internet-exceptionalist noises, such as Justice Kennedy’s ode to the Net in the Packingham case last year:

While we now may be coming to the realization that the Cyber Age is a revolution of historic proportions, we cannot appreciate yet its full dimensions and vast potential to alter how we think, express ourselves, and define who we want to be. The forces and directions of the Internet are so new, so protean, and so far reaching that courts must be conscious that what they say today might be obsolete tomorrow.

Packingham v. North Carolina, 137 S. Ct. 1730, 1736 (2017).  The cloud, of course, is just a marketing term for storing stuff and running apps on the Internet.  In my view, the Court should avoid rhapsodizing about the cloud or the Internet in the Stored Communications Act context, apply ordinary principles of extraterritoriality to find that Microsoft was required to produce the records in this case, and leave further tinkering with the statutory framework to Congress.

Bot Code, Norms, and Law

There’s a good post on Dark Reading by Ido Safruti about norms and etiquette for bot code.  According to Imperva’s most recent bot traffic report, bots comprise the majority of Internet traffic.  Many bots are intentionally disruptive or misleading — for example, bots that create comment link spam on blogs.  Others are useful — for example, they allow a search engine to index web pages.  Even useful bots can be disruptive, such as by using up site capacity, and the robots.txt standard has been developed so that site owners can limit or exclude bot traffic.

Safruti provides the following guidelines for ethical bot code:

1. Declare who you are;
2. Provide a method to accurately identify your bot;
3. Follow robots.txt;
4. Don’t be too aggressive.
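As a rough illustration, guidelines 1, 3, and 4 can be implemented in a few lines of Python using only the standard library.  The “ExampleBot” name and URLs are hypothetical placeholders; guideline 2 (verifiable identification, e.g., via reverse DNS) is outside what a code snippet alone can show:

```python
# Sketch of a "polite" bot: declares itself with a descriptive
# User-Agent (guideline 1), checks robots.txt before fetching
# (guideline 3), and rate-limits its requests (guideline 4).
import time
import urllib.parse
import urllib.request
import urllib.robotparser

USER_AGENT = "ExampleBot/1.0 (+https://example.com/bot-info)"  # guideline 1

def polite_fetch(url, delay=1.0):
    """Fetch `url` only if robots.txt permits it, after a courtesy delay."""
    parts = urllib.parse.urlsplit(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    rp = urllib.robotparser.RobotFileParser(robots_url)
    rp.read()
    if not rp.can_fetch(USER_AGENT, url):  # guideline 3: honor robots.txt
        return None
    time.sleep(delay)                      # guideline 4: don't hammer the site
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Nothing enforces any of this, of course, which is precisely why the legal-norms question below arises.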

These are sound guidelines, but my lawyer Spidey sense wonders how they might translate into legal norms, or whether they should become legal norms.  The most immediate way in which guidelines like this can become part of legal norms is through contractual terms of use.  I’m not sure terms of use would be enforceable either as a legal or practical matter against unwanted bots, not least because the measure of contractual damages would be unclear.  There’s an interesting 2001 case in the First Circuit finding a Computer Fraud and Abuse Act violation for bot use, but the facts are quirky and it seems to me perhaps wrongly decided.  Perhaps guidelines like Safruti’s provide a standard of care for a tort claim if an unwanted bot causes a business interruption, though in states where the economic loss doctrine applies this would produce a difficult question about whether slowing a website is a kind of compensable property damage.  Guidelines like this could also be incorporated into a regulatory regime, which the Internet community as a whole might not find palatable.

 

Cybersecurity and Social Media Use by Sex Offenders: Packingham v. North Carolina

This week the U.S. Supreme Court decided Packingham v. North Carolina, a First Amendment challenge to a state statute that prohibited convicted sex offenders from accessing certain “commercial social networking” sites.  I include cases like this that involve the protection of minors, harassment, stalking, and the like under the rubric of “cybersecurity” because these issues of personal online safety relate to the stability and security of the “place” we call “cyberspace.”  In fact, it’s in these kinds of cases that the courts often grapple with the “place-ness” of cyberspace.  That grappling is central to the majority and concurring opinions in Packingham.

Many states have statutes that prohibit or limit registered sex offenders from accessing Internet content, including social media.  North Carolina’s statute defined “social networking Website” broadly.  It arguably would have covered not only sites such as Facebook and Twitter, but also shopping, news, health, career, or other sites with comment boards.

Packingham was convicted in 2002 of sexual contact with a minor.  As a result, he was required to register as a sex offender, and was barred from accessing “social networking Websites” under the North Carolina statute.  In 2010, Packingham received a traffic ticket, which subsequently was dismissed.  He posted a religiously-themed message on his Facebook page celebrating the dismissal.  As a result of this posting, he was convicted of violating the social media statute.

A unanimous 8-Justice Supreme Court (Justice Gorsuch did not participate in the case) struck down North Carolina’s statute as unconstitutional under the First Amendment.  Justice Kennedy wrote the Court’s opinion, but Justice Alito wrote a concurrence, joined by Chief Justice Roberts and Justice Thomas, disagreeing with some of Justice Kennedy’s reasoning.  Both opinions are notable for their Internet exceptionalism, but Justice Kennedy’s opinion seems like an exceptional kind of exceptionalism.

Both the majority and concurring opinions applied the same legal doctrine based on the assumption that the North Carolina statute is “content neutral.”  Government regulation that restricts the time, place or manner of speech, but not the content of speech, generally is subject to “intermediate scrutiny” by the courts.  Under intermediate scrutiny, the regulation must “be narrowly tailored to serve a significant governmental interest,” which means the regulation “must not burden substantially more speech than is necessary to further the government’s legitimate interests.”  Opinion at 6 (internal quotations omitted).  All of the Justices agreed that the protection of children online is a significant government interest but that the North Carolina statute burdened substantially more speech than necessary to further online child protection.  Justices Kennedy and Alito disagreed somewhat, however, on how to frame the question of how much speech was burdened.

Justice Kennedy suggested that “[a] fundamental principle of the First Amendment is that all persons have access to places where they can speak and listen, and then, after reflection, speak and listen once more.” Id. at 4.  Justice Kennedy makes clear that he views cyberspace as such a “place”:

While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear.  It is cyberspace — the “vast democratic forums of the Internet” in general . . . and social media in particular.

Id. at 5.  Not only is cyberspace one of the most important “places” of civil discourse today, according to Justice Kennedy

While we now may be coming to the realization that the Cyber Age is a revolution of historic proportions, we cannot appreciate yet its full dimensions and vast potential to alter how we think, express ourselves, and define who we want to be.  The forces and directions of the Internet are so new, so protean, and so far reaching that courts must be conscious that what they say today might be obsolete tomorrow.

Id. at 6.

In his concurrence, Justice Alito complained that “[t]he Court is unable to resist musings that seem to equate the entirety of the internet with public streets and parks.”  Alito Concurring Opinion, at 1.  For Justice Alito, it was clear that the North Carolina statute would prohibit access to many websites that presented little or no risk of child exploitation.  However, he argued that “if the entirety of the internet or even just ‘social media’ sites are the 21st century equivalent of public streets and parks, then States may have little ability to restrict the sites that may be visited by even the most dangerous sex offenders.”  Id. at 10.

It seems, then, that Justice Alito took a more cautious, less exceptionalist line than Justice Kennedy.  At the conclusion of his concurrence, however, Justice Alito agreed that “[c]yberspace is different from the physical world. . . .”  Id. at 11.  For Justice Alito, this difference warrants careful evaluation of individual cases, “one step at a time.”  Id. at 11.

My own sense of the judicial role, combined with the dynamic nature of the Internet, leads me to agree more with Justice Alito than Justice Kennedy.  There is something different about cyberspace, and this difference does make the nexus between liberty and security — not least as that nexus involves the freedoms of speech and association — exceedingly difficult.  But these difficulties are not anything new for courts.  In cases that implicate technological change, a court’s job is to understand how the particular technology at issue in a particular case or controversy relates to the legal doctrine applicable to that particular case or controversy.  Hand-waving over the word “cyberspace” is no excuse for sloppy judging.

Perhaps equally importantly, because the balance between liberty and security in cyberspace is difficult, courts should be careful about usurping legislative judgment.  Critics of sex offender social media bans often point to social science research that suggests restricting social media use has no effect on recidivism or child safety and that “sex offenders” cannot be treated as a homogeneous group.  The primary risk to children, according to some of this literature, is from adult men who are not pathologically pedophiles but who groom adolescent girls out of a sense of power or danger.  I wonder if such studies are too focused on recidivism rather than on the possible deterrent effect for potential first-time offenders.  Even more significantly, I also think such studies can overlook the harm caused to children in the production of child pornography and the role of social networking sites and technologies in facilitating child pornography collection exchanges.  Indeed, even the authors of such research have noted that “[t]he development of new technologies and social media often outpaces the study of its use in the commission of crimes, which poses a unique challenge for further study.”  Chan, McNeil, and Binder, Sex Offenders in the Digital Age, The Journal of the American Academy of Psychiatry and the Law 44:3 (2016).  Legislatures might be even better positioned than courts to evaluate and adjust to social science and other research that can inform policy.

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xdUpubGNxaWxQTkU/preview?usp=drivesdk" title="packingham.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

Slides on Cybersecurity and Legal Ethics

I’m also speaking later with Brett Harris on cybersecurity and legal ethics.  Here are our slides.

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xbFVLaTVHNDYtZHM/preview?usp=drivesdk" title="Final Cyber 2017 Presentation.ppt" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/vnd.ms-powerpoint" width="100%" height="400" style="embed"]

Tabletop for NJSBA Second Annual Cybersecurity Conference

Here is a tabletop exercise I drafted that we’ll be running at the Second Annual NJSBA Cybersecurity Conference.

Acme Corp. manufactures and sells industrial control systems (ICS).  ICS devices integrate computer chips, hardware and software and can be programmed to monitor, regulate and control various components of commercial manufacturing, assembly and packaging plants.  For example, the following video shows an Acme ICS serving as the controller for a water bottling plant:

Acme’s ICS devices are network enabled and come bundled with a software suite that allows users to monitor and control the devices through a web interface.

Acme also provides installation and maintenance services for its ICS equipment.  Each ICS device must be configured for the systems it will control, which involves the creation of custom computer code.  The computer code, and sometimes the hardware, must periodically be updated if the underlying system configuration changes or if Acme develops performance enhancements, bug fixes, or security patches.  In a larger installation, Acme’s fees for installation and maintenance can exceed the costs of the initial hardware purchase, and the total contract price can exceed ten million dollars.

Acme maintains detailed information about each of its installations, including specific configuration information, networking details, and backup copies of computer code.  This information is stored in numerous documents in a variety of formats, including, for example, Word documents, Excel spreadsheets, Powerpoints, e-mails, and plain text files, on systems used by various Acme business units.  Files may reside on individual computer hard drives, internal company file servers, portable media (such as thumb drives), company-owned and personal laptops, smartphones and tablets, and commercial cloud-based storage such as Google Drive and Dropbox.

ISSUE 1:  A number of management-level Acme employees recently received emails purporting to have been sent by Sol Fish, Vice President for Client Relations at Acme.  The emails instruct the recipients to log into a newly-established sales database through a hyperlink in the email using their existing Acme network log-in credentials.  Fish did not send these emails, however, nor has Acme created any new sales database.  Meanwhile, Fish has received an email from Carl Kent, a business reporter for the Broad Street Journal, inquiring about the fact that the full technical specifications for an ICS installation at Port Newark were posted this morning on a number of business and government blogs.  In fact, Acme won a contract to improve the automation of shipping cranes and other devices at the Port.  The contract was controversial because of unsubstantiated allegations of bid rigging, cost overruns, and other political complaints.  The full technical specifications are confidential for security reasons, among others.  An obvious inference is that the spearphishing attack may have allowed someone to obtain and post the confidential specifications.

ISSUE 2:  In addition, Fish has received an angry call from Bill Brazos, the CEO of Consolidated Fulfillment Centers, Inc.  Consolidated owns and operates large warehouse and fulfillment centers for major online retail companies.  Brazos claims that an Acme ICS system installed at a Consolidated facility in Edison, NJ contained a vulnerability that allowed hackers to obtain information concerning consumers to whom products were being distributed through the Consolidated facility.  Brazos says “millions” of customer accounts may have been compromised.

Implementing ABA Formal Opinion 477

Background

On May 4, 2017, the ABA released Formal Ethics Opinion 477, “Securing Communication of Protected Client Information” (attached at the end of this post).  This Opinion updates Formal Ethics Opinion 99-413, issued in 1999, which concluded that lawyers could use unencrypted email to communicate with clients.  Those of us who were practicing in 1999 will remember the difficulty the then-still-new phenomenon of ubiquitous email communication created for lawyers’ obligations of confidentiality.  The ABA has revisited the question because of new concerns about cybersecurity and client confidentiality.

Opinion 477 does not mandate any specific cybersecurity measures, but instead requires “reasonable efforts” to ensure client confidentiality when using any form of electronic communication, including text messaging, cloud-based document sharing, or other services, in addition to email.  The “reasonable efforts” requirement is consistent with Model Rule 1.6(c) concerning inadvertent disclosure of client information.  The Opinion adopts the factors set forth in Comment [18] to Model Rule 1.6 as guidelines for “reasonable efforts”:

(1) The sensitivity of the information;
(2) The likelihood of disclosure if additional safeguards are not employed;
(3) The cost of employing additional safeguards;
(4) The difficulty of implementing the safeguards; and
(5) The extent to which the safeguards adversely affect the lawyer’s ability to represent clients (e.g., by making a device or important piece of software excessively difficult to use).

Opinion 477, lines 108-114.

For “routine communication with clients,” Opinion 477 reaffirms the conclusion of Opinion 99-413 that unencrypted email is generally acceptable, “presuming the lawyer has implemented basic and reasonably available methods of common security measures.”  Id., lines 130-136.  However, Opinion 477 requires a more extensive, fact-based risk assessment for other kinds of communications.

Steps for Compliance

Basic Technical Measures

There are some technical measures every lawyer should take to secure electronic communications with clients and that Opinion 477 seems to assume are normally reasonable.  These generally are technologies and policies that every law firm already should be using:

      • Sound password and access policies;
      • Appropriately configured firewalls;
      • Use of a VPN for communications outside a secure office network;
      • Encryption of data at rest, at least for sensitive client information;
      • Secure file sharing portals, at least for sensitive documents;
      • Appropriate BYOD policies.

Education and Training

As with any good cybersecurity compliance program, Opinion 477 suggests that lawyers and their support staff must obtain some training in cyber hygiene.  Id., lines 149-200.  This does not mean lawyers need to obtain expert cybersecurity certification credentials, but it does mean every lawyer must obtain at least a general understanding of how computers and computer networks function, of common types of cybersecurity threats and how to mitigate them, and of the proper use and implementation of the kinds of technologies and policies mentioned above.  A firm should be able to document the content and frequency of such training for its personnel.

Inventories and Audits

A key part of a strong cybersecurity program that is often overlooked is to inventory computer networks and systems and to audit compliance policies.  A firm should know:

      • Its network configuration;
      • Exactly which devices are connecting to the network;
      • Open ports on the network;
      • The volume of traffic flowing over the network.
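There is nothing exotic about the underlying checks.  For instance, a first pass at the “open ports” item can be sketched in a few lines of Python; this is a toy probe for illustration, not a substitute for a real inventory or monitoring tool, and it should only be run against hosts the firm owns:

```python
# Toy sketch of a port inventory: probe which TCP ports on a host
# accept a connection.  Real tools (and competent vendors) go much
# further, but this is the kind of data a firm should be reviewing.
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means connected
                found.append(port)
    return found
```

Running a probe like this periodically, and comparing the results against the firm’s documented baseline, is exactly the kind of red-flag monitoring described above.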

A number of software tools are available to help automate this inventory and monitoring process and to raise red flags if unusual patterns occur.  If the firm is relying on an outside vendor for network support, the vendor should be able to provide this information.

In addition, a firm should maintain centralized cybersecurity compliance and breach response policies, which should regularly be reviewed by attorneys and staff.  A law firm’s cybersecurity compliance should include tiered security measures based on specific types of client information regularly handled or with the potential to be handled in the course of the firm’s practice.

Due Diligence on Vendors

The Opinion also requires attorneys to conduct due diligence on vendors that provide communications technology.  The auditing checklists here likely are more extensive than the current practices of many law firms.  See Opinion 477, lines 267-312.  Attorneys should remember that these requirements relate to their ISPs, web hosting companies, cloud storage providers, email providers, outside experts who handle electronic client information, e-discovery providers, and other vendors.  It can be helpful to develop standardized checklists and questionnaires for gathering this information.

Conclusion

ABA Opinion 477 makes clear that law firms must follow up-to-date, comprehensive cybersecurity compliance practices.  While many firms likely already use some basic security technologies, Opinion 477 makes cybersecurity a high priority for competency in the practice of law.

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xRjJyRlZXSnZwem8/preview?usp=drivesdk" title="ABAFormalOpinion477.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

Fourth Circuit Revives Wikimedia NSA Case

Yesterday the Fourth Circuit reinstated a case brought by the Wikimedia Foundation concerning the National Security Agency’s bulk “Upstream” surveillance program.  Under the Upstream program, the NSA collects traffic on the U.S. Internet backbone.  The Government claims that this collection is targeted to specific queries relating to terror investigations and other intelligence matters.  As a result, the Government claimed, it is unlikely that any communications involving Wikimedia were reviewed by the NSA as part of the Upstream program, and therefore Wikimedia lacks standing to assert its claims.  The district court, relying on Clapper v. Amnesty International USA, 133 S. Ct. 1138 (2013), agreed and granted the Government’s motion to dismiss on the pleadings.  The Fourth Circuit reversed.

Wikimedia alleged that, because of the way packets travel over the network, the NSA necessarily must collect substantially all the international text-based communications traveling over high-capacity cables, switches and routers in the U.S.  The Government argued that this was a speculative assertion that should not be taken at face value even at the pleading stage.  However, Wikimedia also alleged that, given the enormous number of Internet communications involving Wikimedia each year — a number Wikimedia put at over one trillion — it is nearly certain that the NSA has collected and reviewed communications involving Wikimedia even if the NSA’s data collection were limited to one trunk line.  As the Complaint put it, “even if one assumes a 0.00000001% chance . . . of the NSA copying and reviewing any particular communication, the odds of the government copying and reviewing at least one of the Plaintiffs’ communications in a given one-year period would be greater than 99.9999999999%.”  Complaint, 46-47.
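The Complaint’s arithmetic is easy to verify, if one accepts its inputs (a 0.00000001% per-communication chance, one trillion communications per year) and its implicit assumption that each communication is an independent draw:

```python
# Back-of-envelope check of the Complaint's figure: with per-communication
# probability p and n independent communications, the probability that at
# least one is copied is 1 - (1 - p)**n.
import math

p_single = 1e-10          # 0.00000001% expressed as a probability
n = 1_000_000_000_000     # one trillion communications per year

# Compute (1 - p)**n via logs to avoid floating-point trouble:
p_none = math.exp(n * math.log1p(-p_single))  # P(no communication copied)
p_at_least_one = 1 - p_none                   # well above 99.9999999999%
```

Since n * p_single = 100, the chance that nothing was copied is roughly e to the minus 100, which is vanishingly small.  The weak point of the argument is not the arithmetic but the independence assumption and the per-communication probability itself, which is asserted rather than derived.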

The Government disputed these statistical assertions as well, but the Fourth Circuit found them plausible enough that the case should proceed.  The Fourth Circuit noted that “[w]e would never confuse the plausibility of this conclusion with that accorded to Newton’s laws of motion,” but noted that the standard is merely reasonable plausibility.  Opinion, at 26.  The Fourth Circuit did, however, uphold the dismissal of what it termed the “Dragnet” allegations because the Complaint did not contain specific enough factual assertions about the actual scope of the NSA’s surveillance activity.

The Fourth Circuit makes some interesting interpretive moves in this Opinion relating to how Clapper should apply in cases involving bulk surveillance claims and large Internet entities.  Wikimedia’s “statistical” argument seems dubious, and it seems that under the Fourth Circuit’s analysis any entity with a large Internet presence would have standing to challenge a surveillance program.  Perhaps that is a good policy result, but it does not seem consistent with Clapper.

The Fourth Circuit’s Opinion is below:

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xWm81WlFScWlDY3c/preview?usp=drivesdk" title="Wikimedia-ca4-20170523.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

Facebook and Terrorism: Cohen v. Facebook and Force v. Facebook

It’s well-known that Facebook, Twitter, YouTube, and other social media platforms are used for propaganda and recruiting purposes by terrorist groups such as ISIL.  A number of Jewish groups filed lawsuits alleging that Facebook should be held civilly liable for facilitating terrorist attacks against Jews.  Two of these cases recently were dismissed by Judge Nicholas Garaufis in the U.S. District Court for the Eastern District of New York.  A copy of Judge Garaufis’ Memorandum and Order is available below.

In Cohen v. Facebook, the plaintiffs asserted negligence and civil conspiracy theories under Israeli and U.S. law.  That case was removed to federal court by Facebook.  In Force v. Facebook, the plaintiffs asserted claims under the federal “Providing Material Support to Terrorists” statute, 18 U.S.C. § 2339A, and the civil remedies provision for terrorist acts, 18 U.S.C. § 2333, as well as for negligence and other breaches of duty under Israeli law.  Copies of the Cohen and Force Complaints are available below.

Judge Garaufis dismissed the Cohen case for lack of standing because the individual plaintiffs asserted only a threat or fear of possible future harm.  He also dismissed the Force case under the immunity provision of section 230 of the Communications Decency Act, 47 U.S.C. § 230(c)(1).  This provision states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”  Id.  

The Second Circuit has established a three-part test for determining whether section 230 immunity applies:  the law “shields conduct if the defendant (1) is a provider or user of an interactive computer service, (2) the claim is based on information provided by another information content provider and (3) the claim would treat [the defendant] as the publisher or speaker of that information.”  FTC v. LeadClick Media, LLC, 838 F.3d 158, 173 (2d Cir. 2016).

The primary issue in these cases was whether the third element would be satisfied.  Here, the focus is on whether the provider exercises “a publisher’s traditional editorial functions — such as deciding whether to publish, withdraw, postpone, or alter content.”  Id. at 174.  The plaintiffs in the Force case argued that Facebook was not acting as a publisher but rather was providing content-neutral services in support of terrorist activities by Hamas.  The court rejected this argument and found the section 230 immunity applies to Facebook. Memorandum and Order, at 17-23.

The plaintiffs in the Force case also raised a creative argument:   section 230 should not apply because the terrorist acts occurred in Israel and there is a presumption against extraterritoriality.  Judge Garaufis also rejected this argument and held that the focus of section 230 is to limit civil liability of internet service providers and that the relevant events relating to such liability involve the location of the speaker.  Since Facebook is a U.S. corporation, Judge Garaufis held that section 230 did not require extraterritorial application in this case even though the terrorist acts happened in Israel. Memorandum and Order, at 23-27.

Judge Garaufis’ interpretation of section 230, including the question of extraterritoriality raised by this case, seems correct.  Section 230, however, was a legislative solution to Internet publisher liability in a simpler age, before the explosion of social media platforms and their cooptation by terrorists.  There may be good policy arguments today for imposing some legal duties on social media sites to screen for materials that incite violence and terrorism.

 

Cohen and Force Opinion

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xeHRxOEZZdkFJYXc/preview?usp=drivesdk" title="Cohen v. Facebook.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

 

Cohen Complaint

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xWkVUb21jM2dUaDQ/preview?usp=drivesdk" title="Cohen v. Facebookcomplaint.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

 

Force Complaint

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xN28zMFhXa3piUGM/preview?usp=drivesdk" title="force v facebook.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

WannaCry Ransomware and Legal Fault

The WannaCry ransomware attack has spread throughout the world over the past week.  Fingers are pointing at Microsoft for the vulnerability in earlier versions of Windows, at the NSA for creating the leaked exploit, and at North Korea for allegedly perpetrating the attack.  There is blame to go around, but if we were to assess comparative fault, the victims are also substantially to blame, for at least two reasons, one obvious and one less obvious:

First, the obvious reason:  the attack affected older versions of Windows, including Windows XP, which has not been supported by Microsoft since 2014.  However frustrating Microsoft’s update and support cycle might seem, and whatever transaction and opportunity costs are involved in switching an organization to a newer OS, it is negligent to continue using an outdated, unpatchable OS.

Second, the less obvious reason:  the attack spread through Port 445, the networking port Windows uses for SMB file and printer sharing; the exploit targeted the legacy SMBv1 protocol still enabled on those older versions of Windows.  A basic component of any cybersecurity compliance program — in addition to using updated, patched software — is to conduct regular port audit scans and to configure firewalls to block unnecessary ports.  Given the low cost of this kind of precaution, failure to conduct port audits is almost certainly negligence.
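To make the point concrete, here is a minimal sketch of the kind of check a basic port audit performs.  This is an illustration only, not a substitute for a dedicated scanner such as nmap; the host address and port list are placeholders, and a real audit would cover an organization's full address space and port range.

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Spot-check a few commonly abused Windows ports, including SMB's 445.
for port in (135, 139, 445, 3389):
    status = "OPEN" if port_open("127.0.0.1", port) else "closed/filtered"
    print(f"port {port}: {status}")
```

Any port reported open that has no documented business justification should be closed at the firewall.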

 

Trump Cybersecurity Executive Order

President Trump Signing an Earlier Executive Order (Img Src = ZDNet)

President Trump today signed a long-awaited Executive Order on Cybersecurity.  I think it is mostly a non-event.  There are some helpful provisions, including a requirement that government agencies implement the NIST Framework.  Otherwise, it requires a series of executive reports on cybersecurity preparedness, generally due within 90 days of the Order.  As others have noted, those reports are likely to show that government cybersecurity is significantly lacking because of outdated infrastructure.  The real test will come when changes must be implemented and government cyber infrastructure moves toward a more centralized, cloud-based model.

The text of the Order is below.

[google-drive-embed url="https://drive.google.com/file/d/0BzS0leqU862xVGNIR3ZWdGI1eDQ/preview?usp=drivesdk" title="Trump-cybersecurity-executive-order.pdf" icon="https://drive-thirdparty.googleusercontent.com/16/type/application/pdf" width="100%" height="400" style="embed"]

DTSA Statistics

Introduction

Trade secrets are important to cybersecurity because many data breaches involve trade secret theft.  The Defend Trade Secrets Act of 2016 (DTSA) amended the Economic Espionage Act of 1996 to provide a federal private right of action for trade secret misappropriation.  Some commentators opposed the DTSA in part because it seems redundant in light of state trade secret law and could lead to unnecessary litigation and restrictions on innovation.  Now that the DTSA has been in effect for nearly a year, I conducted an empirical study of cases asserting DTSA claims (with the able help of my research assistant, Zach Hansen).  This post summarizes the results of that study.

Methodology

We ran keyword searches in the Bloomberg Law federal docket database to identify cases asserting DTSA claims in federal courts.  It is not possible to search only on the Civil Cover Sheet because there is no discrete code for DTSA claims.  Our search ran from the effective date of the DTSA (May 26, 2016) through April 21, 2017 (just prior to our symposium on the DTSA at Seton Hall Law School).  After de-duping, we identified 280 unique Complaints, which we coded for a variety of descriptive information.  Our raw data is available online.
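Tallies like the ones reported below can be reproduced from the coded data with a few lines of standard Python.  The column names and sample rows here are hypothetical, since the layout of our published dataset may differ; the sketch simply shows the counting approach.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical extract of the coded complaint data (illustrative only).
raw = """district,filed
N.D. Ill.,2016-06-01
N.D. Cal.,2016-06-15
S.D.N.Y.,2016-07-02
N.D. Ill.,2017-03-10
D. Mass.,2017-03-21
"""

# Count complaints per district, most active districts first.
by_district = Counter(row["district"] for row in csv.DictReader(StringIO(raw)))
for district, n in by_district.most_common():
    print(district, n)
```

The same `Counter` pattern, keyed on the month of the filing date or on the case-status field, produces the by-month and case-status breakdowns.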

Findings

This chart shows the number of filings by district:

We were not surprised to see that the Northern and Central Districts of California, the Southern District of New York, and the District of Massachusetts were among the top five.  We were surprised, however, to see the Northern District of Illinois tied for first.  This could reflect the influence of the financial services industry in Chicago, but further research is required.

The next chart shows the number of filings by month:

It is interesting to note the decline in filings following the initial uptick after the May 26, 2016 effective date.  Perhaps this reflects a slight lull during the summer months.  Filings then remained relatively steady until March 2017, when they increased significantly.  This could have something to do with the quarterly business cycle or bonus season, since many of the cases (as discussed below) involve employment issues.  Or it could simply reflect random variation in a relatively small sample.

We next examined other claims filed along with the DTSA counts in these Complaints:

We excluded from this chart related state law trade secret claims.  Not surprisingly, nearly all the cases included claims for breach of contract.  As noted above, trade secret claims often arise in the employment context in connection with allegations of breach of a confidentiality agreement or covenant not to compete.  Another finding of note was that a fair number of cases assert Computer Fraud and Abuse Act (CFAA) claims, although the number is not as high as we expected.  Most trade secret cases today involve exfiltration of electronic information, but perhaps many do not involve hacking or other access techniques that could run afoul of the CFAA.

We also noted a smaller but not insignificant number of cases asserting other intellectual property claims, including trademark, copyright and patent infringement.  Since many documents taken in alleged trade secret thefts are protected by other forms of intellectual property — particularly copyright — this may show that some lawyers are catching on to the benefit of asserting such claims along with DTSA claims.

Finally, our review of case status revealed the following:

  • 198 cases in various pre-trial stages
  • 61 cases dismissed
  • 5 preliminary injunctions
  • 4 final judgments, including 2 permanent injunctions
  • 3 default judgments
  • 1 case sent to compulsory arbitration
  • 8 undetermined / miscellaneous

At first blush, the number of cases dismissed seems high, given that none of the cases have been pending for more than a year.  We assume the vast majority of these cases settled, though further investigation is required.  In contrast, the number of preliminary injunctions granted seems very low.  Again, further investigation is required, but so far it does not seem that the DTSA is resulting in the kind of preliminary injunction practice we expected to see under a federal trade secret statute.

Published
Categorized as DTSA

Why Education and Training Matter to Cybersecurity Compliance

Cybersecurity is an overwhelming problem – so overwhelming that it seems impossible to address.  From the legal and compliance perspective, the problem is compounded by a lack of clear regulatory rules or judicial precedent about what kinds of measures might be sufficient to mitigate the risk of liability for a data breach or other cybersecurity incident.  One important step every business can take, however, is to implement a cybersecurity compliance training program.

Training as a Component of Legal Compliance

The “gold standard” for managing cybersecurity risk is the NIST Cybersecurity Framework.  The NIST Framework identifies four “tiers” of cybersecurity compliance, with Tier 1 representing the lowest degree of compliance and Tier 4 the highest.  A principal driver of how an organization can move up from Tier 1 through Tier 4 is organizational knowledge.  In Tier 1, according to the Framework,

There is limited awareness of cybersecurity risk at the organizational level and an organization-wide approach to managing cybersecurity risk has not been established. The organization implements cybersecurity risk management on an irregular, case-by-case basis due to varied experience or information gained from outside sources. The organization may not have processes that enable cybersecurity information to be shared within the organization.

Id. at 10.  In contrast, at Tier 4,

There is an organization-wide approach to managing cybersecurity risk that uses risk-informed policies, processes, and procedures to address potential cybersecurity events. Cybersecurity risk management is part of the organizational culture and evolves from an awareness of previous activities, information shared by other sources, and continuous awareness of activities on their systems and networks.

Id. at 11.  In order to move up through the Tiers, an organization must ensure that “[a]pplicable information from organizational privacy policies is included in cybersecurity workforce training and awareness activities.”  Id. at 16.

The FTC also emphasizes the importance of cybersecurity training.  In the Opinion of the Commission in In the Matter of LabMD, Inc., FTC Docket No. 9357 (July 29, 2016), the Commission found that

LabMD did not employ basic risk management techniques or safeguards such as automated intrusion detection systems, file integrity monitoring software, or penetration testing. It also failed to monitor traffic coming across its firewalls. In addition, LabMD failed to provide its employees with data security training. And it failed to adequately limit or monitor employees’ access to patients’ sensitive information or restrict employee downloads to safeguard the network.

Id. at 11-12 (emphasis in original).  Concerning training, the FTC noted, “[e]ven where basic hardware and software data security mechanisms are in place, there is an increased likelihood of exposing consumers’ personal information if employees are not adequately trained.”  Id. at 14.  The Eleventh Circuit recently stayed the FTC’s Order in LabMD over concerns about the Commission’s statutory authority over general cybersecurity issues.  See LabMD v. Federal Trade Commission, No. 16-16270-D, Slip Op. (11th Cir. Nov. 11, 2016).  Meanwhile, the FTC continues to pursue cybersecurity enforcement actions aggressively.  However the Eleventh Circuit litigation turns out, the FTC’s emphasis on cybersecurity training will continue to inform standards of legal liability, both before the FTC and other authorities.

The emphasis on training is also evident in the recently proposed New York State Department of Financial Services Cybersecurity Regulations.  See New York State Department of Financial Services Proposed 23 NYCRR 500 (Dec. 28, 2016).  The NYDFS regulations created national headlines because they will cover a wide array of entities, including most of the U.S. and multinational banking sector, with any connection to financial service business in New York.  The regulations state that every covered organization must “provide cybersecurity personnel with cybersecurity updates and training sufficient to address relevant cybersecurity risks” and “verify that key cybersecurity personnel take steps to maintain current knowledge of changing cybersecurity threats and countermeasures.”  Id. § 500.10.  The proposed NYDFS Regulation further states that all covered entities must “provide for regular cybersecurity awareness training for all personnel that is updated to reflect risks identified by the Covered Entity in its Risk Assessment.”  Id. § 500.14.

The NIST, FTC, and NYDFS sources cited above are only a few recent prominent examples of why cybersecurity training is important.  The importance of adequate cybersecurity training will continue to resonate through statutory, regulatory and case law developments concerning cybersecurity liability for many years to come.

Training About What, For Whom, By Whom?

This discussion of “training” raises three obvious questions: what content the training should include, who should receive the training, and who should perform it.  There is no one-size-fits-all answer.  Obviously, Information Technology and Security professionals will need highly specialized technical training, which may come in the form of advanced degrees or industry certifications in the details of network configuration, digital forensics and the like.  But perhaps less obviously, all members of the organization, from the C-Suite to operations to sales, should receive cybersecurity training appropriate to their functions.

General cybersecurity training should cover concepts such as organizational risks from cyber threats, basic principles of cyber risk measurement, common types of cyber attacks, good cyber hygiene, procedures for reporting cybersecurity incidents, and awareness of the organization’s legal and regulatory environment relating to cybersecurity risks.  The LabMD case supplies one cautionary tale about how training could have helped:  the breach in that case resulted from LabMD’s billing manager using Peer-to-Peer software to download music while at work, and the resulting costs of the FTC action helped bankrupt the company.  See LabMD v. Federal Trade Commission, No. 16-16270-D,  Slip Op., (11th Cir. Nov. 11, 2016) (noting that “[t]he costs of complying with the FTC’s Order would cause LabMD irreparable harm in light of its current financial situation.”).   Perhaps if that billing manager had known about the enormous vulnerabilities presented by P2P software she would not have used it at work and the company would still be in business.  Another good example relates to a common kind of “social engineering” attack.  Cyber criminals sometimes leave USB memory sticks containing malware in open areas such as parking lots and reception areas.  Employees who find these “lost” memory sticks are often compelled by curiosity to plug them in – after all, perhaps they contain racy photos from the boss’s party last weekend, or secret documents worth millions! – but once plugged in they unleash havoc on the company network.  A good training program will highlight this kind of risk and will connect the risk to a compliance program that provides clear procedures for the handling and disposal of orphaned USB sticks.

The final question is who should perform the training.  The first requirement, of course, is that the trainers are thoroughly knowledgeable about cybersecurity risks, compliance procedures, and the organization’s legal and regulatory environment.  Technical professionals need technical training, but for most people in an organization, the training required is more policy oriented.  This means that not only IT, but also the organization’s risk management, human resources, and legal functions should become involved in crafting and delivering the training.  Since cybersecurity training should be connected to an organization’s comprehensive cybersecurity policy, and since a proper cybersecurity policy should flow from the Board of Directors, inside and/or outside counsel should play a key role in this process.  Legal counsel can ensure that the organization’s cybersecurity program is consistent with the organization’s legal and regulatory environment, and can also, if appropriate, seek to protect elements of the program within the attorney-client and work product privileges in the event of an investigation or dispute.

Conclusion

Cybersecurity risks cannot be ignored.  This is true not only as a practical matter, but also as a legal and compliance issue.  The need for cybersecurity training at all levels of an organization is embedded in the emerging regulatory consensus about what is required to satisfy an organization’s basic legal obligations.  Legal counsel can play an important role in helping shape and deliver an organization’s cybersecurity training program.

Presentation on Cybersecurity and the Economic Loss Doctrine

Here are the slides for my presentation on cybersecurity and the economic loss doctrine at the NJICLE 2016 Cybersecurity Conference.

[google-drive-embed url=”https://drive.google.com/file/d/0BzS0leqU862xTlpKWTJ2OGNzQVE/preview?usp=drivesdk” title=”eclossdatabreachicle.pptx” icon=”https://ssl.gstatic.com/docs/doclist/images/icon_10_powerpoint_list.png” width=”100%” height=”400″ style=”embed”]