CFAA Beacon Bill

Representatives Tom Graves (R-GA) and Kyrsten Sinema (D-AZ) have introduced a bill to amend the Computer Fraud and Abuse Act.  The bill, titled the “Active Cyber Defense Certainty Act,” would allow the defensive use of “beaconing” technology (see H.R. 4036).  A “beacon” is a program that causes traffic to leave a network at regular intervals.  Beacons are frequently employed by malware to signal to the malware’s operator that a network has been compromised so that it can be accessed or connected to a botnet.  H.R. 4036 would allow the defensive use of malware to beacon information that could identify an attacker.  It would immunize this activity from criminal CFAA culpability, but not from civil liability.  It would require notification to the FBI National Cyber Investigative Joint Task Force before using a defensive beacon and would establish a voluntary pilot program through which the FBI would approve specific tools prior to notification.
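For readers unfamiliar with the mechanism, here is a minimal, benign sketch of what beaconing looks like in code; the collector URL and the check-in function are hypothetical illustrations for explanatory purposes, not anything drawn from the bill:

```python
import time
import urllib.request

def beacon(url: str, interval: float, max_checkins: int) -> int:
    """Send a small check-in request to a collection server at regular
    intervals; return the number of check-ins that got through."""
    successes = 0
    for _ in range(max_checkins):
        try:
            urllib.request.urlopen(url, timeout=5)
            successes += 1
        except OSError:
            pass  # beacons typically fail silently and retry next interval
        time.sleep(interval)
    return successes

# Example (not run here): check in every five minutes, ten times.
# beacon("https://collector.example.com/checkin?host=acme-ws-042", 300.0, 10)
```

A defensive beacon under H.R. 4036 would work the same way, except planted in the defender’s own files so that the check-in fires from the attacker’s network and carries identifying information back.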

There is some good reason for relaxing the CFAA in defensive contexts, but the requirement of prior FBI authorization seems highly problematic to me.  Essentially, this amounts to the cyber-deputization of private entities, which raises privacy and oversight concerns.


Microsoft and the Law of the Cloud: to the Supreme Court

Last year I wrote about Microsoft’s Stored Communications Act litigation.  The dispute has now worked its way up to the Supreme Court.  Andrew Keane Woods offers a good primer on the case on the Lawfare Blog.  I generally agree with Andrew’s take:  (1) the extraterritoriality issues do not seem to raise major sovereignty concerns; and (2) it is not really a “privacy” case.  It’s also interesting, as Andrew notes, that Silicon Valley seems uncertain about how to approach this dispute.  But here’s where I might go a bit further than Andrew:  the extraterritoriality issues do not raise major sovereignty concerns unless you think the cloud is really something different.  The Supreme Court continues to make Internet-exceptionalist noises, such as Justice Kennedy’s ode to the Net in the Packingham case last year:

While we now may be coming to the realization that the Cyber Age is a revolution of historic proportions, we cannot appreciate yet its full dimensions and vast potential to alter how we think, express ourselves, and define who we want to be. The forces and directions of the Internet are so new, so protean, and so far reaching that courts must be conscious that what they say today might be obsolete tomorrow.

Packingham v. North Carolina, 137 S. Ct. 1730, 1736 (2017).  The cloud, of course, is just a marketing term for storing stuff and running apps on the Internet.  In my view, the Court should avoid rhapsodizing about the cloud or the Internet in the Stored Communications Act context, apply ordinary principles of extraterritoriality to find that Microsoft was required to produce the records in this case, and leave further tinkering with the statutory framework to Congress.

Bot Code, Norms, and Law

There’s a good post on Dark Reading by Ido Safruti about norms and etiquette for bot code.  According to Imperva’s most recent bot traffic report, bots comprise the majority of Internet traffic.  Many bots are intentionally disruptive or misleading — for example, bots that create comment link spam on blogs.  Others are useful — for example, they allow a search engine to index web pages.  Even useful bots can be disruptive, such as by using up site capacity, and the robots.txt standard was developed so that site owners can limit or exclude bot traffic.

Safruti provides the following guidelines for ethical bot code:

1. Declare who you are;
2. Provide a method to accurately identify your bot;
3. Follow robots.txt;
4. Don’t be too aggressive.
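The third guideline is the most mechanical of the four, and Python’s standard library already implements the robots.txt convention.  A short sketch, with a made-up user agent and made-up rules:

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt and ask whether our bot may fetch a path.
rp = RobotFileParser()
rp.parse([
    "User-agent: examplebot",
    "Crawl-delay: 10",
    "Disallow: /private/",
])

print(rp.can_fetch("examplebot", "https://example.com/index.html"))  # allowed
print(rp.can_fetch("examplebot", "https://example.com/private/x"))   # disallowed
print(rp.crawl_delay("examplebot"))  # requested delay between fetches
```

A well-behaved bot checks `can_fetch` before every request and honors the crawl delay, which also addresses the fourth guideline.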

These are sound guidelines, but my lawyer Spidey sense wonders how they might translate into legal norms, or whether they should become legal norms.  The most immediate way in which guidelines like this can become part of legal norms is through contractual terms of use.  I’m not sure terms of use would be enforceable either as a legal or practical matter against unwanted bots, not least because the measure of contractual damages would be unclear.  There’s an interesting 2001 case in the First Circuit finding a Computer Fraud and Abuse Act violation for bot use, but the facts are quirky and it seems to me perhaps wrongly decided.  Perhaps guidelines like Safruti’s provide a standard of care for a tort claim if an unwanted bot causes a business interruption, though in states where the economic loss doctrine applies this would produce a difficult question about whether slowing a website is a kind of compensable property damage.  Guidelines like this could also be incorporated into a regulatory regime, which the Internet community as a whole might not find palatable.


Cybersecurity and Social Media Use by Sex Offenders: Packingham v. North Carolina

This week the U.S. Supreme Court decided Packingham v. North Carolina, a first amendment challenge to a state statute that prohibited convicted sex offenders from accessing certain “commercial social networking” sites.  I include cases like this that involve the protection of minors, harassment, stalking, and the like under the rubric of “cybersecurity” because these issues of personal online safety relate to the stability and security of the “place” we call “cyberspace.”  In fact, it’s in these kinds of cases that the courts often grapple with the “place-ness” of cyberspace.  That grappling is central to the majority and concurring opinions in Packingham.

Many states have statutes that prohibit or limit registered sex offenders from accessing Internet content, including social media.  North Carolina’s statute defined “social networking Website” broadly.  It arguably would have covered not only sites such as Facebook and Twitter, but also shopping, news, health, career, or other sites with comment boards.

Packingham was convicted in 2002 of sexual contact with a minor.   As a result, he was required to register as a sex offender, and was barred from accessing “social networking Websites” under the North Carolina statute.  In 2010, Packingham received a traffic ticket, which subsequently was dismissed.  He posted a religiously-themed message on his Facebook page celebrating the dismissal.  As a result of this posting, he was convicted of violating the social media statute.

A unanimous 8-Justice Supreme Court (Justice Gorsuch did not participate in the case) struck down North Carolina’s statute as unconstitutional under the First Amendment.  Justice Kennedy wrote the Court’s opinion, but Justice Alito wrote a concurrence, joined by Chief Justice Roberts and Justice Thomas, disagreeing with some of Justice Kennedy’s reasoning.  Both opinions are notable for their Internet exceptionalism, but Justice Kennedy’s opinion seems like an exceptional kind of exceptionalism.

Both the majority and concurring opinions applied the same legal doctrine based on the assumption that the North Carolina statute is “content neutral.”  Government regulation that restricts the time, place or manner of speech, but not the content of speech, generally is subject to “intermediate scrutiny” by the courts.  Under intermediate scrutiny, the regulation must “be narrowly tailored to serve a significant governmental interest,” which means the regulation “must not burden substantially more speech than is necessary to further the government’s legitimate interests.”  Opinion at 6 (internal quotations omitted).  All of the Justices agreed that the protection of children online is a significant government interest but that the North Carolina statute burdened substantially more speech than necessary to further online child protection.  Justices Kennedy and Alito disagreed somewhat, however, on how to frame the question of how much speech was burdened.

Justice Kennedy suggested that “[a] fundamental principle of the First Amendment is that all persons have access to places where they can speak and listen, and then, after reflection, speak and listen once more.” Id. at 4.  Justice Kennedy makes clear that he views cyberspace as such a “place”:

While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear.  It is cyberspace — the “vast democratic forums of the Internet” in general . . . and social media in particular.

Id. at 5.  Not only is cyberspace one of the most important “places” of civil discourse today, according to Justice Kennedy:

While we now may be coming to the realization that the Cyber Age is a revolution of historic proportions, we cannot appreciate yet its full dimensions and vast potential to alter how we think, express ourselves, and define who we want to be.  The forces and directions of the Internet are so new, so protean, and so far reaching that courts must be conscious that what they say today might be obsolete tomorrow.

Id. at 6.

In his concurrence, Justice Alito complained that “[t]he Court is unable to resist musings that seem to equate the entirety of the internet with public streets and parks.”  Alito Concurring Opinion, at 1.  For Justice Alito, it was clear that the North Carolina statute would prohibit access to many websites that posed little or no risk of child exploitation.  However, he argued that “if the entirety of the internet or even just ‘social media’ sites are the 21st century equivalent of public streets and parks, then States may have little ability to restrict the sites that may be visited by even the most dangerous sex offenders.”  Id. at 10.

It seems, then, that Justice Alito took a more cautious, less exceptionalist line than Justice Kennedy.  At the conclusion of his concurrence, however, Justice Alito agreed that “[c]yberspace is different from the physical world. . . .”  Id. at 11.  For Justice Alito, this difference warrants careful evaluation of individual cases, “one step at a time.”  Id. at 11.

My own sense of the judicial role, combined with the dynamic nature of the Internet, leads me to agree more with Justice Alito than Justice Kennedy.  There is something different about cyberspace, and this difference does make the nexus between liberty and security — not least as that nexus involves the freedoms of speech and association — exceedingly difficult.  But these difficulties are not anything new for courts.  In cases that implicate technological change, a court’s job is to understand how the particular technology at issue in a particular case or controversy relates to the legal doctrine applicable to that particular case or controversy.  Hand-waving over the word “cyberspace” is no excuse for sloppy judging.

Perhaps equally importantly, because the balance between liberty and security in cyberspace is difficult, courts should be careful about usurping legislative judgment.  Critics of sex offender social media bans often point to social science research suggesting that restricting social media use has no effect on recidivism or child safety and that “sex offenders” cannot be treated as a homogeneous group.  The primary risk to children, according to some of this literature, is from adult men who are not pathological pedophiles but who groom adolescent girls out of a sense of power or danger.  I wonder if such studies are too focused on recidivism rather than on the possible deterrent effect for potential first-time offenders.  Even more significantly, I also think such studies can overlook the harm caused to children in the production of child pornography and the role of social networking sites and technologies in facilitating child pornography collection exchanges.  Indeed, even the authors of such research have noted that “[t]he development of new technologies and social media often outpaces the study of its use in the commission of crimes, which poses a unique challenge for further study.”  Chan, McNeil and Binder, Sex Offenders in the Digital Age, The Journal of the American Academy of Psychiatry and the Law 44:3 (2016).  Legislatures might be even better positioned than courts to evaluate and adjust to social science and other research that can inform policy.

Tabletop for NJSBA Second Annual Cybersecurity Conference

Here is a tabletop exercise I drafted that we’ll be running at the Second Annual NJSBA Cybersecurity Conference.

Acme Corp. manufactures and sells industrial control systems (ICS).  ICS devices integrate computer chips, hardware and software and can be programmed to monitor, regulate and control various components of commercial manufacturing, assembly and packaging plants.  For example, the following video shows an Acme ICS serving as the controller for a water bottling plant:

ACME’s ICS devices are network enabled and come bundled with a software suite that allows users to monitor and control the devices through a web interface.

Acme also provides installation and maintenance services for its ICS equipment.  Each ICS device must be configured for the systems it will control, which involves the creation of custom computer code.  The computer code, and sometimes the hardware, must periodically be updated if the underlying system configuration changes or if Acme develops performance enhancements, bug fixes, or security patches.  In a larger installation, Acme’s fees for installation and maintenance can exceed the costs of the initial hardware purchase, and the total contract price can exceed ten million dollars.

Acme maintains detailed information about each of its installations, including specific configuration information, networking details, and backup copies of computer code.  This information is stored in numerous documents in a variety of formats, including, for example, Word documents, Excel spreadsheets, Powerpoints, e-mails, and plain text files, on systems used by various Acme business units.  Files may reside on individual computer hard drives, internal company file servers, portable media (such as thumb drives), company-owned and personal laptops, smartphones and tablets, and commercial cloud-based storage such as Google Drive and Dropbox.

ISSUE 1:  A number of management-level Acme employees recently received emails purporting to have been sent by Sol Fish, Vice President for Client Relations at Acme.  The emails instruct the recipients to log into a newly-established sales database through a hyperlink in the email using their existing Acme network log-in credentials.  Fish did not send these emails, however, nor has Acme created any new sales database.  Meanwhile, Fish has received an email from Carl Kent, a business reporter for the Broad Street Journal, inquiring about the fact that the full technical specifications for an ICS installation at Port Newark were posted this morning on a number of business and government blogs.  In fact, Acme won a contract to improve the automation of shipping cranes and other devices at the Port.  The contract was controversial because of unsubstantiated allegations of bid rigging, cost overruns, and other political complaints.  The full technical specifications are confidential for security reasons, among others.  An obvious inference is that the spearphishing attack may have allowed someone to obtain and post the confidential specifications.

ISSUE 2:  In addition, Fish has received an angry call from Bill Brazos, the CEO of Consolidated Fulfillment Centers, Inc.  Consolidated owns and operates large warehouse and fulfillment centers for major online retail companies.  Brazos claims that an Acme ICS system installed at a Consolidated facility in Edison, NJ contained a vulnerability that allowed hackers to obtain information concerning consumers to whom products were being distributed through the Consolidated facility.   Brazos says “millions” of customer accounts may have been compromised.

Implementing ABA Formal Opinion 477


On May 4, 2017, the ABA released Formal Ethics Opinion 477, “Securing Communication of Protected Client Information” (attached at the end of this post).  This Opinion updates Formal Ethics Opinion 99-413, issued in 1999, which concluded that lawyers could use unencrypted email to communicate with clients.  Those of us who were practicing in 1999 will remember the difficulty the then-still-new phenomenon of ubiquitous email communication created for lawyers’ obligations of confidentiality.  The ABA has revisited the question because of new concerns about cybersecurity and client confidentiality.

Opinion 477 does not mandate any specific cybersecurity measures, but instead requires “reasonable efforts” to ensure client confidentiality when using any form of electronic communication, including text messaging, cloud-based document sharing, or other services, in addition to email.  The “reasonable efforts” requirement is consistent with Model Rule 1.6(c) concerning inadvertent disclosure of client information.  The Opinion adopts the factors set forth in Comment 18 to Model Rule 1.6(c) as guidelines for “reasonable efforts”:

(1) The sensitivity of the information;
(2) The likelihood of disclosure if additional safeguards are not employed;
(3) The cost of employing additional safeguards;
(4) The difficulty of implementing the safeguards; and
(5) The extent to which the safeguards adversely affect the lawyer’s ability to represent clients (e.g., by making a device or important piece of software excessively difficult to use).

Opinion 477, lines 108-114.

For “routine communication with clients,” Opinion 477 reaffirms the conclusion of Opinion 99-413 that unencrypted email is generally acceptable, “presuming the lawyer has implemented basic and reasonably available methods of common security measures.”  Id., lines 130-136.  However, Opinion 477 requires a more extensive, fact-based risk assessment for other kinds of communications.

Steps for Compliance

Basic Technical Measures

There are some technical measures every lawyer should take to secure electronic communications with clients and that Opinion 477 seems to assume are normally reasonable.  These generally are technologies and policies that every law firm already should be using:

      • Sound password and access policies;
      • Appropriately configured firewalls;
      • Use of a VPN for communications outside a secure office network;
      • Encryption of data at rest, at least for sensitive client information;
      • Secure file sharing portals, at least for sensitive documents;
      • Appropriate BYOD policies.

Education and Training

Like any good cybersecurity compliance program, Opinion 477 suggests that lawyers and their support staff must obtain some training about cyber hygiene.  Id., lines 149-200.  This does not mean lawyers need to obtain expert cybersecurity certification credentials, but it does mean every lawyer must obtain at least a general understanding of how computers and computer networks function, of common types of cybersecurity threats and how to mitigate them, and of the proper use and implementation of the kinds of technologies and policies mentioned above.  A firm should be able to document the content and frequency of such training for its personnel.

Inventories and Audits

A key part of a strong cybersecurity program that is often overlooked is to inventory computer networks and systems and to audit compliance policies.  A firm should know:

      • Its network configuration;
      • Exactly which devices are connecting to the network;
      • Open ports on the network;
      • The volume of traffic flowing over the network.

A number of software tools are available to help automate this inventory and monitoring process and to raise red flags if unusual patterns occur.  If the firm is relying on an outside vendor for network support, the vendor should be able to provide this information.

In addition, a firm should maintain centralized cybersecurity compliance and breach response policies, which should regularly be reviewed by attorneys and staff.  A law firm’s cybersecurity compliance should include tiered security measures based on specific types of client information regularly handled or with the potential to be handled in the course of the firm’s practice.

Due Diligence on Vendors

The Opinion also requires attorneys to conduct due diligence on vendors that provide communications technology.  The auditing checklists here likely are more extensive than the current practices of many law firms.  See Opinion 477, lines 267-312.  Attorneys should remember that these requirements relate to their ISPs, web hosting companies, cloud storage providers, email providers, outside experts who handle electronic client information, e-discovery providers, and other vendors.  It can be helpful to develop standardized checklists and questionnaires for gathering this information.


ABA Opinion 477 makes clear that law firms must follow up-to-date, comprehensive cybersecurity compliance practices.  While many firms likely already use some basic security technologies, Opinion 477 makes cybersecurity a high priority for competency in the practice of law.

Fourth Circuit Revives Wikimedia NSA Case

Yesterday the Fourth Circuit reinstated a case brought by the Wikimedia Foundation concerning the National Security Agency’s bulk “Upstream” surveillance program.  Under the Upstream program, the NSA collects traffic on the U.S. Internet backbone.  The Government claims that this collection is targeted to specific queries relating to terror investigations and other intelligence matters.  As a result, the government claimed, it is unlikely that any communications involving Wikimedia were reviewed by the NSA as part of the Upstream program, and therefore Wikimedia lacks standing to assert its claims.  The district court, relying on Clapper v. Amnesty International USA, 133 S. Ct. 1138 (2013), agreed and granted the government’s motion to dismiss on the pleadings.  The Fourth Circuit reversed.

Wikimedia alleged that, because of the way packets travel over the network, the NSA necessarily must collect substantially all of the international text-based communications traveling over high-capacity cables, switches and routers in the U.S.  The Government argued that this was a speculative assertion that should not be taken at face value even at the pleading stage.  However, Wikimedia also alleged that, given the enormous number of Internet communications involving Wikimedia each year — a number Wikimedia put at over one trillion — it is nearly certain that the NSA has collected and reviewed communications involving Wikimedia even if the NSA’s data collection were limited to one trunk line.  As the Complaint put it, “even if one assumes a 0.00000001% chance  . . . of the NSA copying and reviewing any particular communication, the odds of the government copying and reviewing at least one of the Plaintiffs’ communications in a given one-year period would be greater than 99.9999999999%.”  Complaint, 46-47.
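Whatever one thinks of the independence assumption buried in it, the complaint’s arithmetic holds up.  A quick sketch in Python, using the complaint’s own figures:

```python
import math

# Complaint's figures: per-communication review probability p, and n
# communications per year, assumed independent of one another.
p = 1e-10   # 0.00000001% expressed as a fraction
n = 1e12    # "over one trillion" communications

# P(at least one reviewed) = 1 - (1 - p)**n.
# math.log1p avoids rounding error when p is tiny.
prob = 1 - math.exp(n * math.log1p(-p))
print(prob > 0.999999999999)  # the complaint's claimed threshold
```

With these numbers the probability is 1 − e^(−100), which is indistinguishable from 1 in double-precision arithmetic, so the 99.9999999999% figure is, if anything, an understatement of what the model implies.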

The Government disputed these factual statistical assertions as well, but the Fourth Circuit found them plausible enough that the case should proceed.  The Fourth Circuit noted that “[w]e would never confuse the plausibility of this conclusion with that accorded to Newton’s laws of motion,” but observed that the standard is merely reasonable plausibility.  Opinion, at 26.  The Fourth Circuit did, however, uphold the dismissal of what it termed the “Dragnet” allegations because the Complaint did not contain specific enough factual assertions about the actual scope of the NSA’s surveillance activity.

The Fourth Circuit makes some interesting interpretive moves in this Opinion relating to how Clapper should apply in cases involving bulk surveillance claims and large Internet entities.  Wikimedia’s “statistical” argument seems dubious, and it seems that under the Fourth Circuit’s analysis any entity with a large Internet presence would have standing to challenge a surveillance program.  Perhaps that is a good policy result, but it does not seem consistent with Clapper.

The Fourth Circuit’s Opinion is below:

Facebook and Terrorism: Cohen v. Facebook and Force v. Facebook

It’s well-known that Facebook, Twitter, YouTube, and other social media platforms are used for propaganda and recruiting purposes by terrorist groups such as ISIL.  A number of Jewish groups filed lawsuits alleging that Facebook should be held civilly liable for facilitating terrorist attacks against Jews.  Two of these cases recently were dismissed by Judge Nicholas Garaufis in the U.S. District Court for the Eastern District of New York.  A copy of Judge Garaufis’ Memorandum and Order is available below.

In Cohen v. Facebook, the plaintiffs asserted negligence and civil conspiracy theories under Israeli and U.S. law.  That case was removed to federal court by Facebook.  In Force v. Facebook, the plaintiffs asserted claims under the federal “Providing Material Support to Terrorists” statute, 18 U.S.C. § 2339A, and the civil remedies provision for terrorist acts, 18 U.S.C. § 2333, as well as for negligence and other breaches of duty under Israeli law.  Copies of the Cohen and Force Complaints are available below.

Judge Garaufis dismissed the Cohen case for lack of standing because the individual plaintiffs asserted only a threat or fear of possible future harm.  He also dismissed the Force case under the immunity provision of section 230 of the Communications Decency Act, 47 U.S.C. § 230(c)(1).  This provision states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”  Id.  

The Second Circuit has established a three-part test for determining whether section 230 immunity applies:  the law “shields conduct if the defendant (1) is a provider or user of an interactive computer service, (2) the claim is based on information provided by another information content provider and (3) the claim would treat [the defendant] as the publisher or speaker of that information.”  FTC v. LeadClick Media, LLC, 838 F.3d 158, 173 (2d Cir. 2016).

The primary issue in these cases was whether the third element would be satisfied.  Here, the focus is on whether the provider exercises “a publisher’s traditional editorial functions — such as deciding whether to publish, withdraw, postpone, or alter content.”  Id. at 174.  The plaintiffs in the Force case argued that Facebook was not acting as a publisher but rather was providing content-neutral services in support of terrorist activities by Hamas.  The court rejected this argument and found the section 230 immunity applies to Facebook. Memorandum and Order, at 17-23.

The plaintiffs in the Force case also raised a creative argument:   section 230 should not apply because the terrorist acts occurred in Israel and there is a presumption against extraterritoriality.  Judge Garaufis also rejected this argument and held that the focus of section 230 is to limit civil liability of internet service providers and that the relevant events relating to such liability involve the location of the speaker.  Since Facebook is a U.S. corporation, Judge Garaufis held that section 230 did not require extraterritorial application in this case even though the terrorist acts happened in Israel. Memorandum and Order, at 23-27.

Judge Garaufis’ interpretation of section 230, including the question of extraterritoriality raised by this case, seems correct.  Section 230, however, was a legislative solution to Internet publisher liability in a simpler age, before the explosion of social media platforms and their cooptation by terrorists.  There may be good policy arguments today for imposing some legal duties on social media sites to screen for materials that incite violence and terrorism.


Cohen and Force Opinion


Cohen Complaint


Force Complaint

WannaCry Ransomware and Legal Fault

The WannaCry Ransomware attack has spread throughout the world over the past week.  Fingers are pointing at Microsoft for the vulnerability in earlier versions of Windows, at the NSA for creating the leaked exploit, and at North Korea for allegedly perpetrating the attack.  There is blame to go around, but if we were to assess comparative fault the victim is also substantially to blame, for at least two reasons, one obvious and one less obvious:

First, the obvious reason:  the attack affected older versions of Windows, including Windows XP, which has not been supported by Microsoft since 2014.  However frustrating Microsoft’s update and support cycle might seem, and whatever transaction and opportunity costs are involved in switching an organization to a newer OS, it is negligent to continue using an outdated, unpatchable OS.

Second, the less obvious reason:  the attack exploited Port 445, a networking port used by those older versions of Windows for peer-to-peer connections with printers and the like.  A basic component of any cybersecurity compliance program — in addition to using updated, patched software — is to conduct regular port audit scans and to configure firewalls to block unnecessary ports.  Given the low cost of this kind of precaution, failure to conduct port audits is almost certainly negligence.
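The precaution really is cheap.  As a rough illustration of what a port audit involves, a few lines of Python can check whether a host exposes port 445 and other commonly abused ports; the host address below is a placeholder from the TEST-NET range, not a real target:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Audit a short list of ports that normally should be blocked at the
# perimeter firewall; 445 (SMB) was the WannaCry vector.
for port in (135, 139, 445, 3389):
    state = "OPEN" if port_open("192.0.2.10", port) else "closed/filtered"
    print(f"port {port}: {state}")
```

A production audit would use a dedicated scanner run on a regular schedule rather than an ad hoc script, but the underlying check is no more complicated than this, which is why skipping it is hard to defend.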