Monthly Archives: March 2019


Is My Toaster a Computer? The Computer Fraud and Abuse Act’s Definition of “Protected Computer” in the Age of the Internet of Things

By TJ Wong, CLS ’20

The Internet of Things (IoT) has made a lot of things in life much easier, including making perfectly cooked toast.[1] The concept refers to the development of internet-connected versions of ordinary objects, spanning everything from coffeemakers to smart cars.[2] While offering endless benefits to daily life, the rapid rise of IoT has generated serious implications for existing computer crime laws, namely the Computer Fraud and Abuse Act (CFAA). Enacted as an amendment to existing laws addressing computer-related criminal activity,[3] the CFAA, under § 1030(a)(2), prohibits accessing a computer without authorization or exceeding authorized access and thereby obtaining information from any “protected computer.”[4] Courts have often easily accepted the broad definition of “protected computer” in light of other, more prominent limitations of the CFAA;[5] however, the seemingly endless proliferation of IoT devices justifies reconsidering “protected computer” as a worthy limitation on the breadth of conduct criminalized by the statute.

The CFAA currently defines “protected computer” as a computer that is “exclusively for the use of a financial institution or the United States Government,” or one that is “used in or affecting interstate or foreign commerce or communication.”[6] While the CFAA originally covered only important “federal interest” computers,[7] courts across the country have since interpreted “protected computer” to encompass any computer with an internet connection.[8] Furthermore, a “computer” is defined to cover essentially any device that processes or stores data,[9] including computer networks, databases, cell phones, MP3 players, refrigerators, and temperature control units.[10] As the definition covers anything with a microchip,[11] it includes all IoT devices feeding us data online, such as fitness watches and voice assistants. In the age of IoT, the CFAA’s definition of “protected computer” expands to cover items beyond the plain meaning of the term, since toasters and refrigerators are not typically viewed in society as “computers.” As “Congress enacted the CFAA in 1984 primarily to address the growing problem of computer hacking,”[12] it seems unlikely that this dramatic expansion was contemplated.

This ever-expanding coverage of the term “protected computer” raises vagueness and overbreadth concerns about the scope of the CFAA. For example, in conjunction with the Ninth Circuit’s interpretation of “without authorization” to cover common practices like password sharing, the extremely broad definition of “protected computer” could contribute to criminalizing individuals who share accounts on IoT devices.[13] Instead of using “protected computer” as a significant limitation on the CFAA, courts have dedicated more attention to determining the scope of “without authorization or exceeds authorized access,” as well as relying on prosecutorial discretion to check arbitrary enforcement.[14] However, this approach may prove untenable in light of the uncertainty surrounding other CFAA terms, as the circuits have split over the proper interpretations of “without authorization” and “exceeds authorized access.”[15]

If it is not given enough attention, the expansive definition of “protected computer” could lead to unintended consequences after a circuit declares its stance on “access” or “authorization.” For example, the Second Circuit in United States v. Valle held that “one ‘accesses a computer without authorization’ if he accesses a computer without permission to do so at all,”[16] and interpreted “exceeds authorized access” as a limitation on access, not on use.[17] Consider a company that gives employees IoT devices such as voice assistants, indoor security cameras, smart keychains, or fitness watches. Under the Valle court’s interpretations of access and authorization, an IT employee of this company, who has a single proper purpose to access the devices for maintenance or troubleshooting, could potentially be free from CFAA liability after observing and obtaining deeply personal information about other employees – e.g., biometric data, location tracking, online retail or medicine orders, or even video feeds.[18] Such cases also involve radically distinct types of information, differences that the statute, as currently constructed, cannot adequately account for.

The broadening scope of “protected computer” – covering everything from the computers of financial institutions and the U.S. government to fitness watches, baby monitors, and home thermostats – also creates problems for the penalty structure of the CFAA. The CFAA criminalizes intentionally accessing a computer without authorization or exceeding authorized access and obtaining information from any protected computer;[19] however, as mentioned above, the types of information that can be obtained from “protected computers” are drastically and increasingly different. Nonetheless, a statutory maximum of one year of imprisonment and fines applies, with heightened penalties reserved for offenses with commercial purposes, offenses in furtherance of other unlawful conduct, information valued above $5,000, and individuals with prior CFAA convictions.[20] As such, the statutory penalties can fail to reflect whether the information improperly accessed and obtained consisted of secrets from a government or bank computer, private biometric information, logs of someone’s shopping history, or records of one’s intimate at-home behaviors.[21] By establishing workable and meaningful distinctions between types of internet-connected devices, the CFAA could be more effective in criminalizing and deterring malicious conduct. With the rapid, constant innovation in technology, Congress may never be able to craft definitions that stand the test of time; however, in light of the Internet of Things, it’s time to reevaluate the CFAA to distinguish between traditional “computers” and smart toasters.

 

[1] Roberto Baldwin, The world now has a smart toaster, Engadget (Jan. 4, 2017), https://www.engadget.com/2017/01/04/griffin-connects-your-toast-to-your-phone/.

[2] Jacob Morgan, A Simple Explanation Of ‘The Internet Of Things’, Forbes (May 14, 2014), https://www.forbes.com/sites/jacobmorgan/2014/05/13/simple-explanation-internet-things-that-anyone-can-understand/#5b3598c11d09.

[3] OFFICE OF LEGAL EDUC., EXEC. OFFICE FOR U.S. ATT’YS, PROSECUTING COMPUTER CRIMES 1-2 (2010), http://www.justice.gov/criminal/cybercrime/docs/ccmanual.pdf (last visited Mar. 29, 2019).

[4] 18 U.S.C. § 1030(a)(2)(C).

[5] Namely, interpretations of “without authorization or exceeding authorized access” and prosecutorial discretion. See generally United States v. Yücel, 97 F. Supp. 3d 419 (S.D.N.Y. 2015); LVRC Holdings LLC v. Brekka, 581 F.3d 1127 (9th Cir. 2009); WEC Carolina Energy Solutions LLC v. Miller, 687 F.3d 199 (4th Cir. 2012); United States v. Rodriguez, 628 F.3d 1258 (11th Cir. 2010); United States v. John, 597 F.3d 263 (5th Cir. 2010); Int’l Airport Ctrs. v. Citrin, 440 F.3d 418 (7th Cir. 2006); EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577 (1st Cir. 2001).

[6] 18 U.S.C. § 1030(e)(2).

[7] Orin S. Kerr, Vagueness Challenges to the Computer Fraud and Abuse Act, 94 Minn. L. Rev. 1561, 1563 (2010).

[8] See Yücel, 97 F. Supp. 3d 418-19 (collecting cases and noting “widespread agreement in the case law”).

[9] 18 U.S.C. § 1030(e)(1) (noting stated exceptions for automatic typewriters, hand held calculators, and “other similar device[s]”).

[10] See United States v. Kramer, 631 F.3d 900, 902 (8th Cir. 2011); see also United States v. Nosal, 844 F.3d 1024, 1032 (9th Cir. 2016) (Nosal II); United States v. Mitra, 405 F.3d 492, 495 (7th Cir. 2005).

[11] Kerr, supra note 7, at 1571-72.

[12] United States v. Nosal, 676 F.3d 854, 858 (9th Cir. 2012).

[13] Nosal II, 844 F.3d at 1050-51 (Reinhardt, J., dissenting).

[14] Yücel, 97 F. Supp. 3d 419; see supra note 5.

[15] Tiffany Curtiss, Computer Fraud and Abuse Act Enforcement: Cruel, Unusual, and Due for Reform, 91 Wash. L. Rev. 1813, 1823 (2016).

[16] United States v. Valle, 807 F.3d 508, 524 (2d Cir. 2015).

[17] Id. at 527-28.

[18] These are types of data possibly acquired from popular IoT devices, such as voice assistants, indoor security cameras, smart keychains, and fitness watches.

[19] 18 U.S.C. § 1030(a)(2)(A)-(C).

[20] 18 U.S.C. § 1030(c)(2)(A)-(C).

[21] Id.; see supra note 18.

Stopping the Clock of Supervised Release Terms

By Brannock Furey, CLS ’20

On Tuesday, the Supreme Court will hear oral arguments in Mont v. United States, a supervised release case that turns on two seemingly minor details: the use of the present tense in a federal statute, and a period of 24 days.[1]

Supervised release is designed to help monitor federal offenders after prison, and can result in reincarceration if any conditions set by the court are violated.[2] At issue in Mont is whether a federal district court had jurisdiction to revoke an individual’s supervised release after the initial date on which his supervision was scheduled to end.[3] Citing its decision in United States v. Goins as precedent, the 6th Circuit affirmed the district court’s determination that it could impose the revocation on March 30, even though the date on which Mont’s supervised release was set to expire – March 6 – had passed more than three weeks earlier.[4]

As the 6th Circuit noted in Mont, however, a clear circuit split exists between the 6th Circuit’s decision in Goins and the D.C. Circuit’s decision in United States v. Marsh regarding the interpretation of 18 U.S.C. § 3624(e), the provision that controls whether a term of supervised release may be suspended.[5] This split led the Supreme Court to grant certiorari.

In their briefs submitted to the Court, Mont’s attorneys and the government disagree as to how § 3624(e) should be construed – and, consequently, over when the clock for a supervised release term should stop and start.[6] Section 3624(e) states that “supervised release does not run during any period in which the person is imprisoned in connection with a conviction for a Federal, State, or local crime unless the imprisonment is for a period of less than 30 consecutive days.”[7]

Mont’s lawyers, citing Marsh, argue that since the statute uses the present tense, the imprisonment must take place after a conviction in order for the tolling provision to apply.[8] The government sharply contests this emphasis on the present tense, and points to the phrase “in connection with” to support its argument that the provision can apply to imprisonment both before and after a conviction.[9]

In contrast to Marsh, the government takes the 6th Circuit’s position in Goins.[10] There, the appellate court held that a defendant’s supervised release period should pause during any pretrial detention for an indictment that later results in a conviction, where that incarceration is credited as time served.[11] Since the state court credited Mont’s pretrial detention to his 6-year state prison term, the government argues that his incarceration falls within § 3624(e), and that the federal court therefore had jurisdiction on March 30 because Mont’s supervised release had been tolled since October 2016.[12]
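To make the arithmetic of tolling concrete, the sketch below computes an adjusted expiration date under the government’s reading of § 3624(e). It is a minimal illustration only: the dates are invented, and the actual computation in Mont depends on the record and on how the Court resolves the statutory question.

```python
# Hypothetical sketch of tolling under 18 U.S.C. § 3624(e) on the
# government's reading. All dates are invented for illustration and
# do not reflect the actual record in Mont.
from datetime import date, timedelta

def adjusted_expiration(original_expiration: date,
                        detention_start: date,
                        detention_end: date) -> date:
    """Stop the supervised-release clock during qualifying imprisonment.

    Assumes detention begins before the original expiration date; the
    statute excludes periods of imprisonment under 30 consecutive days.
    """
    days_detained = (detention_end - detention_start).days
    if days_detained < 30:
        return original_expiration
    # The remaining supervision time resumes on release, which is
    # equivalent to pushing the expiration back by the detained days.
    return original_expiration + timedelta(days=days_detained)

# With invented dates, detention starting before a March 6 expiration
# can carry the term well past March 30:
print(adjusted_expiration(date(2017, 3, 6),
                          date(2016, 10, 1),
                          date(2017, 6, 1)))   # -> 2017-11-04
```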

With more than 8 in 10 federal offenders undergoing supervised release after serving prison sentences, the Court’s decision on the small distinctions at issue in Mont could have large implications for the great number of incarcerated individuals returning to society.[13]

 

[1] Mont v. United States, 139 S. Ct. 451 (mem.), 202 L. Ed. 2d 346 (2018).

[2] Congressional Research Service, Supervised Release (Parole): An Overview of Federal Law, at 1.

[3] United States v. Mont, 723 F. App’x 325, 325 (6th Cir. 2018).

[4] Id. at 326, 328-29 (citing United States v. Goins, 516 F.3d 416 (6th Cir. 2008)).

[5] Id. at 330 (referring to United States v. Marsh, 829 F.3d 705 (D.C. Cir. 2016)).

[6] Fiona Doherty, Argument preview: Justices address circuit split on whether a period of pretrial imprisonment can toll a term of federal supervised release, SCOTUSblog (Feb. 21, 2019, 12:19 PM), https://www.scotusblog.com/2019/02/argument-preview-justices-address-circuit-split-on-whether-a-period-of-pretrial-imprisonment-can-toll-a-term-of-federal-supervised-release/.

[7] 18 U.S.C. § 3624(e).

[8] Doherty, supra note 6.

[9] Id.

[10] Id.

[11] Mont, 723 F. App’x at 328 (citing Goins, 516 F.3d at 417).

[12] Doherty, supra note 6.

[13] Id.

Putting Words into Action: California Governor’s Pursuit to Make Mandatory Affordable Housing Actually Mean Mandatory

By Malina Welman, CLS ’20

The U.S. Supreme Court’s landmark 1926 decision in Euclid v. Ambler Realty Co. – the one where Justice Sutherland infamously called apartment houses a “parasite” – is charged with tacitly giving municipalities across the country the green light to impose exclusionary restrictions on land.[1] The Court’s decision empowered local governments for decades thereafter to exercise their police powers through “snob zoning” in order to keep poor, more diverse city dwellers out of white, wealthy suburbs.[2]

Since then, however, many of America’s municipalities have turned over a new leaf, so to speak, by enacting inclusionary policies that instead foster the creation of affordable homes for low- and moderate-income households. In fact, the Lincoln Institute of Land Policy has identified over five hundred such programs across twenty-seven states, with New Jersey and California accounting for 65% of all inclusionary programs.[3] Yet some statewide mandates, such as California’s, have often proved more symbolic than effective in making good on their promise to create needed affordable housing opportunities.

In an article reviewing the impact of California’s inclusionary housing law (now over fifty years old), Liam Dillon pointed out that the fundamental problem with the law is that it merely required local governments to produce “prodigious reports to plan for housing – but [did not actually] hold them accountable for any resulting home building.”[4] The result: municipalities developed and adopted plans to build new affordable units in their communities with “‘no intention,’” as Councilman Herb Perez of the Bay Area suburb Foster City admitted, “‘of actually building [them].’”[5] Indeed, the city recently went more than five years without approving new development projects despite the high demand for housing.[6]

In an effort to get California back on track, the state Legislature passed 15 housing-related bills in 2017 to address housing affordability.[7] Two noteworthy bills were AB 1505, which would require developers to include below-market units in new rental housing projects, and SB 35, which would require cities to approve such projects as long as they complied with local zoning regulations and a number of other requirements.[8] Nevertheless, it appears that communities are still attempting to evade their obligations through delay tactics. Such tactics often take the form of prolonged litigation that becomes costly enough to bankrupt developers, handing neighbors a victory without their ever securing a favorable judgment from the court. Cupertino, an affluent Silicon Valley community, has been doing just that, going head-to-head with developers for over a decade over plans to build up to 2,400 new homes – a large percentage reserved for low-income residents – on the site of a nearly vacant shopping mall.[9] Even in light of California’s new laws, Cupertino residents are resorting to other measures, seeking to defer the city council’s approval of the project through a voter referendum. Their endeavors may ultimately prove futile, however, as developers are planning to go ahead with the project.[10]

Now, California’s new governor, Gavin Newsom, is stepping into the arena looking to end an era of inaction on the part of the state’s cities and counties in providing affordable housing. In his first state budget proposal in January, Governor Newsom came out swinging with what Liam Dillon called “a radical new step: punishing communities that block homebuilding by withholding state tax dollars.”[11] But the governor is not waiting around to see whether municipalities heed his words. Weeks later, he filed a lawsuit against Huntington Beach for rezoning a parcel of land to restrict low-income housing.[12] Although it is still too early to tell what impact Governor Newsom’s aggressive policies will have on the state’s affordable housing crisis, it is an approach worth trying.

 

[1] Euclid v. Ambler Realty Co., 272 U.S. 365, 391, 394 (1926).

[2] See Elizabeth Winkler, “‘Snob Zoning’ is Racial Segregation by Another Name,” Washington Post (Sept. 25, 2017), https://www.washingtonpost.com/news/wonk/wp/2017/09/25/snob-zoning-is-racial-housing-segregation-by-another-name/?utm_term=.86f600b1f0ec.

[3] Rick Jacobus, “Inclusionary Housing: Creating and Maintaining Equitable Communities,” Lincoln Institute of Land Policy (2015), https://www.lincolninst.edu/sites/default/files/pubfiles/inclusionary-housing-full_0.pdf.

[4] Liam Dillon, “California Lawmakers Have Tried for 50 Years to Fix the State’s Housing Crisis. Here’s Why They’ve Failed,” Los Angeles Times (June 29, 2017), https://www.latimes.com/projects/la-pol-ca-housing-supply/.

[5] Id.

[6] Id.

[7] “California Governor Signs Inclusionary Zoning Bill Into Law,” National Apartment Association (Oct. 10, 2017), https://www.naahq.org/news-publications/california-governor-signs-inclusionary-zoning-bill-law.

[8] Id.; Liam Dillon, “New Law Could Break the Stalemate Over Housing On the Site of a Near-Vacant Cupertino Mall,” Los Angeles Times (Dec. 16, 2018), https://www.latimes.com/politics/la-pol-ca-state-law-housing-cupertino-20181216-story.html.

[9] Dillon, “New Law Could Break the Stalemate Over Housing On the Site of a Near-Vacant Cupertino Mall,” Los Angeles Times (Dec. 16, 2018), https://www.latimes.com/politics/la-pol-ca-state-law-housing-cupertino-20181216-story.html.

[10] Id.

[11] Liam Dillon, “Gov. Gavin Newsom Threatens to Cut State Funding From Cities That Don’t Approve Enough Housing,” Los Angeles Times (Jan. 10, 2019), https://www.latimes.com/politics/la-pol-ca-gavin-newsom-housing-money-budget-20190110-story.html.

[12] Allysia Finley, “California’s Liberal Governor Hauls a Conservative City to Court,” Wall Street Journal (Feb. 1, 2019), https://www.wsj.com/articles/californias-liberal-governor-hauls-a-conservative-city-to-court-11549060961.

Fifth Amendment 2.0: The “Testimonial” Nature of Biometric Features

By Christian Martinez, CLS ’20

“I plead the Fifth.” Commonly used in American procedural dramas, this familiar phrase refers to one’s right under the Fifth Amendment not to be compelled in any criminal case to be a witness against oneself.[1] Specifically, the right against self-incrimination prevents the state from “compelling a defendant to make a testimonial communication to the state that is incriminating.”[2] The Supreme Court has long held that testimonial communications are not limited to oral communications[3] but also include physical acts that “relate a factual assertion or disclose information.”[4] For example, the act of producing incriminating documents in response to a subpoena is “testimonial” because the very act of producing those documents communicates that the individual knew such documents existed and either possessed or controlled them.[5] Similarly, the act of providing a password or combination to a digital device is “testimonial” for Fifth Amendment purposes.[6]

Despite this, state and federal courts have held that the compelled use of an individual’s biometric features (e.g., a fingerprint) to unlock a digital device is not testimonial and therefore not protected by the Fifth Amendment.[7] Recently, however, Magistrate Judge Kandis Westmore of the Northern District of California ruled otherwise.[8] The Government submitted to Judge Westmore warrant applications that included, among other things, a request to compel certain individuals to use their biometric features to unlock digital devices.[9] In denying the application, Judge Westmore held that an individual’s use of biometric features to unlock a digital device is a testimonial communication protected by the Fifth Amendment.[10] Specifically, Judge Westmore reasoned that if an individual cannot be compelled to provide a password under the Fifth Amendment, then an individual also cannot be compelled to provide a biometric feature to unlock a device, because the biometric feature serves the same function as a password. Furthermore, a successful finger or thumb scan confirms ownership or control over the device and would “expose to the government far more than the most exhaustive search of a house[,]” including medical and financial records (which many smartphone apps provide and protect with biometric features).[11]

Judge Westmore’s decision is a resounding reminder that the law must adapt more quickly to technological advancements. Under the rule adopted by most courts, there is a meaningful distinction under the Fifth Amendment between using a password to protect your smartphone and using your fingerprint to protect the very same smartphone. That rule has turned what seems like an arbitrary decision for most people – whether to use a passcode or a biometric feature to secure a digital device – into one with significant privacy ramifications. Worse still, if manufacturers of digital devices opt to remove the password option for securing devices, leaving only the biometric options, the rule would severely limit (if not eliminate) any Fifth Amendment protection as to these devices. As biometrics become increasingly integrated with daily life, courts must avoid the mechanical application of pre-digital rules to post-digital problems.[12]

 

[1] U.S. Const. amend. V.

[2] State v. Diamond, 905 N.W.2d 870, 873 (Minn. 2018).

[3] Schmerber v. California, 384 U.S. 757, 763-64 (1966).

[4] Doe v. United States (Doe II), 487 U.S. 201, 209-10 (1988).

[5] United States v. Hubbell, 530 U.S. 27, 36 (2000).

[6] In re Grand Jury Subpoena Duces Tecum Dated Mar. 25, 2011, 670 F.3d 1335, 1346 (11th Cir. 2012).

[7] See, e.g., In re the Search of [Redacted], 317 F. Supp. 3d 523, 535-36 (D.D.C. 2018); In re the Search Warrant Application for [Redacted], 279 F. Supp. 3d 800, 807 (N.D. Ill. 2017); State v. Diamond, 905 N.W.2d 870, 875 (Minn. 2018).

[8] In re Residence in Oakland, California, No. 4-19-70053, 2019 WL 176937, at *3 (N.D. Cal. Jan. 10, 2019).

[9] Id. at *1.

[10] Id. at *3.

[11] Id. at *4.

[12] Riley v. California, 573 U.S. 373, 406-07 (2014).

Liability for a defectively designed algorithm: Wickersham v. Ford

By Zane Muller, CLS ’20

The past few years have witnessed a dramatic increase in the prevalence and sophistication of algorithms. Advances in machine learning (sometimes called artificial intelligence) have delivered new applications in areas as diverse as credit risk evaluation, criminal sentencing, and winning the ancient Chinese strategy game Go. While machine learning algorithms have long been incorporated into software and web interfaces, they are increasingly used to improve consumer products, and consumers increasingly encounter them in the physical world. As algorithms further permeate our everyday lives, the law will increasingly have to decide how to handle losses that arise when algorithms fail.

Design defects are intuitive in the case of, say, a lawnmower; but how is a machine learning algorithm “designed”? In broad terms, machine learning refers to an automated process for identifying relationships between variables in a data set and making predictions based on those relationships.[1] Those relationships accumulate into a “model,” or algorithm, which can then be used to make predictions or decisions based on new data.[2] Design involves two stages: “playing with the data” and “running the model.”[3] In the first stage, designers choose a set of data, determine an outcome goal (e.g., “identify the likelihood that a given borrower will default”), and then train the model through various iterations until it independently delivers predictions in line with empirical results. In the second stage, designers “set it loose” in the world to interpret newly gathered data and deliver predictions or decisions, periodically refining or adjusting the model based on the accuracy of its results.
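To make the two stages concrete, here is a minimal sketch in Python using scikit-learn. It is an illustration only: the loan-default data are randomly generated, and the feature names are invented.

```python
# A minimal sketch of the two design stages, using scikit-learn.
# The loan-default dataset and feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stage 1: "playing with the data" -- choose a data set, pick an
# outcome goal (here: will a borrower default?), and train the model.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))  # e.g., income, debt ratio, credit age
y = (X[:, 1] - X[:, 0] + rng.normal(size=1000) > 0).astype(int)  # 1 = default

X_train, X_new, y_train, _ = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Stage 2: "running the model" -- set it loose on newly gathered data
# to deliver predictions; designers would monitor accuracy and retrain.
predicted_default = model.predict(X_new)
print(predicted_default[:10])
```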

Will the law recognize and remedy injuries caused by a “defective” algorithm? This question arose in Wickersham v. Ford.[4] In that case, the plaintiff’s husband committed suicide in the wake of an automobile accident that left him with continuous, extreme pain and debilitating injuries, including the loss of an eye.[5] One of the plaintiff’s expert witnesses stated that the cause of his injuries was a 146-millisecond delay in the deployment of the seatbelt pre-tensioner and side airbag. A second expert witness testified that the cause of this delay was a defect in the design of the car’s Restraint Control Module (RCM), an electronic component that receives sensor data, processes it with an algorithm, and then determines whether and when to pre-tension seat belts and deploy airbags in anticipation of a collision.
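The opinion does not disclose how Ford’s RCM actually works; its algorithm is proprietary. Purely as a hypothetical illustration of the kind of decision logic at issue, a threshold-based deployment routine might look like the following sketch, in which every sensor name and calibration value is invented.

```python
# Purely hypothetical sketch of an RCM-style deployment decision.
# Sensor names, thresholds, and timing are invented for illustration;
# they do not reflect Ford's actual (proprietary) algorithm.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    deceleration_g: float   # longitudinal deceleration, in g
    lateral_g: float        # lateral acceleration, in g
    timestamp_ms: int

DECEL_THRESHOLD_G = 4.0     # invented calibration value
LATERAL_THRESHOLD_G = 3.0   # invented calibration value

def restraint_decision(frame: SensorFrame) -> dict:
    """Decide whether to pre-tension seat belts and fire the side airbag."""
    severe_frontal = frame.deceleration_g >= DECEL_THRESHOLD_G
    severe_side = frame.lateral_g >= LATERAL_THRESHOLD_G
    return {
        "pretension_belts": severe_frontal or severe_side,
        "deploy_side_airbag": severe_side,
        "decided_at_ms": frame.timestamp_ms,
    }

# A crash pulse just under a threshold illustrates the plaintiff's theory:
# a miscalibrated threshold can delay deployment by many milliseconds.
print(restraint_decision(
    SensorFrame(deceleration_g=3.9, lateral_g=2.9, timestamp_ms=120)))
```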

In the instant case, the plaintiff alleged that Ford was negligent in designing its algorithm. More specifically, her expert claimed that the RCM was not properly calibrated for the type of crash the plaintiff’s husband experienced, and that his injuries could have been avoided if Ford had conducted more thorough testing.[6]

One challenge facing plaintiffs is that algorithms are “black boxes” whose workings are often opaque even to their designers. Here, the plaintiff was able to overcome this hurdle because her expert had experience working with a similar algorithm at General Motors. The court held that the plaintiff alleged sufficiently particular and concrete facts to sustain a claim, and it denied Ford’s motion for summary judgment.[7] Wickersham presents a case where the causation of an injury by an algorithm’s failure is fairly straightforward; other plaintiffs, whose injuries may be less traceable to algorithmic design, may have a harder time surviving summary judgment. Not all algorithmic design flaws will be as clear-cut as the failure to promptly deploy an airbag, but they could nonetheless cause equally or more serious harms. As algorithms further penetrate the physical world, these issues will only become more prominent and challenging for courts and lawmakers to resolve.

[1] Kevin P. Murphy, Machine Learning: A Probabilistic Perspective 1 (2012).

[2] Michael Berry & Gordon Linoff, Data Mining Techniques: For Marketing, Sales, and Customer Relationship Management 8-11 (2004).

[3] David Lehr & Paul Ohm, Playing With the Data: What Legal Scholars Should Learn About Machine Learning, 51 U.C. Davis L. Rev. 653, 670 (2017).

[4] Wickersham v. Ford Motor Co., 194 F. Supp. 3d 434 (D.S.C. 2016).

[5] Id. at 435.

[6] Id. at 438.

[7] Id. at 436.