FCLJ is excited to present the third and final Issue of Volume 74 of the Federal Communications Law Journal. This Issue features a practitioner Article, three student Notes, and the Journal's Annual Review of notable communications law decisions. Issue Three covers topics relating to consumer privacy, technological advancements, social media regulation, and more. Our authors present a myriad of ideas and arguments that demonstrate the need for federal regulators and legislative bodies to take action in the rapidly expanding field of communications and technology law.
This Issue begins with an Article written by Scott Jordan. Jordan, a computer science professor at the University of California, Irvine, discusses consumer privacy, addressing primarily the user notice and choice requirements incorporated in Europe's General Data Protection Regulation and the California Consumer Privacy Act. Jordan critiques aspects of each law, suggesting that neither is sufficient to protect consumers. Instead, Jordan proposes his own statutory text to resolve the current inadequacies in consumer privacy regulation.
This Issue also features three student Notes. The first Note, written by Jaylla Brown, posits that the use of biometric facial recognition—a technology police use to identify potential suspects with varying accuracy—should be disclosed to defendants under the rule in Brady v. Maryland. Brown argues that facial recognition misidentifications disparately impact women and people of color, and that disclosing the technology's use under Brady's due process requirements is imperative to formulating a misidentification defense.
The second Note, written by Veronica Lark, focuses on the Fourth Amendment's protection of consumer privacy as applied to blockchain transactions. Lark urges that the third-party doctrine, under which consumers have no reasonable expectation of privacy in information voluntarily shared with third parties, should not be applied to blockchain transactions. The Journal's third and final Note of Volume 74 is authored by Jadyn Marks. Marks addresses the ever-relevant topic of social media regulation, focusing her analysis on Facebook, Twitter, and Parler's internal policies for regulating political advertising, misinformation, and disinformation. Marks argues in favor of federal legislation that would permit the Federal Trade Commission to regulate these entities, citing section 230 and public policy as justifications for her proposal.
Finally, this Issue features our Annual Review of notable court decisions pertinent to our field. This year's Review contains six case briefs summarizing the relevant issues and analysis presented in each case. I sincerely thank each Journal member who authored these case briefs for their contributions to communications law scholarship.
On behalf of the outgoing members of Volume 74, I would like to thank The George Washington University Law School, our faculty advisors, and the Federal Communications Bar Association for their support over the past year. Our publication has sincerely benefitted from your guidance and assistance. On my own behalf, I would like to thank the Volume 74 Editorial Board, Associates, Members, and authors who made this Volume possible. We have been honored to provide quality scholarship to the communications field and beyond, and are confident the Volume 75 Editorial Board will continue the Journal’s excellence.
The Journal is committed to providing its readership with scholarly analysis and thought leadership on topics relevant to communications and information technology law and related policy issues. The Journal thus welcomes any submissions for publication, which may be directed
to email@example.com for consideration. Any further questions or comments may be directed to firstname.lastname@example.org. This Annual Review Issue and our archives are available at http://www.fclj.org.
By Scott Jordan
It is time for the United States Congress to pass a comprehensive consumer privacy law. The European General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) serve as starting points for several recent bills. However, neither the GDPR nor the CCPA mandates that users be given choices based on whether their personal information is reasonably identifiable or on whether it is used for tracking. As a result, the GDPR fails to effectively incentivize the use of pseudonymization, and the CCPA fails to effectively disincentivize tracking. This article develops classifications of personal information based on the degree of identifiability of that information and creates a choice framework that, unlike the GDPR or the CCPA, utilizes all three options: mandating use through terms and conditions, requiring an opt-out choice, and requiring opt-in consent. This article also develops corresponding notice requirements that enable consumers to make informed choices over the collection, use, and sharing of their personal information. These proposals can be used to create policy options in between those offered by the GDPR and the CCPA.
By Jaylla Brown
Law enforcement has for years used facial recognition technology to aid the criminal investigative process without regulation. Facial recognition technology is least accurate when attempting to identify women and people with darker skin tones. These uniquely vulnerable classes of defendants are entitled, under the Brady rule, to access evidence of poor algorithm quality and police misuse of facial recognition technology. Lynch v. Florida is the only case that examines police use of facial recognition through the Brady doctrine, but the Florida District Court of Appeal dismissed the rule's application in this context. While scholars have presented the Brady rule as a solution to the inaccessibility of facial recognition evidence, my Note focuses on the heightened need for access to this evidence among defendants susceptible to misidentification.
This paper explains how facial recognition technology disproportionately misidentifies people based on race and gender and why any evidence of this phenomenon satisfies the elements of the Brady rule. Until the racial disparities of facial recognition technology are remedied, or restrictions are placed on how police use the technology, the Brady rule could provide an opportunity for a fair trial when the only defense is misidentification by a technology designed and used to disproportionately identify Black and brown people.
Building Blocks of Privacy: Why the Third-Party Doctrine Should Not Be Applied to Blockchain Transactions
By Veronica Lark
This paper draws a distinction between blockchains and cryptocurrency exchanges and shows how this distinction should alter the third-party doctrine analysis under the Fourth Amendment. By nature, a blockchain is not a third-party entity; it is distinct from third-party cryptocurrency exchanges. Courts, however, have applied the third-party doctrine in cases implicating cryptocurrency exchanges even when the malicious behavior occurred off the exchange, on the decentralized blockchain, treating the distinction as a moot point. Carpenter v. United States proposes a framework for emerging technology that could readily be applied to the distinction between blockchains and exchanges. Other scholars in the privacy space have identified the problem of equating secrecy with privacy—a problem upended by blockchain's transparency and lack of third-party ownership—but there is also a need to distinguish blockchains from exchanges, which use Know-Your-Customer protocols. With this in mind, requiring law enforcement to obtain a search warrant before pursuing Coinbase customers appears to be the best way to resolve the issue. Consumers should have a reasonable expectation of privacy in their blockchain transactions because personally identifiable information is not shared with the blockchain; it is shared only with a cryptocurrency exchange. Consumers should not be subject to a general warrant simply for having a Coinbase account.
By Jadyn Marks
The tumultuous 2020 election brought to light several prevalent social and political issues, including the spread of misinformation and disinformation on social media. At present, social media sites are virtually unregulated in this area due to protections from section 230 of the Communications Decency Act of 1996. Because Big Tech faces minimal competition, social media companies can make virtually any rules they like and remain competitive. Furthermore, social media companies whose business models thrive on engagement and hits are disincentivized to remove or flag disinformation when it increases engagement and thus profits. Inaccurate information about procedural aspects of elections, including the locations of polling places, registration and voter eligibility, and the status of ongoing elections, leads to voter disenfranchisement and has concerning implications for American democracy. To combat the promulgation of procedural election disinformation on social media websites, Congress should pass legislation enabling the Federal Trade Commission to promulgate regulations governing paid-for advertising that contains procedural election information. The FTC should then conduct hearings to help identify measures that social media sites must take, as well as best practices that sites are advised, but not required, to adopt.
FCC v. Prometheus Radio Project
141 S. Ct. 1150 (2021)
NetChoice, LLC v. Paxton
2021 WL 5755120 (W.D. Tex. 2021)
AT&T Services, Inc. v. FCC
21 F.4th 841 (D.C. Cir. 2021)
Colon v. Twitter, Inc.
14 F.4th 1213 (11th Cir. 2021)
Mahanoy Area School District v. B.L.
141 S. Ct. 2038 (2021)
ACA Connects v. Bonta
24 F.4th 1233 (9th Cir. 2022)