Cooper Davis Act Would Force Tech Cos. to Flag Users for Drugs
Internet drug sales have skyrocketed in recent years, allowing powerful narcotics to be peddled to American teenagers and adolescents. It’s a trend that’s led to an epidemic of overdoses and left countless young people dead. Now, a bill scheduled for a congressional vote seeks to tackle the problem, but it comes with a major catch. Critics worry that the legislative effort to crack down on the drug trade could convert large parts of the internet into a federal spying apparatus.
The Cooper Davis Act was introduced by Kansas Republican Sen. Roger Marshall and New Hampshire Democrat Sen. Jeanne Shaheen in March and has been under consideration by the Senate Judiciary Committee for weeks. Named after a 16-year-old Kansas boy who died of a fentanyl overdose two years ago, the bipartisan bill, which the committee is scheduled to vote on Thursday, has spurred intense debate. Proponents say it could help address a spiraling public health crisis; critics, meanwhile, see it as a gateway to broad and indiscriminate internet surveillance.
Gizmodo spoke with the American Civil Liberties Union and the Electronic Frontier Foundation—two organizations involved in the policy discussions surrounding the bill. Both groups expressed concern over the impact the proposed law could have on internet privacy. “There are some very real problems with this bill—both in how it’s written and how it’s conceptualized,” said India McKinney, an analyst with the EFF.
Critics argue that, at its worst, the bill would effectively “deputize” internet platforms as informants for the DEA, creating an unwieldy surveillance apparatus that may have unintended consequences down the line.
The Problem: The Amazon-ification of Drug Dealing
The Cooper Davis Act seeks to solve a very real problem: the ease with which drugs can now be purchased online. Back in the day, buying drugs was a slog. First, you had to know a guy—typically not a super pleasant or well-groomed one. Then, you had to meet up at said guy’s apartment or a street corner, where your plug would dole out the goods. It was an entire ordeal, filled with paranoia and inconvenience. But these days, buying drugs is a lot simpler. In fact, to hear federal officials tell it, buying narcotics is currently about as easy as DoorDashing a burrito. That’s because drug sales on social media platforms have exploded, creating a streamlined drug-buying experience that puts an entire black market at young people’s fingertips.
The negative impacts of this trend are obvious: reporting shows that powerful opioids are being pushed into the hands of young people through platforms like Facebook, Instagram, and Snapchat. Young people will seek out prescription medications—stuff like Xanax, Oxycontin, and Vicodin—only to be sold counterfeit pills secretly laced with fentanyl or meth, which counterfeiters favor because those narcotics are cheap and intensely addictive. Teenagers looking to score are instead handed pills powerful enough to kill them, and far too often that’s exactly what happens.
What the Cooper Davis Act would do
In an attempt to solve this dizzying drug crisis, the Cooper Davis Act proposes a radical strategy: according to the most recent version of the bill text, which was shared with Gizmodo by the ACLU, the law would require “electronic communication service providers and remote computing services” to report to the U.S. Attorney General any evidence they discover of “the unlawful sale and distribution of counterfeit substances and certain controlled substances.” What this means is that large tech companies—everything from social media giants like Instagram, Facebook, and Snapchat to cloud computing and email providers—would be legally required to report certain types of drug activity (basically anything having to do with fentanyl, meth, and counterfeit prescription medications) to the federal government if they became aware of drugs being bought or sold on their platforms.
That might sound like a good idea in theory, but the big question is: how, exactly, are platforms supposed to figure out who is a drug dealer and who isn’t? The legislation doesn’t make that part clear. What is clear is that, under the new law, platforms would be required to surrender large quantities of user data to the government if they suspected a particular user of wrongdoing. That data would be packaged into a report sent to the DEA, and it would include…
…the [user’s] electronic mail address, Internet Protocol address, uniform resource locator, payment information (excluding personally identifiable information), screen names or monikers for the account used or any other accounts associated with the individual, or any other identifying information, including self-reported identifying information…
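To make those categories concrete, here is a purely hypothetical sketch, in Python, of what a report built from that list might look like. The bill does not define a schema or format; every field name and value below is invented for illustration.

```python
# Hypothetical only: a report payload mirroring the data categories listed
# in the bill text. The bill prescribes no actual schema; the field names
# and values here are placeholders invented for illustration.
suspected_activity_report = {
    "email_address": "user@example.com",            # electronic mail address
    "ip_address": "203.0.113.42",                    # Internet Protocol address
    "url": "https://platform.example/post/12345",    # uniform resource locator
    "payment_information": "card ending in 4242",    # excluding personally identifiable info
    "screen_names": ["@example_handle"],             # monikers for associated accounts
    "other_identifying_info": "self-reported hometown: Topeka, KS",
}
```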
Platforms would also have the discretion to share even more data with the government if they felt like it—including private communications like DMs and emails. Meanwhile, companies that failed to report evidence of drug offenses could face steep fines. A first failure to report drug activity could result in fines of up to $190,000 per violation, while each additional offense after that could see fines of up to $380,000 per violation.
Why the Cooper Davis Act seems like a bad idea
Critics see a number of dangers inherent in the Cooper Davis Act, but the biggest is that it could effectively subvert Americans’ already limited Fourth Amendment protections when it comes to the internet. “Right now, federal law protects user data and limits the ways that platforms and other entities can share it with law enforcement,” Cody Venzke, senior policy counsel with the ACLU, tells me. But Cooper Davis “would explicitly create an exception to those protections,” he said.
In theory, the Fourth Amendment is supposed to prohibit warrantless search and seizure of private property, meaning cops can’t bust down your door and dig through your stuff without a court order. This principle works pretty well in the real world but gets decidedly murky when it comes to the web. Because so much of Americans’ “personal” data is now stored by proprietary online platforms, it’s hard to say that this data is actually owned by the user. Instead, it’s really owned by the company, which means that if the company wants to share “your” data with the government, it’s usually well within its rights to do so.
Still, companies aren’t necessarily looking to do that on a regular basis, and web users’ privacy is partially protected from government searches of corporate data by the Stored Communications Act, a 1986 law that stipulates police must secure a warrant or a subpoena before they can rifle through someone’s digital accounts. But the SCA already suffers from a number of loopholes, and critics point out that the Cooper Davis Act would carve out yet another exception when it comes to drug-related activity. The SCA is specifically supposed to protect web users’ private communications, forcing cops to obtain a warrant before searching them. However, Venzke says that, under the most recent version of the Cooper Davis bill, internet service providers are given the power to “hand over messages, emails, private posts,” and other personal communications to law enforcement “with no notice to the user, no judicial oversight, and no warrant.”
This bill would do more than whittle away Americans’ online rights, however. In essence, it would deputize large parts of the internet as an unofficial wing of the federal government—offloading some of the investigative work from police agencies onto the shoulders of major tech firms. Instead of the DEA having to find a narcotics suspect and then secure a court order for that person’s digital records, tech companies would be responsible for finding the suspect for the DEA and would then be obligated to send the government a ton of information about that web user, all without any sort of involvement of the court system.
The Cooper Davis Act might have unintended consequences
The premise of Cooper Davis is disturbing enough, but even more alarming is the bill’s lack of technical detail. The bill plops a hefty responsibility onto web companies (identifying and reporting criminal suspects) but does almost nothing to explain how they should go about doing that.
Companies looking for a roadmap would likely end up turning to another federal statute, known as 2258A. Venzke says the Cooper Davis Act is actually modeled on 2258A and uses similar policy and language. This longstanding law requires web companies to report child sexual abuse material to the federal government if they become aware of it on their platforms. Under this regulation, web platforms are obligated to report suspected child abuse material to the CyberTipline of the National Center for Missing and Exploited Children, a federally funded nonprofit established by Congress to combat child exploitation. NCMEC, in turn, forwards the reports it receives to relevant law enforcement agencies for further investigation.
Over the years, companies like Facebook, Apple, and Google have addressed 2258A’s reporting requirements by developing a sophisticated surveillance system designed to detect abuse material when it’s uploaded to their sites; the system leverages a database of cryptographic hashes, each of which represents a known child abuse image or video. Companies then scan user accounts for matches to these hashes and, when they get a positive hit, they forward the user’s relevant data to NCMEC.
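As a rough illustration of how that kind of hash matching works (a simplified sketch, not any company’s actual pipeline), the core step is computing a digest of an uploaded file and checking it against a set of known digests. In practice, firms rely on perceptual hashing tools such as PhotoDNA so that slightly altered copies still match, but the reporting logic looks roughly like this:

```python
import hashlib

# Simplified sketch: exact-match lookup of an upload's digest against a set
# of known hashes. Real systems use perceptual hashes (e.g., PhotoDNA) to
# catch modified copies; the placeholder hash below is purely illustrative.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_report(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash and must be reported."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

if should_report(b"uploaded file contents"):
    print("Match found: forward the account's details to NCMEC's CyberTipline")
```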
However, when it comes to online drug activity, things are decidedly more complicated. Unlike the problem of CSAM—in which a database of known prohibited material can be compiled and scanned against—it’s far from clear how companies would reliably identify and report suspected drug activity. Online drug transactions are largely carried out under the cover of coded language, using oblique terms and signals. How are companies supposed to sift through all that without driving themselves (and their users) insane?
“If platforms are actively monitoring for fentanyl [sales], they’re going to have to look for a lot more than images and videos,” said Venzke. “They’re going to have to dig through speech, they’re going to have to look at emojis, they’re going to have to try to infer user intent.” Since the bill does little to stipulate how reporting should be conducted, it would be up to the companies to figure all of this out. That could easily push platforms to build their own internal surveillance systems designed to monitor how users interact in an effort to ferret out drug activity. In that scenario, Venzke says, there is a high likelihood that platforms would end up reporting a lot of “false positives” to the government (i.e., people suspected of drug activity who, in reality, have done nothing wrong).
“Content moderation of this sort, at scale, is really, really, really hard,” McKinney agreed. “As good as AI is, context matters. A word should not be enough to trigger extra surveillance.”
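To see why context matters, consider a naive keyword-and-emoji filter of the sort a platform might be tempted to bolt on (a hypothetical sketch; neither the bill nor any company prescribes this). With no understanding of context, an innocuous message trips it just as easily as a genuine solicitation:

```python
# Hypothetical sketch of naive keyword/emoji flagging. The slang terms are
# illustrative; without context, false positives are unavoidable.
SUSPECT_TERMS = {"percs", "bars", "plug", "snow"}
SUSPECT_EMOJI = {"💊", "❄️"}

def looks_suspicious(message: str) -> bool:
    words = set(message.lower().split())
    return bool(words & SUSPECT_TERMS) or any(e in message for e in SUSPECT_EMOJI)

print(looks_suspicious("got percs 💊 hmu"))              # True: genuine solicitation
print(looks_suspicious("the plug on my charger broke"))  # True: a false positive
```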
Overall, critics feel the law could be a disaster for internet privacy.
“The point of the Constitution, the point of the Fourth Amendment…is that the government is supposed to be constrained as to what they’re allowed to access about our private thoughts,” said McKinney. “Obviously the government doesn’t like being constrained. They want to be able to see everything.”
Venzke, meanwhile, said he and his colleagues were “holding their breath” until the vote goes through. “The Senate Judiciary has been proactive in addressing folks’ safety online, but unfortunately they’ve done it by undermining free speech and privacy online, which is not the right approach…We’re hoping folks will stand up for our privacy rights and that the bill will be pulled from consideration.”
Gizmodo reached out to the offices of Senator Marshall for comment but did not hear back. We will update this story if we do.