EFF: COVID-19 and Digital Rights

The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Here are its thoughts on the threats and opportunities arising from the COVID-19 response, from the article COVID-19 and Digital Rights.

Surveillance. Governments around the world are demanding extraordinary new surveillance powers that many hope will contain the virus’ spread. But many of these powers would invade our privacy, inhibit our free speech, and disparately burden vulnerable groups of people. Mindful of the stakes, we ask three questions when analyzing proposals that would provide greater surveillance powers to the government: Would the proposal work? Would it excessively intrude on our freedoms? Are there sufficient safeguards? Different proposals raise different issues. For example:

  • Government has not shown that some intrusive technologies would work, such as phone location surveillance, which is insufficiently granular to identify when two people were close enough together to transmit the virus.
  • Some surveillance proposals are too dangerous to a democratic society, such as dragnet surveillance cameras in public places that use face recognition or thermal imaging, mounting such technologies on drones, or giving police officers access to public health data about where people who have tested positive live.
  • Some technologies, such as aggregate location data used to inform public health decisions, need strict safeguards.
  • No COVID tracking app will work absent widespread testing and interview-based contact tracing. Bluetooth proximity is the most promising approach so far, but needs rigorous security testing and data minimization. No one should be forced to use it.
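The data-minimization idea behind Bluetooth proximity apps can be made concrete with a toy sketch. This is loosely modeled on rotating-identifier designs (such as the Apple/Google Exposure Notification proposal), not any real app's protocol; every name and parameter here is hypothetical and simplified for illustration:

```python
import secrets
import hashlib

def daily_key() -> bytes:
    """Each phone generates a fresh random key per day; it encodes no identity or location."""
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.
    Identifiers rotate every ~15 minutes, so a passive observer cannot
    link broadcasts back to one person over time."""
    return hashlib.sha256(day_key + interval.to_bytes(2, "big")).digest()[:16]

# Phone A broadcasts rotating IDs; nearby Phone B merely stores the IDs it hears.
a_key = daily_key()
heard_by_b = {rolling_id(a_key, i) for i in range(96)}  # one day of 15-minute slots

# If A later tests positive, A uploads only its daily key -- never its contact list.
# B re-derives A's identifiers locally and checks for overlap on-device.
derived = {rolling_id(a_key, i) for i in range(96)}
exposed = bool(derived & heard_by_b)
print(exposed)  # True: B learns it was near an infected phone, and nothing more
```

The point of the design is what the server never sees: no location, no social graph, no identities — only random keys volunteered by people who test positive, with all matching done on the user's own device.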

Many new government surveillance programs are being built in partnership with corporations that hold vast stores of consumers’ personal data. We need new laws to protect our data privacy.

Free speech. The free flow of ideas about COVID-19 is vital. This includes anonymous whistle-blowing about containment efforts, online criticisms of government responses to the crisis, and prisoner access to social media to tell the world about outbreaks behind bars. Governments will inevitably abuse any new powers to censor what they deem false information about the virus. When online platforms increase their reliance on automated content moderation, in part because human moderators cannot safely come to work, those moderation “decisions” must be temporary, transparent, and easily appealable.

Government transparency. Government decision-making about the virus must be transparent. When governments temporarily close the physical spaces where they make decisions, for purposes of social distancing, they must adopt new transparency accommodations, such as broadcasting their proceedings. While government responses to public records requests may be slower during this public health crisis, the outbreak is no excuse to shut them down altogether…(continues)

EFF: EARN IT Bill to Scan Every Online Message

From digital civil liberties champion the Electronic Frontier Foundation comes the article The EARN IT Bill Is the Government’s Plan to Scan Every Message Online.

Imagine an Internet where the law required every message sent to be read by government-approved scanning software. Companies that handle such messages wouldn’t be allowed to securely encrypt them, or they’d lose legal protections that allow them to operate.

That’s what the Senate Judiciary Committee has proposed and hopes to pass into law. The so-called EARN IT bill, sponsored by Senators Lindsey Graham (R-SC) and Richard Blumenthal (D-CT), will strip Section 230 protections away from any website that doesn’t follow a list of “best practices,” meaning those sites can be sued into bankruptcy. The “best practices” list will be created by a government commission, headed by Attorney General Barr, who has made it very clear he would like to ban encryption, and guarantee law enforcement “legal access” to any digital message.

The EARN IT bill had its first hearing today, and its supporters’ strategy is clear. Because they didn’t put the word “encryption” in the bill, they’re going to insist it doesn’t affect encryption.

“This bill says nothing about encryption,” co-sponsor Sen. Blumenthal said at today’s hearing. “Have you found a word in this bill about encryption?” he asked one witness.

It’s true that the bill’s authors avoided using that word. But they did propose legislation that enables an all-out assault on encryption. It would create a 19-person commission that’s completely controlled by the Attorney General and law enforcement agencies. And, at the hearing, a Vice-President at the National Center for Missing and Exploited Children (NCMEC) made it clear [PDF] what he wants the best practices to be. NCMEC believes online services should be made to screen their messages for material that NCMEC considers abusive; use screening technology approved by NCMEC and law enforcement; report what they find in the messages to NCMEC; and be held legally responsible for the content of messages sent by others.

You can’t have an Internet where messages are screened en masse and also have end-to-end encryption, any more than you can create backdoors that can only be used by the good guys. The two are mutually exclusive. Concepts like “client-side scanning” aren’t a clever route around this; such scanning is just another way to break end-to-end encryption. Either the message remains private to everyone but its recipients, or it’s available to others…
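Why client-side scanning and end-to-end encryption are mutually exclusive can be shown with a toy sketch. This uses a one-time pad purely for illustration (real messengers use far more elaborate protocols, e.g. the Signal protocol), and the hash-based "watchlist" is a deliberate simplification, not how any real scanning system works:

```python
import secrets
import hashlib

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy one-time-pad encryption: only holders of `key` can read the message."""
    assert len(key) == len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

msg = b"meet at the clinic at 5pm"
shared_key = secrets.token_bytes(len(msg))  # known only to sender and recipient

ciphertext = encrypt(shared_key, msg)
# End-to-end property: the relaying server sees only ciphertext and learns nothing.
assert decrypt(shared_key, ciphertext) == msg

# But "client-side scanning" runs *before* encryption, on the plaintext:
watchlist = {hashlib.sha256(b"clinic").hexdigest()}
hits = [w for w in msg.split() if hashlib.sha256(w).hexdigest() in watchlist]
if hits:
    report_to_authority = hits  # content leaks to a third party despite encryption
```

The encryption itself is never "broken" in this sketch; the scanner simply reads the message at the one point where it must exist in the clear, which is exactly why EFF describes client-side scanning as another way to defeat end-to-end encryption.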

Click here to read the entire article at EFF.org.

EFF: Dangers to Privacy in EARN IT Act

The EARN IT Act introduced by Senator Lindsey Graham purports to be for the prevention of online child exploitation “and other purposes.” It’s those other purposes that we need to watch. The EFF, an organization fighting for your digital civil liberties, writes the article Congress Must Stop the Graham-Blumenthal Anti-Security Bill, expounding upon the many dangers lurking inside this bill.

There’s a new and serious threat to both free speech and security online. Under a draft bill that Bloomberg recently leaked, the Attorney General could unilaterally dictate how online platforms and services must operate. If those companies don’t follow the Attorney General’s rules, they could be on the hook for millions of dollars in civil damages and even state criminal penalties.

The bill, known as the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, grants sweeping powers to the Executive Branch. It opens the door for the government to require new measures to screen users’ speech and even backdoors to read your private communications—a stated goal of one of the bill’s authors.

Senators Lindsey Graham (R-SC) and Richard Blumenthal (D-CT) have been quietly circulating a draft version of EARN IT. Congress must forcefully reject this dangerous bill before it is introduced.

EARN IT Is an Attack on Speech

EARN IT undermines Section 230, the most important law protecting free speech online. Section 230 enforces the common-sense principle that if you say something illegal online, you should be the one held responsible, not the website or platform where you said it (with some important exceptions)…

EARN IT is a direct threat to constitutional protections for free speech and expression. To pass constitutional muster, a law that regulates the content of speech must be as narrowly tailored as possible so as not to chill legitimate, lawful speech. Rather than being narrowly tailored, EARN IT is absurdly broad: under EARN IT, the Commission would effectively have the power to change and broaden the law however it saw fit, as long as it could claim that its recommendations somehow aided in the prevention of child exploitation. Those laws could change and expand unpredictably, especially after changes in the presidential administration…

Throughout his term as Attorney General, William Barr has frequently and vocally demanded “lawful access” to encrypted communications, ignoring the bedrock technical consensus that it is impossible to build a backdoor that is only available to law enforcement. Barr is far from the first administration official to make impossible demands of encryption providers: he joins a long history of government officials from both parties demanding that encryption providers compromise their users’ security.

We know how Barr is going to use his power on the “best practices” panel: to break encryption. He’s said, over and over, that he thinks the “best practice” is to always give law enforcement extraordinary access. So it’s easy to predict that Barr would use EARN IT to demand that providers of end-to-end encrypted communication give law enforcement officers a way to access users’ encrypted messages. This could take the form of straight-up mandated backdoors, or subtler but no less dangerous “solutions” such as client-side scanning. These demands would put encryption providers like WhatsApp and Signal in an awful conundrum: either face the possibility of losing everything in a single lawsuit or knowingly undermine their own users’ security, making all of us more vulnerable to criminals…

Weakening Section 230 makes it much more difficult for a startup to compete with the likes of Facebook or Google. Giving platforms a legal requirement to screen or filter users’ posts makes it extremely difficult for a platform without the resources of the big five tech companies to grow its user base (and of course, if a startup can’t grow its user base, it can’t get the investment necessary to compete)…

Click here to read the entire article at EFF


EFF Assists in Right to Repair Law

Cory Doctorow of the Electronic Frontier Foundation, a non-profit group which works to protect civil liberties in the digital world, has written about how the EFF is assisting legislation in the state of Massachusetts to help protect vehicle owners’ right to repair their vehicles on their own or at independent service providers. Farmers in our own area are well acquainted with the efforts of tractor manufacturers to limit their right to repair. Back in 2012, Massachusetts became the first state to pass right to repair legislation, which ended up improving access to repair information for most of the country. Manufacturers have since redesigned their products to try to avoid those protections.

Bay Staters Continue to Lead in Right to Repair, and EFF Is There to Help

…EFF was pleased to submit comments to the Massachusetts Legislature’s Joint Committee on Consumer Protection and Professional Licensure for a hearing on January 13 in support of HB4122.

In those comments, sent to each member of the Committee, EFF Special Consultant Cory Doctorow wrote:

Auto manufacturers have argued that independent service endangers drivers’ cybersecurity. In reality, the opposite is true: security is weakened by secrecy and strengthened by independent testing and scrutiny. It is an iron law of information security that “there is no security in obscurity”—that is, security cannot depend on keeping defects a secret in the hopes that “bad guys” won’t discover and exploit those defects. And since anyone can design a security system that they themselves can’t imagine any way of breaking, allowing manufacturers to shroud their security measures in secrecy doesn’t mean that their cars can’t be hacked—in fact, history has shown that vehicle computers depending on secrecy for security are, in fact, frequently vulnerable to hacking.

In 2018 and 2019, cities, hospitals, and other large institutions had their informatics systems seized by petty criminals using off-the-shelf ransomware that had combined with a defect in Windows that the NSA had discovered and kept secret—until an NSA leaker released it to the world. As these cities discovered, the NSA’s decision to keep these defects secret did not put them out of reach of bad guys—it just meant that institutional Microsoft customers were put at grave risk, and that Microsoft itself did not know about the devastating bugs in its own products and so could not fix them.

Information security is absolutely reliant upon independent security researchers probing systems and disclosing what they discover. Allowing car manufacturers to monopolize service—and thus scrutiny—over their products ensures that the defects in these fast-moving, heavy machines will primarily become generally known after they are exploited to the potentially lethal detriment of drivers and the pedestrians around them.

The manufacturers’ desire to monopolize bad news about design defects in their own products is especially dire because it rides on the tails of a strategy of monopolizing service and parts for those products. The uncompetitive, concentrated automotive sector has already brought itself to the brink of ruin—averted only by the infusion of $80.7B in tax-funded bailouts. More than a decade later, it remains in dire need of competitive discipline, as is evidenced by a commercial strategy dominated by reducing public choice, surveilling their own customers and selling their data, and extracting monopoly rents from luckless drivers who are locked into their proprietary ecosystems.

EFF: Ending Government Use of Face Surveillance

The Electronic Frontier Foundation (EFF) has launched a new campaign called About Face to help communities call for an end to government use of face surveillance. With the recent announcement that facial recognition is coming to Sea-Tac airport, you can see that face surveillance is becoming more and more prevalent in America.

…Many forms of biometric data collection raise a wealth of privacy, security, and ethical concerns. Face surveillance ups the ante. We expose our faces to public view every time we go outside. Paired with the growing ubiquity of surveillance cameras in our public spaces, face surveillance technology allows for the covert and automated collection of information related to when and where we worship or receive medical care, and who we associate with professionally or socially.

Many proponents of the technology argue that there is no reasonable expectation of privacy when we spend time in public, and that if we have nothing to hide, we have nothing to fear. EFF is not alone in finding this argument meritless. In his recent majority opinion in the watershed Carpenter v. United States (2018), Supreme Court Chief Justice John Roberts wrote: “A person does not surrender all Fourth Amendment protection by venturing into the public sphere.” In a recent Wired interview, Attorney Gretchen Greene explains: “Even if I trust the government, I do care. I would rather live in a world where I feel like I have some privacy, even in public spaces.” Greene goes on to identify the inherent First-Amendment concerns implicated by government use of face surveillance: “If people know where you are, you might not go there. You might not do those things.”

Like many of us, Greene is particularly concerned about how the technology will impact members of already marginalized communities. “Coming out as gay is less problematic professionally than it was, in the US, but still potentially problematic. So, if an individual wants to make the choice [of] when to publicly disclose that, then they don’t want facial recognition technology identifying that they are walking down the street to the LGBTQ center.” These concerns are not limited to any one community, and the impacts will be felt regardless of intent. “We’re not trying to stop people from going to church, we’re not trying to stop them from going to community centers, but we will if they are afraid of [the consequence] in an environment that is hostile to, for instance, a certain ethnicity or a certain religion…”

Click here to read the entire article at EFF.org.

EFF: US-UK Agreement to Allow Warrantless Access to US Internet Servers

This article is from the Electronic Frontier Foundation, which fights for your digital freedoms, about an agreement between the US and the UK which would allow the UK police access to data held by American companies without following US privacy laws or the 4th Amendment.

Congress, Remember the 4th Amendment? It’s Time to Stop the U.S.-UK Agreement.

Unless Congress stops it, foreign police will soon be able to collect and search data on the servers of U.S. Internet companies. They’ll be able to do it without a probable cause warrant, or any oversight from a U.S. judge. This is all happening because of a new law enforcement deal between the U.S. and the United Kingdom. And while it seeks to exclude purely domestic correspondence between U.S. citizens and residents, plenty of Americans’ data will get swept up when they communicate with targeted individuals located abroad.

This is all happening because, for the first time, the U.S. executive branch is flexing its power to enter into law enforcement agreements under the CLOUD Act. We’ve been strongly opposed to this law since it was introduced last year. The recently signed deal between the U.S. Department of Justice and the U.K. Home Office will allow U.K. police easy access to data held by American companies, regardless of where the data is stored. These U.K. data requests, including demands to collect real-time communications, do not need to meet the standards set by U.S. privacy laws or the 4th Amendment. Similarly, the deal will allow U.S. police to grab information held by British companies without following U.K. privacy laws.

This deal, negotiated by American and British law enforcement behind closed doors and without public input, will deal a hammer blow to the legal rights of citizens and residents of both countries. And the damage won’t stop there. The U.S.-U.K. Cloud Act Agreement may well become a model for further bilateral deals between the United States and other foreign governments. Earlier this month, Australian law enforcement agencies began negotiating their own deal to directly access private information held by U.S. Internet companies.

There’s still one possible path to put the brakes on this disastrous U.S.-UK deal: Congress can introduce a joint resolution of disapproval of the agreement within 180 days. This week, EFF has joined 19 other privacy, civil liberties, and human rights organizations to publish a joint letter explaining why Congress must take action to resist this deal.

No Prior Judicial Authorization

In the U.S., the standard for when law enforcement can collect stored communications content is clear: police need to get a warrant, based on probable cause. If police want to wiretap an active conversation, they have to satisfy an even higher standard, sometimes called a “super warrant,” that limits both the timing and use of a wiretap. Perhaps most importantly, stored communications warrants and wiretap warrants have to be signed by a U.S. judge, which adds an extra layer of review to whether privacy standards are met. At EFF, a core part of our work is insisting on the importance of a warrant in many different scenarios.

Judicial authorization is a critical step in the U.S. warrant process. When police search people’s private homes, offices, or devices, they must justify why the search for specific evidence outweighs the presumption that individuals remain free from government intrusion. Judicial authorization acts as a safeguard between citizens and law enforcement. Further, history has shown that police can and will abuse their powers for intimidation, or even personal gain. In colonial times, the British military used general warrants to search through colonists’ houses and seize property—actions that helped fuel a revolution, and formed the basis for the 4th Amendment to the U.S. Constitution.

Incredibly, the DOJ has just thrown those rights away. Instead of relying on probable cause, the new agreement uses an untested privacy standard that says that orders must be based on a “reasonable justification based on articulable and credible facts, particularity, legality, and severity.” No judge in any country has decided what this means.

EFF: Big Tech’s Disingenuous Push for a Federal Privacy Law

Following the theme of the earlier article on The Meat Packing Myth is this article from the Electronic Frontier Foundation – an organization leading the fight for digital privacy and free speech — about a push by big tech companies for federal regulation of digital privacy and why this push is in the self-interest of these corporations rather than in support of your actual privacy.

Big Tech’s Disingenuous Push for a Federal Privacy Law

This week, the Internet Association launched a campaign asking the federal government to pass a new privacy law.

The Internet Association (IA) is a trade group funded by some of the largest tech companies in the world, including Google, Microsoft, Facebook, Amazon, and Uber. Many of its members keep their lights on by tracking users and monetizing their personal data. So why do they want a federal consumer privacy law?

Surprise! It’s not to protect your privacy. Rather, this campaign is a disingenuous ploy to undermine real progress on privacy being made around the country at the state level. IA member companies want to establish a national “privacy law” that undoes stronger state laws and lets them continue business as usual. Lawyers call this “preemption.” IA calls this “a unified, national standard” to avoid “a patchwork of state laws.” We call this a big step backwards for all of our privacy.

The question we should be asking is, “What are they afraid of?”

Stronger state laws

After years of privacy scandals, Americans across the political spectrum want better consumer privacy protections. So far, Congress has failed to act, but states have taken matters into their own hands. The Illinois Biometric Information Privacy Act (BIPA), passed in 2008, makes it illegal to collect biometric data from Illinois citizens without their express, informed, opt-in consent. Vermont requires data brokers to register with the state and report on their activities. And the California Consumer Privacy Act (CCPA), passed in 2018, gives users the right to access their personal data and opt out of its sale. In state legislatures across the country, consumer privacy bills are gaining momentum.

This terrifies big tech companies. Last quarter alone, the IA spent nearly $176,000 lobbying the California legislature, largely to weaken CCPA before it takes effect in January 2020. Thanks to the efforts of a coalition of privacy advocates, including EFF, it failed. The IA and its allies are losing the fight against state privacy laws. So, after years of fighting any kind of privacy legislation, they’re now looking to the federal government to save them from the states. The IA has joined Technet, a group of tech CEOs, and Business Roundtable, another industry lobbying organization, in calls for a weak national “privacy” law that will preempt stronger state laws. In other words, they want to roll back all the progress states like California have made, and prevent other states from protecting consumers in the future. We must not allow them to succeed.

A private right of action

Laws with a private right of action allow ordinary people to sue companies when they break the law. This is essential to make sure the law is properly enforced. Without a private right of action, it’s up to regulators like the Federal Trade Commission or the U.S. Department of Justice to go after misbehaving companies. Even in the best of times, regulatory bodies often don’t have the resources needed to police a multi-trillion dollar industry. And regulators can fall prey to regulatory capture. If all the power of enforcement is left in the hands of a single group, an industry can lobby the government to fill that group with its own people. Federal Communications Commission chair Ajit Pai is a former Verizon lawyer, and he’s overseen massive deregulation of the telecom industry his office is supposed to keep in check.

The strongest state privacy laws include private rights of action. Illinois BIPA allows users whose biometric data is illegally collected or handled to sue the companies responsible. And CCPA lets users sue when a company’s negligence results in a breach of personal information. The IA wants to erase these laws and reduce the penalties its member companies can face for their misconduct in legal proceedings brought by ordinary consumers…