Electronic Frontier Foundation

EFF Argues 'If Not Overturned, a Bad Copyright Decision Will Lead Many Americans to Lose Internet Access' (eff.org) 89

The EFF's senior staff attorney and their legal intern are warning that a bad copyright decision by a district court judge could lead many Americans to lose their internet access.

"In going after ISPs for the actions of just a few of their users, Sony Music, other major record labels, and music publishing companies have found a way to cut people off of the internet based on mere accusations of copyright infringement." When these music companies sued Cox Communications, an ISP, the court got the law wrong. It effectively decided that the only way for an ISP to avoid being liable for infringement by its users is to terminate a household or business's account after a small number of accusations — perhaps only two. The court also allowed a damages formula that can lead to nearly unlimited damages, with no relationship to any actual harm suffered.

If not overturned, this decision will lead to an untold number of people losing vital internet access as ISPs start to cut off more and more customers to avoid massive damages...

The district court agreed with Sony that Cox is responsible when its subscribers — home and business internet users — infringe the copyright in music recordings by sharing them on peer-to-peer networks. It effectively found that Cox didn't terminate accounts of supposedly infringing subscribers aggressively enough. An earlier lawsuit found that Cox wasn't protected by the Digital Millennium Copyright Act's (DMCA) safe harbor provisions that protect certain internet intermediaries, including ISPs, if they comply with the DMCA's requirements. One of those requirements is implementing a policy of terminating "subscribers and account holders... who are repeat infringers" in "appropriate circumstances." The court ruled in that earlier case that Cox didn't terminate enough customers who had been accused of infringement by the music companies.

In this case, the same court found that Cox was on the hook for the copyright infringement of its customers and upheld the jury verdict of $1 billion in damages — by far the largest amount ever awarded in a copyright case.

The District Court got the law wrong... An ISP can be contributorily liable if it knew that a customer infringed on someone else's copyright but didn't take "simple measures" available to it to stop further infringement. Judge O'Grady's jury instructions wrongly implied that because Cox didn't terminate infringing users' accounts, it failed to take "simple measures." But the law doesn't require ISPs to terminate accounts to avoid liability. The district court improperly imported a termination requirement from the DMCA's safe harbor provision (which was already knocked out earlier in the case). In fact, the steps Cox took short of termination actually stopped most copyright infringement — a fact the district court simply ignored.

The district court also got it wrong on vicarious liability... [T]he court decided that because Cox could terminate accounts accused of copyright infringement, it had the ability to supervise those accounts. But that's not how other courts have ruled. For example, the Ninth Circuit decided in 2019 that Zillow was not responsible when some of its users uploaded copyrighted photos to real estate listings, even though Zillow could have terminated those users' accounts. In reality, ISPs don't supervise the Internet activity of their users. That would require a level of surveillance and control that users won't tolerate, and that EFF fights against every day.

The consequence of getting the law wrong on secondary liability here, combined with the $1 billion damage award, is that ISPs will terminate accounts more frequently to avoid massive damages, and cut many more people off from the internet than is necessary to actually address copyright infringement...

They also argue that the termination of accounts is "overly harsh in the case of most copyright infringers" — especially in a country where millions have only one choice for broadband internet access. "Being effectively cut off from society when an ISP terminates your account is excessive, given the actual costs of non-commercial copyright infringement to large corporations like Sony Music." It's clear that Judge O'Grady misunderstood the impact of losing Internet access. In a hearing on Cox's earlier infringement case in 2015, he called concerns about losing access "completely hysterical," and compared them to "my son complaining when I took his electronics away when he watched YouTube videos instead of doing homework."
Electronic Frontier Foundation

PayPal Shuts Down Long-Time Tor Supporter With No Recourse (eff.org) 152

An anonymous reader quotes a report from the Electronic Frontier Foundation: Larry Brandt, a long-time supporter of internet freedom, used his nearly 20-year-old PayPal account to put his money where his mouth is. His primary use of the payment system was to fund servers to run Tor nodes, routing internet traffic in order to safeguard privacy and avoid country-level censorship. Now Brandt's PayPal account has been shut down, leaving many questions unanswered and showing how financial censorship can hurt the cause of internet freedom around the world.

Brandt first discovered his PayPal account was restricted in March of 2021. Brandt reported to EFF: "I tried to make a payment to the hosting company for my server lease in Finland. My account wouldn't work. I went to my PayPal info page which displayed a large vertical banner announcing my permanent ban. They didn't attempt to inform me via email or phone -- just the banner." Brandt was unable to get the issue resolved directly through PayPal, and so he then reached out to EFF. [...] We found no evidence of wrongdoing that would warrant shutting down his account, and we communicated our concerns to PayPal. Given that the overwhelming majority of transactions on Brandt's account were payments for servers running Tor nodes, EFF is deeply concerned that Brandt's account was targeted for shut down specifically as a result of his activities supporting Tor.

We reached out to PayPal for clarification, to urge them to reinstate Brandt's account, and to educate them about Tor and its value in promoting freedom and privacy globally. PayPal denied that the shutdown was related to the concerns about Tor, claiming only that "the situation has been determined appropriately" and refusing to offer a specific explanation. After several weeks, PayPal has still refused to reinstate Brandt's account. [...] EFF is calling on PayPal to do better by its customers, and that starts by embracing the Santa Clara principles [which attempt to guide companies in centering human rights in their decisions to ban users or take down content]. Specifically, we are calling on them to: publish a transparency report, provide meaningful notice to users, and adopt a meaningful appeal process.

The Tor Project said in an email: "This is the first time we have heard about financial persecution for defending internet freedom in the Tor community. We're very concerned about PayPal's lack of transparency, and we urge them to reinstate this user's account. Running relays for the Tor network is a daily activity for thousands of volunteers and relay associations around the world. Without them, there is no Tor -- and without Tor, millions of users would not have access to the uncensored internet."

Brandt says he's not backing down and is still committed to supporting the Tor network to pay for servers around the world using alternative means. "Tor is of critical importance for anyone requiring anonymity of location or person," says Brandt. "I'm talking about millions of people in China, Iran, Syria, Belarus, etc. that wish to communicate outside their country but have prohibitions against such activities. We need more incentives to add to the Tor project, not fewer."
The Courts

College Student Sues Proctorio After Source Code Copyright Claim (theverge.com) 35

The Electronic Frontier Foundation (EFF) has filed a lawsuit against the remote testing company Proctorio on behalf of Miami University student Erik Johnson. The Verge reports: The lawsuit is intended to "quash a campaign of harassment designed to undermine important concerns" about the company's remote test-proctoring software, according to the EFF. The lawsuit stems from the company's behavior toward Johnson in September of last year. After Johnson found out that he'd need to use the software for two of his classes, he dug into the source code of Proctorio's Chrome extension and made a lengthy Twitter thread criticizing its practices -- including links to excerpts of the source code, which he'd posted on Pastebin. Proctorio CEO Mike Olsen sent Johnson a direct message on Twitter requesting that he remove the code from Pastebin, according to screenshots viewed by The Verge. After Johnson refused, Proctorio filed a copyright takedown notice, and three of the tweets were removed. (They were reinstated after TechCrunch reported on the controversy.)

In its lawsuit, the EFF is arguing that Johnson made fair use of Proctorio's code and that the company's takedown "interfered with Johnson's First Amendment right." "Copyright holders should be held liable when they falsely accuse their critics of copyright infringement, especially when the goal is plainly to intimidate and undermine them," said EFF Staff Attorney Cara Gagliano in a statement. "I'm doing this to stand up against student surveillance, as well as abuses of copyright law," Johnson told The Verge. "This isn't the first, and won't be the last time a company abuses copyright law to try and make criticism more difficult. If nobody calls out this abuse of power now, it'll just keep happening."

The Courts

Proctorio Sued For Using DMCA To Take Down a Student's Critical Tweets (techcrunch.com) 43

A university student is suing exam proctoring software maker Proctorio to "quash a campaign of harassment" against critics of the company, including an accusation that the company misused copyright laws to remove his tweets that were critical of the software. From a report: The Electronic Frontier Foundation, which filed the lawsuit this week on behalf of Miami University student Erik Johnson, who also does security research on the side, accused Proctorio of having "exploited the DMCA to undermine Johnson's commentary." Twitter hid three of Johnson's tweets after Proctorio filed a copyright takedown notice under the Digital Millennium Copyright Act, or DMCA, alleging that three of Johnson's tweets violated the company's copyright. Schools and universities have increasingly leaned on proctoring software during the pandemic to invigilate student exams, albeit virtually. Further reading: Proctorio Is Using Racist Algorithms To Detect Faces; Cheating-Detection Software Provokes 'School-Surveillance Revolt'; and Students Are Easily Cheating 'State-of-the-Art' Test Proctoring Tech.
Privacy

EFF Partners With DuckDuckGo (eff.org) 42

The Electronic Frontier Foundation (EFF) today announced it has enhanced its groundbreaking HTTPS Everywhere browser extension by incorporating rulesets from DuckDuckGo Smarter Encryption. According to the digital rights group's press release, HTTPS Everywhere is "a collaboration with The Tor Project and a key component of EFF's effort to encrypt the web and make the Internet ecosystem safe for users and website owners." From the press release: "DuckDuckGo Smarter Encryption has a list of millions of HTTPS-encrypted websites, generated by continually crawling the web instead of through crowdsourcing, which will give HTTPS Everywhere users more coverage for secure browsing," said Alexis Hancock, EFF Director of Engineering and manager of HTTPS Everywhere and Certbot web encrypting projects. "We're thrilled to be partnering with DuckDuckGo as we see HTTPS become the default protocol on the net and contemplate HTTPS Everywhere's future."

EFF began building and maintaining a crowd-sourced list of encrypted HTTPS versions of websites for a free browser extension -- HTTPS Everywhere -- which automatically takes users to them. That keeps users' web searching, pages visited, and other private information encrypted and safe from trackers and data thieves that try to intercept and steal personal information in transit from their browser. [...] DuckDuckGo, a privacy-focused search engine, also joined the effort with Smarter Encryption to help users browse securely by detecting unencrypted, non-secure HTTP connections to websites and automatically upgrading them to encrypted connections. With more domain coverage in Smarter Encryption, HTTPS Everywhere users are provided even more protection. HTTPS Everywhere rulesets will continue to be hosted through this year, giving our partners who use them time to adjust. We will stop taking new requests for domains to be added at the end of May.
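The upgrade mechanism described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not HTTPS Everywhere's actual implementation: real rulesets are XML files with regex targets and exclusions, and the Smarter Encryption list contains millions of crawled hosts, which the tiny set below merely stands in for.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical allow-list standing in for the DuckDuckGo Smarter Encryption
# domain list; the real list is generated by continually crawling the web.
HTTPS_CAPABLE = {"example.com", "www.example.com", "eff.org"}

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https:// if the host is known to support TLS."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url  # already secure, or host not known to support HTTPS

print(upgrade_to_https("http://eff.org/pages/tools"))   # https://eff.org/pages/tools
print(upgrade_to_https("http://unknown.example.net/"))  # unchanged
```

The design point is that the rewrite happens in the browser, before the request leaves the machine, so the initial plaintext HTTP request (and the redirect an attacker could tamper with) never occurs.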

Electronic Frontier Foundation

Privacy Advocate Confronts ACLU Over Its Use of Google and Facebook's Targeted Advertising (twitter.com) 20

Ashkan Soltani was the Chief Technologist of America's Federal Trade Commission in 2014 — and earlier was a staff technologist in its Division of Privacy and Identity Protection helping investigate tech companies including Google and Facebook.

Friday on Twitter he accused another group of privacy violations: the nonprofit rights organization, the American Civil Liberties Union. Yesterday, the ACLU updated their privacy statement to finally disclose that they share constituent information with 'service providers' like Facebook for targeted advertising, flying in the face of the org's public advocacy and statements.

In fact, I was retained by the ACLU last summer to perform a privacy audit after concerns were raised internally regarding their data sharing practices. I only agreed to do this work on the promise by ACLU's Executive Director that the findings would be made public. Unfortunately, after reviewing my findings, the ACLU decided against publishing my report and instead sat on it for ~6 months before quietly updating their terms of service and privacy policy without explanation for the context or motivations for doing so. While I'm bound by a nondisclosure agreement to not disclose the information I uncovered or my specific findings, I can say with confidence that the ACLU's updated privacy statements do not reflect the full picture of their practices.

For example, public transparency data from Google shows that the ACLU has paid Google nearly half a million dollars to deliver targeted advertisements since 2018 (when the data first was made public). The ACLU also disclosed only that its advertising relationship with Facebook began in 2021, when in truth the relationship spans back years, totaling over $5 million in ad-spend. These relationships fly against the principles and public statements of the ACLU regarding transparency, control, and disclosure before use, even as the organization claims to be a strong advocate for privacy rights at the federal and state level. In fact, the NY Attorney General conducted an inquiry into whether the ACLU had violated its promises to protect the privacy of donors and members in 2004, the results of which many aren't aware of. And to be clear, the practices described would very much constitute a 'sale' of members' PII under the California Privacy Rights Act (CPRA).

The irony is not lost on me that the ACLU vehemently opposed the CPRA — the toughest state privacy law in the country — when it was proposed. While I have tremendous respect for the work the ACLU and other NGOs do, it's important that nonprofits are bound by the same privacy standards they espouse for everyone else. (Full disclosure: I'm on the EFF advisory board and was recently invited to join EPIC's board.)

My experience with the ACLU further amplifies the need to have strong legal privacy protections that apply to nonprofits as well as businesses — partially since many of the underlying practices, particularly in the area of fundraising and advocacy, are similar if not worse.

Soltani also re-tweeted an interesting response from Alex Fowler, a former EFF VP who was also Mozilla's chief privacy officer for three years: I'm reminded of EFF co-founder John Gilmore telling me about the Coders' Code: If you find a bug or vulnerability, tell the coder. If coder ignores you or refuses to fix the issue, tell the users.
Open Source

Richard Stallman's Return Denounced by the EFF, Tor Project, Mozilla, and the Creator of Rust (itwire.com) 640

Sunday IT Wire counted up the number of signatories on two open letters, one opposing Richard Stallman's return to the FSF and one supporting it.

- The pro-Stallman letter had 3,632 individual signers
- The anti-Stallman letter had 2,812 individual signers (plus 48 companies and organizations).

But the question of Stallman's leadership has now also arisen in the GCC community:

A long-time developer of GCC, the compiler created by the GNU Project and used in Linux distributions, has issued a call for the removal of Free Software Foundation founder Richard Stallman from the GCC steering committee. Nathan Sidwell [also a software engineer at Facebook] said in a post directed to the committee that if it was unwilling to remove Stallman, then the panel should explain why it was not able to do so.

Stallman is also the founder of the GNU Project and the original author of GCC.

"RMS [Stallman] is no longer a developer of GCC, the most recent commit I can find regards SCO in 2003," Sidwell wrote in a long email. "Prior to that there were commits in 1997, but significantly less than 1994 and earlier. GCC's implementation language is now C++, which I believe RMS neither uses nor likes.

"When was RMS' most recent positive input to the GCC project? Even if it was recent and significant, that doesn't mean his toxic behaviour should be accepted."

Meanwhile, the following groups have also issued statements opposing Stallman's return to the FSF:

- Mozilla: We can't demand better of the internet if we don't demand better of our leaders, colleagues and ourselves. We're with the Open Source Diversity Community, Outreachy & the Software Conservancy project in supporting this petition.
- The Tor Project: The Tor Project is joining calls for Richard M. Stallman to be removed from board, staff, volunteer, and other leadership positions in the FOSS community, including the Free Software Foundation and the GNU Project.
- Rust creator Graydon Hoare: He's been saying sexist shit & driving women away for decades. He can't change, the FSF board knows it, is sending a "sexism doesn't matter" message. This is bad leadership and I'm sad about all of it, agree with calls to resign.

If someone is a public leader their public behaviour matters. I don't criticize private individuals here and I don't think twitter-justice is especially nuanced. But this is so far over the line, such a stupid and tone-deaf choice, and it is about community leadership.

The EFF: We at EFF are profoundly disappointed to hear of the re-election of Richard Stallman to a leadership position at the Free Software Foundation, after a series of serious accusations of misconduct led to his resignation as president and board member of the FSF in 2019. We are also disappointed that this was done despite no discernible steps taken by him to be accountable for, much less make amends for, his past actions or those who have been harmed by them. Finally, we are also disturbed by the secretive process of his re-election, and how it was belatedly conveyed to FSF's staff and supporters.

Stallman's re-election sends a wrong and hurtful message to the free software movement, as well as those who have left that movement because of Stallman's previous behavior.

Free software is a vital component of an open and just technological society: its key institutions and individuals cannot place misguided feelings of loyalty above their commitment to that cause. The movement for digital freedom is larger than any one individual contributor, regardless of their role. Indeed, we hope that this moment can be an opportunity to bring in new leaders and new ideas to the free software movement.

We urge the voting members of the FSF to call a special meeting to reconsider this decision, and we also call on Stallman to step down: for the benefit of the organization, the values it represents, and the diversity and long-term viability of the free software movement as a whole.

Finally, the Free Software Foundation itself has now pinned the following tweet at the top of its Twitter feed: No LibrePlanet organizers (staff or volunteer), speakers, award winners, exhibitors, or sponsors were made aware of Richard Stallman's announcement until it was public.
Social Networks

Stricter Rules for Internet Platforms? What are the Alternatives... (acm.org) 83

A law professor serving on the EFF's board of directors (and advisory boards for the Electronic Privacy Information Center and the Center for Democracy and Technology) offers this analysis of "the push for stricter rules for internet platforms," reviewing proposed changes to the liability-limiting Section 230 of the Communications Decency Act — and speculating about what the changes would accomplish: Short of repeal, several initiatives aim to change section 230. Eleven bills have been introduced in the Senate and nine in the House of Representatives to amend section 230 in various ways.... Some would widen the categories of harmful conduct for which section 230 immunity is unavailable. At present, section 230 does not apply to user-posted content that violates federal criminal law, infringes intellectual property rights, or facilitates sex trafficking. One proposal would add to this list violations of federal civil laws.

Some bills would condition section 230 immunity on compliance with certain conditions or make it unavailable if the platforms engage in behavioral advertising. Others would require platforms to spell out their content moderation policies with particularity in their terms of service (TOS) and would limit section 230 immunity to TOS violations. Still others would allow users whose content was taken down in "bad faith" to bring a lawsuit to challenge this and be awarded $5,000 if the challenge was successful. Some bills would impose due process requirements on platforms concerning removal of user-posted content. Other bills seek to regulate platform algorithms in the hope of stopping the spread of extremist content or in the hope of eliminating biases...

Neither legislation nor an FCC rule-making may be necessary to significantly curtail section 230 as a shield from liability. Conservative Justice Thomas has recently suggested a reinterpretation of section 230 that would support imposing liability on Internet platforms as "distributors" of harmful content... Section 230, after all, shields these services from liability as "speakers" and "publishers," but is silent about possible "distributor" liability. Endorsing this interpretation would be akin to adopting the notice-and-takedown rules that apply when platforms host user-uploaded files that infringe copyrights.

Thanks to Slashdot reader Beeftopia for sharing the article, which ultimately concludes:

- Notice-and-takedown regimes have long been problematic because false or mistaken notices are common and platforms often quickly take down challenged content, even if it is lawful, to avoid liability...

- For the most part, these platforms promote free speech interests of their users in a responsible way. Startup and small nonprofit platforms would be adversely affected by some of the proposed changes insofar as the changes would enable more lawsuits against platforms for third-party content. Fighting lawsuits is costly, even if one wins on the merits.

- Much of the fuel for the proposed changes to section 230 has come from conservative politicians who are no longer in control of the Senate.

- The next Congress will have a lot of work to do. Section 230 reform is unlikely to be a high priority in the near term. Yet, some adjustments to that law seem quite likely over time because platforms are widely viewed as having too much power over users' speech and are not transparent or consistent about their policies and practices.

Social Networks

Can WhatsApp Stop Spreading Misinformation Without Compromising Encryption? (qz.com) 149

"WhatsApp, the Facebook-owned messaging platform used by 2 billion people largely in the global south, has become a particularly troublesome vector for misinformation," writes Quartz — though it's not clear what the answer is: The core of the problem is its use of end-to-end encryption, a security measure that garbles users' messages while they travel from one phone to another so that no one other than the sender and the recipient can read them. Encryption is a crucial privacy protection, but it also prevents WhatsApp from going as far as many of its peers to moderate misinformation. The app has taken some steps to limit the spread of viral messages, but some researchers and fact-checkers argue it should do more, while privacy purists worry the solutions will compromise users' private conversations...

In April 2020, WhatsApp began slowing the spread of "highly forwarded messages," the smartphone equivalent of 1990s chain emails. If a message has already been forwarded five times, you can only forward it to one person or group at a time. WhatsApp claims that simple design tweak cut the spread of viral messages by 70%, and fact-checkers have cautiously cheered the change. But considering that all messages are encrypted, it's impossible to know how much of an impact the cut had on misinformation, as opposed to more benign content like activist organizing or memes. Researchers who joined and monitored several hundred WhatsApp groups in Brazil, India, and Indonesia found that limiting message forwarding slows down viral misinformation, but doesn't necessarily limit how far the messages eventually spread....
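The forwarding limit described above amounts to a simple client-side rule. The sketch below is a guess at its shape, not WhatsApp's code: the function name and the recipient cap for ordinary forwards are assumptions; only the five-forward threshold and the one-chat limit come from the article.

```python
FORWARD_LIMIT = 5          # per the article: >= 5 prior forwards = "highly forwarded"
NORMAL_MAX_RECIPIENTS = 5  # assumption: ordinary forwards allow several chats at once

def max_recipients(forward_count: int) -> int:
    """How many chats a message may be forwarded to in one action.

    A message that has already been forwarded five times may only be
    sent on to one person or group at a time.
    """
    if forward_count >= FORWARD_LIMIT:
        return 1
    return NORMAL_MAX_RECIPIENTS
```

Note the design constraint the article turns on: because content is end-to-end encrypted, any such rule must be enforced by the client from a forward counter carried as message metadata; the server never sees what is being forwarded, which is also why WhatsApp cannot measure how much of the curbed traffic was actually misinformation.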

This isn't just a semantic argument, says EFF strategy director Danny O'Brien. Even the smallest erosion of encryption protections gives Facebook a toehold to begin scanning messages in a way that could later be abused, and protecting the sanctity of encryption is worth giving up a potential tool for curbing misinformation. "This is a consequence of a secure internet," O'Brien says. "Dealing with the consequences of that is going to be a much more positive step than dealing with the consequences of an internet where no one is secure and no one is private...."

No matter what WhatsApp does, it will have to contend with dueling constituencies: the privacy hawks who see the app's encryption as its most important feature, and the fact-checkers who are desperate for more tools to curb the spread of misinformation on a platform that counts a quarter of the globe among its users.

Whatever Facebook decides will have widespread consequences in a world witnessing the simultaneous rise of fatal lies and techno-authoritarianism.

Google

Google's FLoC Is a Terrible Idea (eff.org) 119

Earlier this week, Google said that after it finishes phasing out third-party cookies over the next year or so, it won't introduce other forms of identifiers to track individuals as they browse across the web. Instead, the search giant plans to use something called Federated Learning of Cohorts (FLoC), which the company says has shown promising results. In a deep-dive, EFF has outlined several issues surrounding the usage of FLoC. The introductory excerpt follows: The third-party cookie is dying, and Google is trying to create its replacement. No one should mourn the death of the cookie as we know it. For more than two decades, the third-party cookie has been the lynchpin in a shadowy, seedy, multi-billion dollar advertising-surveillance industry on the Web; phasing out tracking cookies and other persistent third-party identifiers is long overdue. However, as the foundations shift beneath the advertising industry, its biggest players are determined to land on their feet. Google is leading the charge to replace third-party cookies with a new suite of technologies to target ads on the Web. And some of its proposals show that it hasn't learned the right lessons from the ongoing backlash to the surveillance business model. This post will focus on one of those proposals, Federated Learning of Cohorts (FLoC), which is perhaps the most ambitious -- and potentially the most harmful.

FLoC is meant to be a new way to make your browser do the profiling that third-party trackers used to do themselves: in this case, boiling down your recent browsing activity into a behavioral label, and then sharing it with websites and advertisers. The technology will avoid the privacy risks of third-party cookies, but it will create new ones in the process. It may also exacerbate many of the worst non-privacy problems with behavioral ads, including discrimination and predatory targeting. Google's pitch to privacy advocates is that a world with FLoC (and other elements of the "privacy sandbox") will be better than the world we have today, where data brokers and ad-tech giants track and profile with impunity. But that framing is based on a false premise that we have to choose between "old tracking" and "new tracking." It's not either-or. Instead of re-inventing the tracking wheel, we should imagine a better world without the myriad problems of targeted ads.
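To make the "behavioral label" idea concrete, here is a toy sketch of how a browser might distill a browsing history into a few-bit cohort ID. The FLoC proposal did discuss SimHash-style clustering over browsing history, but everything specific below (the bit width, the hashing helper, the function names) is invented for illustration.

```python
import hashlib

COHORT_BITS = 8  # toy setting; real proposals discussed thousands of cohorts

def _domain_hash(domain: str) -> int:
    """Stable 64-bit hash of a visited domain (illustrative helper)."""
    return int.from_bytes(hashlib.sha256(domain.encode()).digest()[:8], "big")

def cohort_id(history: list[str]) -> int:
    """SimHash-style reduction: each visited domain votes on each output bit,
    so users with similar histories tend to land in the same cohort."""
    votes = [0] * COHORT_BITS
    for domain in history:
        h = _domain_hash(domain)
        for bit in range(COHORT_BITS):
            votes[bit] += 1 if (h >> bit) & 1 else -1
    return sum(1 << bit for bit in range(COHORT_BITS) if votes[bit] > 0)
```

The privacy concern follows directly from the shape of the output: the label is deterministic, shared with every site the user visits, and "inscrutable at a glance but rich with meaning" to anyone who has correlated cohort IDs with behavior.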

We stand at a fork in the road. Behind us is the era of the third-party cookie, perhaps the Web's biggest mistake. Ahead of us are two possible futures. In one, users get to decide what information to share with each site they choose to interact with. No one needs to worry that their past browsing will be held against them -- or leveraged to manipulate them -- when they next open a tab. In the other, each user's behavior follows them from site to site as a label, inscrutable at a glance but rich with meaning to those in the know. Their recent history, distilled into a few bits, is "democratized" and shared with dozens of nameless actors that take part in the service of each web page. Users begin every interaction with a confession: here's what I've been up to this week, please treat me accordingly. Users and advocates must reject FLoC and other misguided attempts to reinvent behavioral targeting. We implore Google to abandon FLoC and redirect its effort towards building a truly user-friendly Web.

The Courts

Accused Murderer Wins Right To Check Source Code of DNA Testing Kit (theregister.com) 167

"A New Jersey appeals court has ruled that a man accused of murder is entitled to review proprietary genetic testing software to challenge evidence presented against him," reports The Register.

Long-time Slashdot reader couchslug shared their report: The maker of the software, Cybergenetics, has insisted in lower court proceedings that the program's source code is a trade secret. The co-founder of the company, Mark Perlin, is said to have argued against source code analysis by claiming that the program, consisting of 170,000 lines of MATLAB code, is so dense it would take eight and a half years to review at a rate of ten lines an hour. The company offered the defense access under tightly controlled conditions outlined in a non-disclosure agreement, which included accepting a $1m liability fine in the event code details leaked. But the defense team objected to the conditions, which they argued would hinder their evaluation and would deter any expert witness from participating...
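Perlin's "eight and a half years" figure follows from simple arithmetic, assuming a roughly 2,000-hour full-time work year (that last assumption is ours; the line count and review rate are his):

```python
lines = 170_000       # lines of MATLAB code, per Perlin's claim
rate = 10             # lines reviewed per hour, per Perlin's claim
hours = lines / rate  # 17,000 hours of review
work_year = 2_000     # hours in a full-time work year (our assumption)
print(hours / work_year)  # 8.5 years
```

The defense's counterpoint, implicit in the next paragraph, is that source review need not be exhaustive to be useful: the STRmix and FST errors were found without reading every line.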

Those arguing on behalf of the defense cited past problems with other genetic testing software such as STRmix and FST (Forensic Statistical Tool). Defense expert witnesses Mats Heimdahl and Jeanna Matthews, for example, said that STRmix had 13 coding errors that affected 60 criminal cases, errors not revealed until a source code review. They also pointed out, as the appeals court ruling describes, how an FST source code review "uncovered that a 'secret function...was present in the software, tending to overestimate the likelihood of guilt.'"

EFF activists have already filed briefs in multiple courts "warning of the danger of secret software being used to convict criminal defendants," reports an EFF blog post.

"No one should be imprisoned or executed based on secret evidence that cannot be fairly evaluated for its reliability, and the ruling in this case will help prevent that injustice."
Electronic Frontier Foundation

EFF, Cory Doctorow Warn About the Dangers of De-Platforming and Censorship (eff.org) 231

Last week Cory Doctorow shared his own answer for what Apple and Google should've done about Parler: They should remove it, and tell users, "We removed Parler because we think it is a politically odious attempt to foment violence. Our judgment is subjective and may be wielded against others in future. If you don't like our judgment, you shouldn't use our app store."

I'm 100% OK with that: first, because it is honest; and second, because it invites the question, "How do we switch app stores?"

Doctorow warns that "vital sectors of the digital economy became as concentrated as they are due to four decades of shameful, bipartisan neglect of antitrust law."

And now Slashdot reader esm88 notes that "The EFF has made a statement raising concerns over tech giants' control over the internet and who gets to decide which speech is allowed" (authored by legal director Corynne McSherry, strategy director Danny O'Brien, and Jillian C. York, EFF director for international freedom of expression): Whatever you think of Parler, these decisions should give you pause. Private companies have strong legal rights under U.S. law to refuse to host or support speech they don't like. But that refusal carries different risks when a group of companies comes together to ensure that forums for speech or speakers are effectively taken offline altogether... Amazon's decision highlights core questions of our time: Who should decide what is acceptable speech, and to what degree should companies at the infrastructure layer play a role in censorship? At EFF, we think the answer is both simple and challenging: wherever possible, users should decide for themselves, and companies at the infrastructure layer should stay well out of it....

The core problem remains: regardless of whether we agree with an individual decision, these decisions overall have not and will not be made democratically and in line with the requirements of transparency and due process. Instead they are made by a handful of individuals, in a handful of companies, the most distanced and least visible to the most Internet users. Whether you agree with those decisions or not, you will not be a part of them, nor be privy to their considerations. And unless we dismantle the increasingly centralized chokepoints in our global digital infrastructure, we can anticipate an escalating political battle between political factions and nation states to seize control of their powers.

On Friday Bill Ottman, founder and CEO of the right-leaning blockchain-based social network Minds (which includes a Slashdot discussion area), posted that in order to remain in the Google Play store, "We had to remove search, discovery, and comments..." We aren't happy and will be working towards something better. What is fascinating is how Signal and Telegram are navigating this and in my opinion they are still there because they are encrypted messengers without much "public" content. Obviously controversial speech is happening there too...

We will be releasing a full report on our plan for fully censorship-resistant infrastructure.

Ottman also advises users downloading apps from Apple's store to "leave if you're smart."
Electronic Frontier Foundation

Are Google, Apple, Facebook, and Microsoft 'Digital Warlords'? (locusmag.com) 66

EFF special consultant/blogger/science fiction writer Cory Doctorow warns in Locus magazine about the dangers of what Bruce Schneier calls "feudal security": Here in the 21st century, we are beset by all manner of digital bandits, from identity thieves, to stalkers, to corporate and government spies, to harassers... To be safe, then, you have to ally yourself with a warlord. Apple, Google, Facebook, Microsoft, and a few others have built massive fortresses bristling with defenses, whose parapets are stalked by the most ferocious cybermercenaries money can buy, and they will defend you from every attacker — except for their employers. If the warlord turns on you, you're defenseless.

We see this dynamic playing out with all of our modern warlords. Google is tweaking Chrome, its dominant browser, to block commercial surveillance, but not Google's own commercial surveillance. Google will do its level best to block scumbag marketers from tracking you on the web, but if a marketer pays Google, and convinces Google's gatekeepers that it is not a scumbag, Google will allow them to spy on you. If you don't mind being spied on by Google, and if you trust Google to decide who's a scumbag and who isn't, this is great. But if you and Google disagree on what constitutes scumbaggery, you will lose, thanks, in part, to other changes to Chrome that make it much harder to block the ads that Chrome lets through.

Over in Facebook land, this dynamic is a little easier to see. After the Cambridge Analytica scandal, Facebook tightened up who could buy Facebook's surveillance data about you and what they could do with it. Then, in the runup to the 2020 US elections, Facebook went further, instituting policies intended to prevent paid political disinformation campaigns at a critical juncture. But Facebook isn't doing a very good job of defending its users from the bandits. It's a bad (or possibly inattentive, or indifferent, or overstretched) warlord, though...

Back to Apple. In 2017, Apple removed all effective privacy tools from the Chinese version of the iPhone/iPad App Store, at the behest of the Chinese government. The Chinese government wanted to spy on Apple customers in China, and so it ordered Apple to facilitate this surveillance... If Apple chose not to comply with the Chinese order, it would either have to risk fines against its Chinese subsidiary and possible criminal proceedings against its Chinese staff, or pull out of China and risk having its digital services blocked by China's Great Firewall, and its Chinese manufacturing subcontractors could be ordered to sever their relations with Apple. In other words, the cost of noncompliance with the order is high, so high that Apple decided that putting its customers at risk was an acceptable alternative.

Therein lies the problem with trusting warlords to keep you safe: they have priorities that aren't your priorities, and when there's a life-or-death crisis that requires them to choose between your survival and their own, they will throw you to the bandits...

"The fact that Apple devices are designed to prevent users from overriding the company's veto over their computing makes it inevitable that some government will demand that this veto be exercised in their favor..." Doctorow concludes. "As with feudal aristocrats, the state is happy to lend these warlords their legitimacy, in exchange for the power to militarize the aristocrat's holdings... "

His proposed solution? What if Google didn't collect or retain so much user data in the first place -- or gave its users the power to turn off data-collection and data-retention altogether? And "What if Apple — by design — made it possible for users to override its killswitches?"
Electronic Frontier Foundation

EFF Reveals Behind-the-Scenes Account of the Fight to Save .ORG (eff.org) 46

As part of its "Year in Review" series, the EFF shares their dramatic behind-the-scenes details about 2020's fight over the future of .org domains. It began when the Internet Society (ISOC) announced plans to sell the Public Interest Registry — which manages the .org top-level domain (TLD) — to private equity firm Ethos Capital.

"If you come at the nonprofit sector, you'd best not miss." EFF and other leaders in the NGO community sprang into action, writing a letter to ISOC urging it to stop the sale. What followed was possibly the most dramatic show of solidarity from the nonprofit sector of all time. And we won.

Prior to the announcement, EFF had spent six months voicing our concerns to the Internet Corporation for Assigned Names and Numbers (ICANN) about the 2019 .ORG Registry Agreement, which gave the owner of .ORG new powers to censor nonprofits' websites (the agreement also lifted a longstanding price cap on .ORG registrations and renewals)... Throughout that six-month process of navigating ICANN's labyrinthine decision-making structure, none of us knew that ISOC would soon be selling PIR. With .ORG in the hands of a private equity firm, those fears of censorship and price gouging became a lot more tangible for nonprofits and NGOs. The power to take advantage of .ORG users was being handed to a for-profit company whose primary obligation was to make money for its investors....

More NGOs began to take notice of the .ORG sale and the danger it posed to nonprofits' freedom of expression online. Over 500 organizations and 18,000 individuals had signed our letter by the end of 2019, including big-name organizations like Greenpeace, Consumer Reports, Oxfam, and the YMCA of the USA. At the same time, questions began to emerge (PDF) about whether Ethos Capital could possibly make a profit without some drastic changes in policy for .ORG. By the beginning of 2020, the financial picture had become a lot clearer: Ethos Capital was paying $1.135 billion for .ORG, nearly a third of which was financed by a loan. No matter how well-meaning Ethos was, the pressure to sell "censorship as a service" would align with Ethos' obligation to produce returns for its investors...
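Rough numbers put the financing concern in perspective. The exact loan amount isn't given in this excerpt; the figure below just computes what "nearly a third" of the stated purchase price implies:

```python
price = 1.135e9   # reported purchase price for .ORG
loan = price / 3  # "nearly a third" financed by a loan
print(f"~${loan/1e6:.0f}M of debt to service")  # ~$378M of debt to service
```

Debt on that order has to be serviced out of registry revenue, which is the source of the pressure on .ORG pricing and policy that the EFF describes.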

Six members of Congress wrote a letter to ICANN in January urging it to scrutinize the sale more carefully. A few days later, EFF, nonprofit advocacy group NTEN, and digital rights groups Fight for the Future and Demand Progress participated in a rally outside of the ICANN headquarters in Los Angeles. Our message was simple: stop the sale and create protections for nonprofits. Before the protest, ICANN staff reached out to the organizers offering to meet with us in person, but on the day of the protest, ICANN canceled on us. That same week, Amnesty International, Access Now, the Sierra Club, and other global NGOs held a press conference at the World Economic Forum to tell world leaders that selling .ORG threatens civil society. All of the noise caught the attention of California Attorney General Xavier Becerra, who wrote to ICANN (PDF) asking it for key information about its review of the sale...

Click through to read the conclusion...
Electronic Frontier Foundation

Edward Snowden Urges Donations to the EFF (eff.org) 99

In October, Edward Snowden was granted permanent residency in Russia. A new web page by the EFF applauds his past activities as a U.S. whistleblower. "His revelations about secret surveillance programs opened the world's eyes to a new level of government misconduct, and reinvigorated EFF's continuing work in the courts and with lawmakers to end unlawful mass spying."

And then they shared this fund-raising pitch written by Edward Snowden: Seven years ago I did something that would change my life and alter the world's relationship to surveillance forever.

When journalists revealed the truth about state deception and illegal conduct against citizens, it was human rights and civil liberties groups like EFF — backed by people around the world just like you — that seized the opportunity to hold authority to account.

Surveillance quiets resistance and takes away our choices. It robs us of private space, eroding our dignity and the things that make us human.

When you're secure from the spectre of judgement, you have room to think, to feel, and to make mistakes as your authentic self. That's where you test your notions of what's right. That's when you question the things that are wrong.

By sounding the alarm and shining a light on mass surveillance, we force governments around the world to confront their wrongdoing.

Slowly, but surely, grassroots work is changing the future. Laws like the USA Freedom Act have just begun to rein in excesses of government surveillance. Network operators and engineers are triumphantly "encrypting all the things" to harden the Internet against spying. Policymakers began holding digital privacy up to the light of human rights law. And we're all beginning to understand the power of our voices online.

This is how we can fix a broken system. But it only works with your help.

For 30 years, EFF members have joined forces to ensure that technology supports freedom, justice, and innovation for all people. It takes unique expertise in the courts, with policymakers, and on technology to fight digital authoritarianism, and thankfully EFF brings all of those skills to the fight. EFF relies on participation from you to keep pushing the digital rights movement forward.

Each of us plays a crucial role in advancing democracy for ourselves, our neighbors, and our children. I hope you'll answer the call by joining EFF to build a better digital future together.

Sincerely,

Edward Snowden

Electronic Frontier Foundation

ExamSoft Flags One-Third of California Bar Exam Test Takers For Cheating (eff.org) 82

The California Bar released data last week confirming that during its use of ExamSoft for the October Bar exam, over one-third of the nearly nine-thousand online examinees were flagged by the software. The Electronic Frontier Foundation is concerned that the exam proctoring software is incorrectly flagging students for cheating "due either to the software's technical failures or to its requirements that students have relatively new computers and access to near-broadband speeds." From the report: This is outrageous. It goes without saying that of the 3,190 applicants flagged by the software, the vast majority were not cheating. Far more likely is that, as EFF and others have said before, remote proctoring software is surveillance snake oil -- you simply can't replicate a classroom environment online, and attempting to do so via algorithms and video monitoring only causes harm. In this case, the harm is not only to the students who are rightfully upset about the implications and the lack of proper channels for redress, but to the institution of the Bar itself. While examinees have been searching for help from other examinees as well as hiring legal counsel in their attempt to defend themselves from potentially baseless claims of cheating, the California Committee of Bar Examiners has said "everything is going well" and called these results "a good thing to see" (13:30 into the video of the Committee meeting).
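The flag rate is easy to confirm from the two figures given. "Nearly nine-thousand" is approximated as 9,000 here, since the Bar's exact examinee count isn't stated in this excerpt:

```python
flagged = 3_190     # applicants flagged by ExamSoft
examinees = 9_000   # "nearly nine-thousand" online examinees
rate = flagged / examinees
print(f"{rate:.1%}")  # 35.4% -- over one-third
```

For comparison, a one-third threshold is about 33.3%, so the "over one-third" characterization holds under this approximation.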

That is not how we see it. These flags have triggered concern for hundreds, if not thousands, of test takers, most of whom had no idea that they were flagged until recently. Many only learned about the flag after receiving an official "Chapter 6 Notice" from the Bar, which is sent when an applicant is observed (supposedly) violating exam conduct rules or seen or heard with prohibited items, like a cell phone, during the exam. In a depressingly ironic introduction to the legal system, the Bar has requested that students respond to the notices within 10 days, but it would appear that none of them have been given enough information to do so, as Chapter 6 Notices contain only a short summary of the violation. These summaries are decidedly vague: "Facial view of your eyes was not within view of the camera for a prolonged period of time"; "No audible sound was detected"; "Leaving the view of the webcam outside of scheduled breaks during a remote-proctored exam." Examinees do not currently have access to the flagged videos themselves, and are not expected to receive access to them, or any other evidence against them, before they are required to submit a response.
The report goes on to say that some of these flags stem from technical issues with ExamSoft. For example, Lenovo laptops appear to have been flagged en masse because the software was unable to access their internal microphones.

Other flags are likely due to the inability of the software to correctly recognize the variability of examinees' demeanors and expressions. "We implore the California Bar to rethink its plans for remotely-proctored future exams, and to work carefully to offer clearer paths for examinees who have been flagged by these inadequate surveillance tools," the EFF says in closing. "Until then, the Bar must provide examinees who have been flagged with a fair appeals process, including sharing the videos and any other information necessary for them to defend themselves before requiring a written response."
Privacy

Civil Rights Groups Move To Block Expansion of Facial Recognition in Airports (theverge.com) 26

A coalition of civil rights groups led by the American Civil Liberties Union has filed an objection to the proposed expansion of Customs and Border Protection's facial recognition at land and sea ports. The National Immigration Law Center, Fight for the Future, and the Electronic Frontier Foundation are also participating in the motion, alongside twelve others. From a report: Filed in November, CBP's proposed rule would expand the biometric exit system, authorizing the collection of facial images from any non-citizen entering the country. But in a filing on Monday, the final day of the comment period, the coalition argued that those measures are too extreme. "CBP's proposed use of face surveillance at airports, sea ports, and the land border would put the United States on an extraordinarily dangerous path toward the normalization of this surveillance," said Ashley Gorski, senior staff attorney with the ACLU's National Security Project, in a statement to reporters. "The deployment of this society-changing technology is unnecessary and unjustified." The filing raises a variety of legal objections to the expansion, in particular arguing that Congress did not intend to authorize long-term facial recognition when it mandated biometric exit tracking in 1996. At the time, Congress left the specific method open to interpretation, but the technology for algorithmic facial recognition from a video feed was not yet developed enough to be considered.
Electronic Frontier Foundation

Facebook's Criticism of Apple's Tracking Change Called 'Laughable' by EFF (macrumors.com) 46

The MacRumors site writes: Facebook's recent criticism directed at Apple over an upcoming tracking-related privacy measure is "laughable," according to the Electronic Frontier Foundation (EFF), a non-profit organization that defends civil liberties in the digital world.

Facebook has claimed that Apple's new opt-in tracking policy will hurt small businesses who benefit from personalized advertising, but the EFF believes that Facebook's campaign against Apple is really about "what Facebook stands to lose if its users learn more about exactly what it and other data brokers are up to behind the scenes," noting that Facebook has "built a massive empire around the concept of tracking everything you do...." According to the EFF, a number of studies have shown that most of the money made from targeted advertising does not reach app developers, and instead goes to third-party data brokers like Facebook, Google, and lesser-known firms.

"Facebook touts itself in this case as protecting small businesses, and that couldn't be further from the truth," the EFF said. "Facebook has locked them into a situation in which they are forced to be sneaky and adverse to their own customers. The answer cannot be to defend that broken system at the cost of their own users' privacy and control."

"This is really about who benefits from the normalization of surveillance-powered advertising..." argues the EFF. And they ultimately come down in support of Apple's new privacy changes.

"Here, Apple is right and Facebook is wrong."
Open Source

After Restoring YouTube-dl, GitHub Revamps Its Copyright Takedown Policy (engadget.com) 24

On October 23rd GitHub initially complied with a takedown request for the open-source project youtube-dl — and then after 24 days, reinstated it.

"If there's a silver lining to the episode, it's that GitHub is implementing new policies to avoid a repeat of the situation moving forward," reports Engadget: First, it says a team of both technical and legal experts will manually evaluate every single section 1201 claim. In instances where there's any ambiguity to a claim, the company says it will err on the side of developers and leave their repository online. If the company's technical and legal teams ultimately find any issues with a project, GitHub will give its owners the chance to address those problems before it takes down their work. Following a takedown, it will continue to give people the chance to recover their data — provided it doesn't include any offending code.

GitHub is also establishing a $1 million defense fund to provide legal aid to developers against suspect section 1201 claims, as well as doubling down on its lobbying work to amend the DMCA and other similar copyright laws across the world.

Cellphones

The US Could Soon Ban the Selling of Carrier-Locked Phones (wired.com) 62

An anonymous reader quotes a report from Wired: In the U.S., a complicated combination of corporate interests and pre-smartphone era legislation has resulted in more than two decades of back and forth about the legality of phone locking. It's looking like that battle could ramp up again next year. The transition to a Biden administration could shake up the regulatory body that governs these rules. The timing also coincides with a congressional proceeding that takes place every three years to determine what tweaks should be made to digital rights laws. 2021 could be the year of the truly unlocked phone. For some activists, it's a glimmer of light at the end of a very long tunnel.

[H]ow could carriers be forced to provide phones that are unlocked by default? There are a couple of promising avenues, though neither is a given. One would be for a regulator to put the issue on its agenda: in the UK, the regulator Ofcom made that call. The US equivalent of Ofcom is the Federal Communications Commission. Under its current leadership of Trump appointee Ajit Pai, the FCC has been staunchly pro-business, passing measures like the repeal of net neutrality rules at the behest of companies like AT&T. "Getting this done in an Ajit Pai FCC would be extremely difficult and very unlikely, given how friendly that FCC has been toward private companies and broadband providers," Sheehan says. "Whether or not that could happen in a Biden administration, we don't know. I think it would be much more possible."

Another route would be to take the problem back to its source: Section 1201 itself. Every three years, the US Library of Congress and Copyright Office hold a rulemaking proceeding that takes public comment. It's a chance for advocates to make their case for amending Section 1201, assuming they can afford the legal fees necessitated by such an involved, drawn-out process. It's a less overtly political process, as the key decisionmakers at the two institutions don't come and go with each presidential administration like they usually do at the FCC. These sessions have already yielded positive outcomes for fans of repairability, like an exemption that took effect in 2016 that made it legal to hack car computers and other devices. The next proceeding is currently underway. If citizens want to urge the government to amend Section 1201, the first round of comments is due by December 14. Responses and additional proposals will go back and forth through the spring of 2021, until the Copyright Office ultimately decides which changes to implement. Both Sheehan and Wiens are working with other advocates to make their case for a future of unlockability.

Slashdot Top Deals