Encryption

NBC: 'You Probably Don't Need to Rely on a VPN Anymore' (nbcnews.com) 166

NBC News writes: VPNs, or virtual private networks, continue to be used by millions of people as a way of masking their internet activity, encrypting their web traffic and hiding their location. But on the modern internet, most people can safely ditch them, thanks to the widespread use of encryption that has made public internet connections far less of a security threat, cybersecurity experts say. "Most commercial VPNs are snake oil from a security standpoint," said Nicholas Weaver, a cybersecurity lecturer at the University of California, Berkeley. "They don't improve your security at all...."

Most browsers have quietly implemented an added layer of security in recent years that automatically encrypts internet traffic at most sites with a technology called HTTPS. Indicated by a tiny padlock by the URL, the presence of HTTPS means that worrisome scenario, in which a scammer or a hacker squats on a public Wi-Fi connection in order to watch people's internet habits, isn't feasible. It's not clear that the threat of a hacker at your coffee shop was ever that real to begin with, but it is certainly not a major danger now, Weaver said. "Remember, someone attacking you at the coffee shop needs to be basically at the coffee shop," he said. "I don't know of them ever being used outside of pranks. And those are all irrelevant now with most sites using HTTPS," he said in a text message.

There are still valid uses for VPNs. They're an invaluable tool for getting around certain types of censorship, though other options also exist, such as the Tor Browser, a free web browser that automatically reroutes users' traffic and is widely praised by cybersecurity experts. VPNs are also vital for businesses that need their employees to log in remotely to their internal network. And they're a popular and effective way to watch television shows and movies that are restricted to particular countries on streaming services. But like with antivirus software, the paid VPN industry is a booming global market despite its core mission no longer being necessary for many people.

Most VPNs market their products as a security tool. A Consumer Reports investigation published earlier this month found that 12 of the 16 biggest VPNs make hyperbolic claims or mislead customers about their security benefits. And many can make things worse, either by selling customers' browsing history to data brokers, or by having poor cybersecurity.

The article credits the Electronic Frontier Foundation for popularizing encryption through browser extensions and website certificates starting in 2010. "In 2015, Google started prioritizing websites that enabled HTTPS in its search results. More and more websites started offering HTTPS connections, and now practically all sites that Google links to do so.

"Since late 2020, major browsers such as Brave, Chrome, Firefox, Safari and Edge all built HTTPS into their programs, making Electronic Frontier Foundation's browser extension no longer necessary for most people."
Announcements

What Were Slashdot's Most Popular Stories of 2021? (slashdot.org) 16

Another 12 months gone by, and with it nearly 8,000 new Slashdot headlines — so which ones drew the most views?

Click here for lists of Slashdot's top 10 most-visited and most-commented stories of the year — and also the all-time top 10 lists since Slashdot's creation in 1997.

Here are some of 2021's highlights:
  • Remember that big electrical outage that left millions of Texans without power in the middle of a winter storm? As the crisis was still raging, CNN asked the million-dollar question: who's actually to blame? This became Slashdot's 9th most-visited story of the year — and also the 7th most-commented.
  • Two of the 10 most-visited stories of the year were "Ask Slashdot" technical questions: In April RockDoctor (Slashdot reader #15,477) asked whether a software RAID is better than a hardware RAID. And in January of 2020 Slashdot reader lsllll asked for suggestions on a battery-powered Wi-Fi security camera supporting FTP/SMB.

    Interestingly, one of the year's most-commented poll topics had asked whether bitcoin would break $100,000 before the end of 2021. 4,951 voters — a full 25% — had said "Yes" — and were off by more than half, with bitcoin actually tumbling 8% in the last week of 2021 to wind up somewhere near $46,371 as of late Friday afternoon.

    At the time of the poll — October 8th — the price of Bitcoin was already up to $53,963. One month later it had reached its highest price of 2021 — $67,582 — before dropping 31.7% over the next 53 days.

    In that October poll asking whether bitcoin would reach $100,000 in the final 84 days of 2021, another 14,687 Slashdot readers voted "No."
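For the curious, the poll and price figures above check out arithmetically. A quick sketch, using only the numbers quoted in this story:

```python
# Sanity-check the poll and price figures quoted above
yes_votes, no_votes = 4_951, 14_687
total_votes = yes_votes + no_votes
yes_share = yes_votes / total_votes        # ~25.2%, the "full 25%" who said Yes

peak, year_end = 67_582, 46_371            # 2021 high vs. the late-December price
drop_from_peak = (peak - year_end) / peak  # ~31% below the November high

print(f"{yes_share:.1%} voted Yes; price ended {drop_from_peak:.1%} below the peak")
```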

Technology

Messy NFT Drop Angers Infosec Pioneers With Unauthorized Portraits (theverge.com) 65

An unauthorized NFT drop celebrating infosec pioneers has collapsed into a mess of conflicting takedowns and piracy. From a report: Released on Christmas Day by a group called "ItsBlockchain," the "Cipher Punks" NFT package included portraits of 46 distinct figures, with ten copies of each token. Taken at their opening price, the full value of the drop was roughly $4,000. But almost immediately, the infosec community began to raise objections -- including some from the portrait subjects themselves. The portrait images misspelled several names -- including EFF speech activist Jillian York and OpenPGP creator Jon Callas -- and based at least one drawing on a copyright-protected photograph. More controversially, the list included some figures who have been ostracized for harmful personal behavior, including Jacob Appelbaum and Richard Stallman.
Social Networks

Federal Court Blocks Texas' Unconstitutional Social Media Law (eff.org) 292

An anonymous reader quotes a report from the Electronic Frontier Foundation: On December 1, hours before Texas' social media law, HB 20, was slated to go into effect, a federal court in Texas blocked it for violating the First Amendment. Like a similar law in Florida, which was blocked and is now pending before the Eleventh Circuit Court of Appeals, the Texas law will go to the Fifth Circuit. These laws are retaliatory, obviously unconstitutional, and EFF will continue advocating that courts stop them. In October, EFF filed an amicus brief against HB 20 in Netchoice v. Paxton, a challenge to the law brought by two associations of tech companies. HB 20 prohibits large social media platforms from removing or moderating content based on the viewpoint of the user. We argued, and the federal court agreed, that the government cannot regulate the editorial decisions made by online platforms about what content they host. As the judge wrote, platforms' right under the First Amendment to moderate content "has repeatedly been recognized by courts." Social media platforms are not "common carriers" that transmit speech without curation.

Moreover, Texas explicitly passed HB 20 to stop social media companies' purported discrimination against conservative users. The court explained that this "announced purpose of balancing the discussion" is precisely the kind of government manipulation of public discourse that the First Amendment forbids. As EFF's brief explained, the government can't retaliate against disfavored speakers and promote favored ones. Moreover, HB 20 would destroy or prevent the emergence of even large conservative platforms, as they would have to accept user speech from across the political spectrum. HB 20 also imposed transparency requirements and user complaint procedures on large platforms. While these kinds of government mandates might be appropriate when carefully crafted -- and separated from editorial restrictions or government retaliation -- they are not here. The court noted that companies like YouTube and Facebook remove millions of pieces of user content a month. It further noted Facebook's declaration in the case that it would be "impossible" to establish a system by December 1 compliant with the bill's requirements for that many removals. Platforms would simply stop removing content to avoid violating HB 20 -- an impermissible chill of First Amendment rights.

Privacy

Apple Removes All References To Controversial CSAM Scanning Feature From Its Child Safety Webpage (macrumors.com) 36

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods. From a report: Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search. Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.
Chrome

EFF Warns Chrome Users: 'Manifest V3 Is Deceitful and Threatening' (eff.org) 46

In a recent blog post from the Electronic Frontier Foundation, the digital rights group warns that Google Chrome's latest specification for building Chrome extensions, known as Manifest V3, "is outright harmful to privacy efforts." EFF technologist Daly Barnett writes: Like FLoC and Privacy Sandbox before it, Manifest V3 is another example of the inherent conflict of interest that comes from Google controlling both the dominant web browser and one of the largest internet advertising networks. [...] It will restrict the capabilities of web extensions -- especially those that are designed to monitor, modify, and compute alongside the conversation your browser has with the websites you visit. Under the new specifications, extensions like these -- like some privacy-protective tracker blockers -- will have greatly reduced capabilities. Google's efforts to limit that access are concerning, especially considering that Google has trackers installed on 75% of the top one million websites.

It's also doubtful Mv3 will do much for security. Firefox maintains the largest extension market that's not based on Chrome, and the company has said it will adopt Mv3 in the interest of cross-browser compatibility. Yet, at the 2020 AdBlocker Dev Summit, Firefox's Add-On Operations Manager said about the extensions security review process: "For malicious add-ons, we feel that for Firefox it has been at a manageable level... since the add-ons are mostly interested in grabbing bad data, they can still do that with the current webRequest API that is not blocking." In plain English, this means that when a malicious extension sneaks through the security review process, it is usually interested in simply observing the conversation between your browser and whatever websites you visit. The malicious activity happens elsewhere, after the data has already been read. A more thorough review process could improve security, but Chrome hasn't said they'll do that. Instead, their solution is to restrict capabilities for all extensions.

As for Chrome's other justification for Mv3 -- performance -- a 2020 study (PDF) by researchers at Princeton and the University of Chicago revealed that privacy extensions, the very ones that will be hindered by Mv3, actually improve browser performance. The development specifications of web browser extensions may seem in the weeds, but the broader implications should matter to all internet citizens: it's another step towards Google defining how we get to live online. Considering that Google has been the world's largest advertising company for years now, these new limitations are paternalistic and downright creepy.

Wireless Networking

What Happens When You Use Bluetooth Tags to Track Your Stolen Items? 166

"The third time my 1999 Honda Civic was stolen, I had a plan," writes Washington Post technology reporter Heather Kelly. Specifically, it was a Tile tracker hidden in the car, "quietly transmitting its approximate location over Bluetooth." Later that day, I was across town hiding down the block from my own car as police detained the surprised driver. When the Tile app pinged me with a last known location, I showed up expecting the car to be abandoned. I quickly realized it was still in use, with one person looking through the trunk and another napping in the passenger seat, so I called the police...

In April of this year, one month after my car was stolen, Apple released the $29 AirTag, bringing an even more effective Bluetooth tracking technology to a much wider audience. Similar products from Samsung and smaller brands such as Chipolo are testing the limits of how far people will go to get back their stolen property and what they consider justice. "The technology has unintended consequences. It basically gives the owner the ability to become a mini surveillance operation," said Andrew Guthrie Ferguson, a law professor at the American University Washington College of Law...

Apple has been careful to never say AirTags can be used to recover stolen property. The marketing for the device is light and wholesome, focusing on situations like lost keys between sofa cushions. The official tagline is "Lose your knack for losing things" and there's no mention of crime, theft or stealing in any of the ads, webpages or support documents. But in reality, the company has built a network that is ideal for that exact use case. Every compatible iPhone, iPad and Mac is being silently put to work as a location device without their owners knowing when it happens. An AirTag uses Bluetooth to send out a ping with its encrypted location to the closest Apple devices, which pass that information on to the Apple cloud. That spot is visible on a map in the Find My app. The AirTag owner can also turn on Lost Mode to get a notification the next time it's detected, as well as leave contact information in case it's found. Apple calls this the Find My network, and it also works for lost or stolen Apple devices and a handful of third-party products. The proliferation of compatible Apple devices — there are nearly a billion in the network around the world — makes Find My incredibly effective, especially in cities. (Apple device owners are part of the Find My network by default, but can opt out in settings, and the location information is all encrypted...)

All the tracker companies recommend contacting law enforcement first, which may sound logical until you find yourself waiting hours in a parking lot for officers to address a relatively low-priority crime, or having to explain to them what Bluetooth trackers are.

The Post shares stories of two people who tried using AirTags to track down their stolen property. One Seattle man tracked down his stolen electric bike — and ended up pedaling away furiously on the (now out of power) bicycle as the suspected thief chased after him.

And an Ohio man waited for hours in an unfamiliar drugstore parking lot for a response from the police, eventually traveling with them to the suspect's house — where his stolen laptop was returned to the police officer by a man holding two babies in his arms.

Some parents have even hidden them in their children's backpacks, and pet owners have hidden them in their pets' collars, the Post reports — adding that the EFF's director of cybersecurity sees another possibility. "The problem is it's impossible to build a tool that is designed to track down stolen items without also building the perfect tool for stalking."
Electronic Frontier Foundation

EFF Board of Directors Removes 76-Year-Old John Gilmore (eff.org) 243

76-year-old John Gilmore co-founded the EFF in 1990, and in the 31 years since he's "provided leadership and guidance on many of the most important digital rights issues we advocate for today," the EFF said in a statement Friday.

"But in recent years, we have not seen eye-to-eye on how to best communicate and work together," they add, announcing "we have been unable to agree on a way forward with Gilmore in a governance role." That is why the EFF Board of Directors has recently made the difficult decision to vote to remove Gilmore from the Board.

We are deeply grateful for the many years Gilmore gave to EFF as a leader and advocate, and the Board has elected him to the role of Board Member Emeritus moving forward. "I am so proud of the impact that EFF has had in retaining and expanding individual rights and freedoms as the world has adapted to major technological changes," Gilmore said. "My departure will leave a strong board and an even stronger staff who care deeply about these issues."

John Gilmore co-founded EFF in 1990 alongside John Perry Barlow, Steve Wozniak and Mitch Kapor, and provided significant financial support critical to the organization's survival and growth over many years. Since then, Gilmore has worked closely with EFF's staff, board, and lawyers on privacy, free speech, security, encryption, and more. In the 1990s, Gilmore found the government documents that confirmed the First Amendment problem with the government's export controls over encryption, and helped initiate the filing of Bernstein v. DOJ, which resulted in a court ruling that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional. The decision made it legal in 1999 for web browsers, websites, and software like PGP and Signal to use the encryption of their choice.

Gilmore also led EFF's effort to design and build the DES Cracker, which was regarded as a fundamental breakthrough in how we evaluate computer security and the public policies that control its use. At the time, the 1970s Data Encryption Standard (DES) was embedded in ATMs and banking networks, as well as in popular software around the world. U.S. government officials proclaimed that DES was secure, while secretly being able to break it themselves. The EFF DES Cracker publicly showed that DES was in fact so weak that it could be broken in one week with an investment of less than $350,000. This catalyzed the international creation and adoption of the much stronger Advanced Encryption Standard (AES), now widely used to secure information worldwide....
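The scale of that demonstration is easy to appreciate from the keyspace arithmetic. A back-of-the-envelope sketch (the one-week figure comes from the account above; the rate is the implied worst case):

```python
# What a one-week exhaustive DES key search implies
des_keys = 2 ** 56                         # DES uses a 56-bit key
week_seconds = 7 * 24 * 3600               # 604,800 seconds in a week
worst_case_rate = des_keys / week_seconds  # keys/second to sweep the whole space

# AES-128's keyspace is 2^72 times larger, which is why exhaustive
# search stopped being the relevant threat model once AES arrived
aes_factor = 2 ** (128 - 56)

print(f"{des_keys:,} DES keys; ~{worst_case_rate:.2e} keys/sec for a one-week sweep")
```

The DES Cracker's point was precisely that sustaining roughly 10^11 trial decryptions per second was within reach of a modest hardware budget in 1998.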

EFF has always valued and appreciated Gilmore's opinions, even when we disagree. It is no overstatement to say that EFF would not exist without him. We look forward to continuing to benefit from his institutional knowledge and guidance in his new role of Board Member Emeritus.

Gilmore also created the alt.* hierarchy on Usenet, co-founded the Cypherpunks mailing list, and was one of the founders of Cygnus Solutions (according to his page on Wikipedia).

He's also apparently Slashdot user #35,813 (though he hasn't posted a comment since 2004).
Privacy

Police Can't Demand You Reveal Your Phone Passcode and Then Tell a Jury You Refused (eff.org) 75

EFF: The Utah Supreme Court is the latest stop in EFF's roving campaign to establish your Fifth Amendment right to refuse to provide your password to law enforcement. Yesterday, along with the ACLU, we filed an amicus brief in State v. Valdez, arguing that the constitutional privilege against self-incrimination prevents the police from forcing suspects to reveal the contents of their minds. That includes revealing a memorized passcode or directly entering the passcode to unlock a device.

In Valdez, the defendant was charged with kidnapping his ex-girlfriend after arranging a meeting under false pretenses. During his arrest, police found a cell phone in Valdez's pocket that they wanted to search for evidence that he set up the meeting, but Valdez refused to tell them the passcode. Unlike many other cases raising these issues, however, the police didn't bother seeking a court order to compel Valdez to reveal his passcode. Instead, during trial, the prosecution offered testimony and argument about his refusal. The defense argued that this violated the defendant's Fifth Amendment right to remain silent, which also prevents the state from commenting on his silence. The court of appeals agreed, and now the state has appealed to the Utah Supreme Court.

Encryption

With HTTPS Everywhere, EFF Begins Plans to Eventually Deprecate 'HTTPS Everywhere' Extension (therecord.media) 48

The Record reports: The Electronic Frontier Foundation said it is preparing to retire the famous HTTPS Everywhere browser extension after HTTPS adoption has picked up and after several web browsers have introduced HTTPS-only modes. "After the end of this year, the extension will be in 'maintenance mode' for 2022," said Alexis Hancock, Director of Engineering at the EFF. Maintenance mode means the extension will receive minor bug fixes next year but no new features or further development.

No official end-of-life date has been decided, a date after which no updates will be provided for the extension whatsoever.

Launched in June 2010, the HTTPS Everywhere browser extension is one of the most successful browser extensions ever released. The extension worked by automatically switching web connections from HTTP to HTTPS if websites had an HTTPS option available. At the time it was released, it helped upgrade site connections to HTTPS when users clicked on HTTP links or typed domains in their browser without specifying the "https://" prefix. The extension reached cult status among privacy advocates and was integrated into the Tor Browser and, after that, in many other privacy-conscious browsers. But since 2010, HTTPS is not a fringe technology anymore. Currently, around 86.6% of all internet sites support HTTPS connections. Browser makers such as Google and Mozilla previously reported that HTTPS traffic usually accounts for 90% to 95% of their daily connections.
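The core rewrite the extension performed can be sketched in a few lines. This is a simplified illustration only; the real extension was driven by curated per-site rulesets rather than the plain host whitelist assumed here:

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str, https_capable: set) -> str:
    """Rewrite an http:// URL to https:// when the host is known to
    support it -- a toy version of what HTTPS Everywhere did."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in https_capable:
        return urlunsplit(parts._replace(scheme="https"))
    return url

print(upgrade_to_https("http://example.com/page", {"example.com"}))
```

Browser-native HTTPS-only modes go further: instead of upgrading only known-good hosts, they attempt HTTPS for every site and warn the user before falling back to plain HTTP.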

From EFF's announcement: The goal of HTTPS Everywhere was always to become redundant. That would mean we'd achieved our larger goal: a world where HTTPS is so broadly available and accessible that users no longer need an extra browser extension to get it. Now that world is closer than ever, with mainstream browsers offering native support for an HTTPS-only mode.

With these simple settings available, EFF is preparing to deprecate the HTTPS Everywhere web extension as we look to new frontiers of secure protocols like SSL/TLS... We know many different kinds of users have this tool installed, and want to give our partners and users the needed time to transition.

The announcement also promises to inform users of browser-native HTTPS-only options before the day when the extension reaches its final sunsetting — and ends with instructions for how to activate the native HTTPS-only features in Firefox, Chrome, Edge, and Safari, "and celebrate with us that HTTPS is truly everywhere for users."
Electronic Frontier Foundation

Why EFF Flew a Plane Over Apple's Headquarters (eff.org) 29

EFF.org has the story: For the last month, civil liberties and human rights organizations, researchers, and customers have demanded that Apple cancel its plan to install photo-scanning software onto devices. This software poses an enormous danger to privacy and security. Apple has heard the message, and announced that it would delay the system while consulting with various groups about its impact. But in order to trust Apple again, we need the company to commit to canceling this mass surveillance system.

The delay may well be a diversionary tactic. Every September, Apple holds one of its big product announcement events, where Apple executives detail the new devices and features coming out. Apple likely didn't want concerns about the phone-scanning features to steal the spotlight.

But we can't let Apple's disastrous phone-scanning idea fade into the background, only to be announced with minimal changes down the road. To make sure Apple is listening to our concerns, EFF turned to an old-school messaging system: aerial advertising.

During Apple's event, a plane circled the company's headquarters carrying an impossible-to-miss message: "Apple, don't scan our phones!" The evening before Apple's event, protestors also rallied nationwide in front of Apple stores. The company needs to hear us, and not just dismiss the serious problems with its scanning plan. A delay is not a cancellation, and the company has also been dismissive of some concerns, referring to them as "confusion" about the new features.

Apple's iMessage is one of the preeminent end-to-end encrypted chat clients. End-to-end encryption is what allows users to exchange messages without having them intercepted and read by repressive governments, corporations, and other bad actors. We don't support encryption for its own sake: we fight for it because encryption is one of the most powerful tools individuals have for maintaining their digital privacy and security in an increasingly insecure world.

Now that Apple's September event is over, Apple must reach out to groups that have criticized it and seek a wider range of suggestions on how to deal with difficult problems, like protecting children online...

The world, thankfully, has moved towards encrypted communications over the last two decades, not away from them, and that's a good thing. If Apple wants to maintain its reputation as a pro-privacy company, it must continue to choose real end-to-end encryption over government demands to read users' communications.

Privacy matters now more than ever. It will continue to be a selling point and a distinguishing feature of some products and companies. For now, it's an open question whether Apple will continue to be one of them.

Youtube

YouTube Blocks 31st Ig Nobel Awards Ceremony, Citing Copyright on a Recording from 1914 (improbable.com) 130

The 31st annual Ig Nobel Prizes were awarded at a special ceremony on September 9th, announced the magazine responsible for the event, the Annals of Improbable Research.

But this week they made another announcement. "YouTube's notorious takedown algorithms are blocking the video of the 2021 Ig Nobel Prize ceremony." We have so far been unable to find a human at YouTube who can fix that. We recommend that you watch the identical recording on Vimeo.

Here's what triggered this: The ceremony includes bits of a recording (of tenor John McCormack singing "Funiculi, Funicula") made in the year 1914.

YouTube's takedown algorithm claims that the following corporations all own the copyright to that audio recording that was MADE IN THE YEAR 1914: "SME, INgrooves (on behalf of Emerald); Wise Music Group, BMG Rights Management (US), LLC, UMPG Publishing, PEDL, Kobalt Music Publishing, Warner Chappell, Sony ATV Publishing, and 1 Music Rights Societies"

Businesses

Apple Risks Losing Billions of Dollars Annually From Ruling (bloomberg.com) 61

Mark Gurman, reporting on Friday's ruling in the Apple v. Epic lawsuit: So how much does Apple stand to lose? That all comes down to how many developers try to bypass its payment system. Loup Venture's Gene Munster, a longtime Apple watcher, put the range at $1 billion to $4 billion, depending on how many developers take advantage of the new policy. Apple depicted the ruling as a victory, signaling that it's not too worried about the financial impact. "The court has affirmed what we've known all along: The App Store is not in violation of antitrust law" and "success is not illegal," Apple said in a statement. Kate Adams, the iPhone maker's general counsel, called the ruling a "resounding victory" that "underscores the merit" of its business.

Apple's adversary in the trial -- Epic Games, the maker of Fortnite -- also contended that the judge sided with Apple. This "isn't a win for developers or for consumers," Epic Chief Executive Officer Tim Sweeney said on Twitter. [...] Apple made about $3.8 billion in U.S. revenue from games in 2020, most of which came from in-app purchases, according to estimates from Sensor Tower. But even if the ruling ends up costing Apple a few billion dollars a year, that's still a small fraction of its total revenue. In fiscal 2021 alone, the company is estimated to bring in more than $360 billion, meaning the change won't make or break its overall financial performance. And many developers may choose to stick to Apple's payment system so they don't have to build their own web payment platform.
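Munster's range is easy to put in context against the revenue estimate quoted above. A quick sketch:

```python
# Put the estimated hit in context of Apple's total revenue
loss_low, loss_high = 1e9, 4e9           # Gene Munster's estimated range
fy2021_revenue = 360e9                   # estimated fiscal-2021 revenue
share_low = loss_low / fy2021_revenue    # ~0.3% of revenue
share_high = loss_high / fy2021_revenue  # ~1.1% of revenue

print(f"worst case: ~{share_high:.1%} of estimated fiscal-2021 revenue")
```

Even at the top of the range, the hit rounds to about one percent of Apple's estimated annual revenue, which is why the change is unlikely to move its overall financial performance.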

More concerns were shared by the EFF in a thread on Twitter. "Disappointingly, a court found that Apple is not a monopolist in mobile gaming or in-app transactions, so its App Store commissions don't violate antitrust law. One bright spot: the court found Apple's gag rules on app developers violate California law...

"The court's opinion spells out many serious problems with today's mobile app ecosystem, such as false tensions between user choice and user privacy. Congress can help with real antitrust reform and new legal tools, and shouldn't let Apple's privacywashing derail that work."
Privacy

'Apple's Device Surveillance Plan Is a Threat To User Privacy -- And Press Freedom' (freedom.press) 213

The Freedom of the Press Foundation is calling Apple's plan to scan photos on user devices to detect known child sexual abuse material (CSAM) a "dangerous precedent" that "could be misused when Apple and its partners come under outside pressure from governments or other powerful actors." They join the EFF, whistleblower Edward Snowden, and many other privacy and human rights advocates in condemning the move. Advocacy Director Parker Higgins writes: Very broadly speaking, the privacy invasions come from situations where "false positives" are generated -- that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present. These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple's algorithm into erroneously matching an existing image. (Apple, for its part, has said that an accidental false positive -- where an innocent image is flagged as child abuse material for no reason -- is extremely unlikely, which is probably true.) The false positive problem most directly touches on press freedom issues when considering that first category, with adversaries that can change the contents of the database that Apple devices are checking files against. An organization that could add leaked copies of its internal records, for example, could find devices that held that data -- including, potentially, whistleblowers and journalists who worked on a given story. This could also reveal the extent of a leak if it is not yet known. Governments that could include images critical of its policies or officials could find dissidents that are exchanging those files.
[...]
Journalists, in particular, have increasingly relied on the strong privacy protections that Apple has provided even when other large tech companies have not. Apple famously refused to redesign its software to open the phone of an alleged terrorist -- not because they wanted to shield the content on a criminal's phone, but because they worried about the precedent it would set for other people who rely on Apple's technology for protection. How is this situation any different? No backdoor for law enforcement will be safe enough to keep bad actors from continuing to push it open just a little bit further. The privacy risks from this system are too extreme to tolerate. Apple may have had noble intentions with this announced system, but good intentions are not enough to save a plan that is rotten at its core.

Electronic Frontier Foundation

Edward Snowden and EFF Slam Apple's Plans To Scan Messages and iCloud Images (macrumors.com) 55

Apple's plans to scan users' iCloud Photos library against a database of child sexual abuse material (CSAM) to look for matches, and children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF). MacRumors reports: In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future. Snowden also noted that Apple has historically been an industry-leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security." The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and iCloud Photos could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement," the EFF cautioned.

Programming

After YouTube-dl Incident, GitHub's DMCA Process Now Includes Free Legal Help (venturebeat.com) 30

"GitHub has announced a partnership with the Stanford Law School to support developers facing takedown requests related to the Digital Millennium Copyright Act (DMCA)," reports VentureBeat: While the DMCA may be better known as a law for protecting copyrighted works such as movies and music, it also has provisions (17 U.S.C. 1201) that criminalize attempts to circumvent copyright-protection controls — this includes any software that might help anyone infringe DMCA regulations. However, as with the countless spurious takedown notices delivered to online content creators, open source coders too have often found themselves in the DMCA firing line with little option but to comply with the request even if they have done nothing wrong. The problem, ultimately, is that freelance coders or small developer teams often don't have the resources to fight DMCA requests, which puts the balance of power in the hands of deep-pocketed corporations that may wish to use DMCA to stifle innovation or competition. Thus, GitHub's new Developer Rights Fellowship — in conjunction with Stanford Law School's Juelsgaard Intellectual Property and Innovation Clinic — seeks to help developers placed in such a position by offering them free legal support.

The initiative follows some eight months after GitHub announced it was overhauling its Section 1201 claim review process in the wake of a takedown request made by the Recording Industry Association of America (RIAA), which had been widely criticized as an abuse of DMCA... [M]oving forward, whenever GitHub notifies a developer of a "valid takedown claim," it will present them with an option to request free independent legal counsel.

The fellowship will also be charged with "researching, educating, and advocating on DMCA and other legal issues important for software innovation," GitHub's head of developer policy Mike Linksvayer said in a blog post, along with other related programs.

Explaining their rationale, GitHub's blog post argues that currently "When developers looking to learn, tinker, or make beneficial tools face a takedown claim under Section 1201, it is often simpler and safer to just fold, removing code from public view and out of the common good.

"At GitHub, we want to fix this."

Electronic Frontier Foundation

EFF Sues US Postal Service For Records About Covert Social Media Spying Program (eff.org) 57

The Electronic Frontier Foundation (EFF) filed a Freedom of Information Act (FOIA) lawsuit against the U.S. Postal Service and its inspection agency seeking records about a covert program to secretly comb through online posts of social media users before street protests, raising concerns about chilling the privacy and expressive activity of internet users. From the press release: Under an initiative called Internet Covert Operations Program, analysts at the U.S. Postal Inspection Service (USPIS), the Postal Service's law enforcement arm, sorted through massive amounts of data created by social media users to surveil what they were saying and sharing, according to media reports. Internet users' posts on Facebook, Twitter, Parler, and Telegram were likely swept up in the surveillance program. USPIS has not disclosed details about the program or any records responding to EFF's FOIA request asking for information about the creation and operation of the surveillance initiative. In addition to those records, EFF is also seeking records on the program's policies and analysis of the information collected, and communications with other federal agencies, including the Department of Homeland Security (DHS), about the use of social media content gathered under the program.

Media reports revealed that a government bulletin dated March 16 was distributed across DHS's state-run security threat centers, alerting law enforcement agencies that USPIS analysts monitored "significant activity regarding planned protests occurring internationally and domestically on March 20, 2021." Protests around the country were planned for that day, and locations and times were being shared on Parler, Telegram, Twitter, and Facebook, the bulletin said. "We're filing this FOIA lawsuit to shine a light on why and how the Postal Service is monitoring online speech. This lawsuit aims to protect the right to protest," said Houston Davidson, EFF public interest legal fellow. "The government has never explained the legal justifications for this surveillance. We're asking a court to order the USPIS to disclose details about this speech-monitoring program, which threatens constitutional guarantees of free expression and privacy."

Cellphones

Church Official Exposed Through America's 'Vast and Largely Unregulated Data-Harvesting' (nytimes.com) 101

The New York Times' On Tech newsletter shares a thought-provoking story: This week, a top official in the Roman Catholic Church's American hierarchy resigned after a news site said that it had data from his cellphone that appeared to show the administrator using the L.G.B.T.Q. dating app Grindr and regularly going to gay bars. Journalists had access to data on the movements and digital trails of his mobile phone for parts of three years and were able to retrace where he went.

I know that people will have complex feelings about this matter. Some of you may believe that it's acceptable to use any means necessary to determine when a public figure is breaking his promises, including when it's a priest who may have broken his vow of celibacy. To me, though, this isn't about one man. This is about a structural failure that allows real-time data on Americans' movements to exist in the first place and to be used without our knowledge or true consent. This case shows the tangible consequences of practices by America's vast and largely unregulated data-harvesting industries. The reality in the United States is that there are few legal or other restrictions to prevent companies from compiling the precise locations of where we roam and selling that information to anyone.

This data is in the hands of companies that we deal with daily, like Facebook and Google, and also of information-for-hire middlemen that we never directly interact with. This data is often packaged in bulk and is anonymous in theory, but it can often be traced back to individuals, as the tale of the Catholic official shows...

Losing control of our data was not inevitable. It was a choice — or rather a failure over years by individuals, governments and corporations to think through the consequences of the digital age.

We can now choose a different path.

"Data brokers are the problem," writes the EFF, arguing that the incident "shows once again how easy it is for anyone to take advantage of data brokers' stores to cause real harm." This is not the first time Grindr has been in the spotlight for sharing user information with third-party data brokers... But Grindr is just one of countless apps engaging in this exact kind of data sharing. The real problem is the many data brokers and ad tech companies that amass and sell this sensitive data without anything resembling real users' consent.

Apps and data brokers claim they are only sharing so-called "anonymized" data. But that's simply not possible. Data brokers sell rich profiles with more than enough information to link sensitive data to real people, even if the brokers don't include a legal name. In particular, there's no such thing as "anonymous" location data. Data points like one's home or workplace are identifiers themselves, and a malicious observer can connect movements to these and other destinations. Another piece of the puzzle is the ad ID, another so-called "anonymous" label that identifies a device. Apps share ad IDs with third parties, and an entire industry of "identity resolution" companies can readily link ad IDs to real people at scale.
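The EFF's claim that "anonymous" location data does not exist can be made concrete with a minimal sketch (all names, places, and records below are hypothetical). The two most-visited places in a trace, typically home and workplace, are often enough to single out one person when cross-referenced with ordinary directory records:

```python
from collections import Counter

# Toy "anonymized" trace: (ad_id, place) visits, no legal name attached.
trace = [
    ("ad-7f3", "123 Elm St"),       # overnight location
    ("ad-7f3", "123 Elm St"),
    ("ad-7f3", "Diocese Office"),   # weekday location
    ("ad-7f3", "Diocese Office"),
    ("ad-7f3", "Bar on 5th Ave"),
]

# Hypothetical directory mapping (home, work) pairs to named people,
# standing in for the records an "identity resolution" firm holds.
directory = {("123 Elm St", "Diocese Office"): "J. Doe"}

def reidentify(trace):
    places = Counter(place for _, place in trace)
    # The two most frequent places act as a fingerprint of the person.
    top_two = tuple(place for place, _ in places.most_common(2))
    return directory.get(top_two)

print(reidentify(trace))  # → J. Doe
```

Once the ad ID is tied to a name this way, every other visit in the trace, such as the bar, is attributed to that person as well, which is essentially what happened to the church official.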

All of this underlines just how harmful a collection of mundane-seeming data points can become in the wrong hands... That's why the U.S. needs comprehensive data privacy regulation more than ever. This kind of abuse is not inevitable, and it must not become the norm.

Crime

A Threat to Privacy in the Expanded Use of License Plate-Scanning Cameras? (yahoo.com) 149

Long-time Slashdot reader BigVig209 shares a Chicago Tribune report "on how suburban police departments in the Chicago area use license plate cameras as a crime-fighting tool." Critics of the cameras note that only a tiny percentage of the billions of plates photographed lead to an arrest, and that the cameras generally haven't been shown to prevent crime. More importantly, they say, the devices are unregulated, track innocent people and can be misused to invade drivers' privacy. The controversy comes as suburban police departments continue to expand the use of the cameras to combat rising crime. Law enforcement officials say they are taking steps to safeguard the data. But privacy advocates say the state should pass a law to guard against improper use of a nationwide surveillance system operated by private companies.

Across the Chicago area, one survey by the nonprofit watchdog group MuckRock found 88 cameras used by more than two dozen police agencies. After much delay, and in response to a surge in shootings, state police are taking steps to add the cameras to area expressways. In the northwest suburbs, Vernon Hills and Niles are among several departments that have added license plate cameras recently. The city of Chicago has ordered more than 200 cameras for its squad cars. In Indiana, the city of Hammond has taken steps to record nearly every vehicle that comes into town.

Not all police like the devices. In the southwest suburbs, Darien and La Grange had issues in years past with the cameras making false readings, and some officers stopped using them...

Homeowner associations may also tie their cameras into the systems, which is what led to the arrest in Vernon Hills. One of the leading sellers of such cameras, Vigilant Solutions, a part of Chicago-based Motorola Solutions, has collected billions of license plate numbers in its National Vehicle Location Service. The database shares information from thousands of police agencies, and can be used to find cars across the country... Then there is the potential for abuse by police. One investigation found that officers nationwide misused agency databases hundreds of times, to check on ex-girlfriends, romantic rivals, or perceived enemies. To address those concerns, 16 states have passed laws restricting the use of the cameras.

The article cites an EFF survey which found 99.5% of scanned plates weren't under suspicion — "and that police shared their data with an average of 160 other agencies."

"Two big concerns the American Civil Liberties Union has always had about the cameras are that the information can be used to track the movements of the general population, and often is sold by operators to third parties like credit and insurance companies."

Electronic Frontier Foundation

'Golden Age of Surveillance', as Police Make 112,000 Data Requests in 6 Months (newportri.com) 98

"When U.S. law enforcement officials need to cast a wide net for information, they're increasingly turning to the vast digital ponds of personal data created by Big Tech companies via the devices and online services that have hooked billions of people around the world," reports the Associated Press: Data compiled by four of the biggest tech companies shows that law enforcement requests for user information — phone calls, emails, texts, photos, shopping histories, driving routes and more — have more than tripled in the U.S. since 2015. Police are also increasingly savvy about covering their tracks so as not to alert suspects of their interest... In just the first half of 2020 — the most recent data available — Apple, Google, Facebook and Microsoft together fielded more than 112,000 data requests from local, state and federal officials. The companies agreed to hand over some data in 85% of those cases. Facebook, including its Instagram service, accounted for the largest number of disclosures.

Consider Newport, a coastal city of 24,000 residents that attracts a flood of summer tourists. Fewer than 100 officers patrol the city — but they make multiple requests a week for online data from tech companies. That's because most crimes — from larceny and financial scams to a recent fatal house party stabbing at a vacation rental booked online — can be at least partly traced on the internet. Tech providers, especially social media platforms, offer a "treasure trove of information" that can help solve them, said Lt. Robert Salter, a supervising police detective in Newport.

"Everything happens on Facebook," Salter said. "The amount of information you can get from people's conversations online — it's insane."

As ordinary people have become increasingly dependent on Big Tech services to help manage their lives, American law enforcement officials have grown far more savvy about technology than they were five or six years ago, said Cindy Cohn, executive director of the Electronic Frontier Foundation, a digital rights group. That's created what Cohn calls "the golden age of government surveillance." Not only has it become far easier for police to trace the online trails left by suspects, they can also frequently hide their requests by obtaining gag orders from judges and magistrates. Those orders block Big Tech companies from notifying the target of a subpoena or warrant of law enforcement's interest in their information — contrary to the companies' stated policies...

Nearly all big tech companies — from Amazon to rental sites like Airbnb, ride-hailing services like Uber and Lyft, and service providers like Verizon — now have teams to respond...

Cohn says American law is still premised on the outdated idea that valuable data is stored at home — and can thus be protected by precluding home searches without a warrant. At the very least, Cohn suggests more tech companies should be using encryption technology to prevent data access without the user's key.
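The suggestion that providers hold only data they cannot read can be sketched conceptually. This is a toy SHA-256-based stream cipher for illustration only; real deployments use vetted authenticated ciphers such as AES-GCM or ChaCha20-Poly1305. The key stays on the user's device, so the provider, and anyone who subpoenas the provider, sees only ciphertext:

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256 counter-mode keystream; for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh per message
    stream = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

user_key = os.urandom(32)                         # never leaves the device
stored = encrypt(user_key, b"private message")    # what the provider holds
assert decrypt(user_key, stored) == b"private message"
```

Under this design, a gag-ordered data request served on the provider yields nothing readable; law enforcement would have to go to the user, who would then know about the request.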

But Newport supervising police detective Lt. Robert Salter supplied his own answer for people worried about how police officers are requesting more and more data. "Don't commit crimes and don't use your computer and phones to do it."
