AI

Publishers and Law Professors Back Authors in Meta AI Copyright Battle

Publishers and law professors have filed amicus briefs supporting authors who sued Meta over its AI training practices, arguing that the company's use of "thousands of pirated books" fails to qualify as fair use under copyright law.

The filings [PDF] in California's Northern District federal court came from copyright law professors, the International Association of Scientific, Technical and Medical Publishers (STM), Copyright Alliance, and Association of American Publishers. The briefs counter earlier support for Meta from the Electronic Frontier Foundation and IP professors.

While Meta's defenders pointed to the 2015 Google Books ruling as precedent, the copyright professors distinguished Meta's use, arguing Google Books told users something "about" books without "exploiting expressive elements," whereas AI models leverage the books' creative content.

"Meta's use wasn't transformative because, like the AI models, the plaintiffs' works also increased 'knowledge and skill,'" the professors wrote, warning of a "cascading effect" if Meta prevails. STM is specifically challenging Meta's data sources: "While Meta attempts to label them 'publicly available datasets,' they are only 'publicly available' because those perpetuating their existence are breaking the law."
Earth

Clean Energy Powered 40% of Global Electricity in 2024, Report Finds (theguardian.com)

The world used clean power sources to meet more than 40% of its electricity demand last year for the first time since the 1940s, figures show. The Guardian: A report by the energy thinktank Ember said the milestone was powered by a boom in solar power capacity, which has doubled in the last three years. The report found that solar farms had been the world's fastest-growing source of energy for the last 20 consecutive years.

Phil MacDonald, Ember's managing director, said: "Solar power has become the engine of the global energy transition. Paired with battery storage, solar is set to be an unstoppable force. As the fastest-growing and largest source of new electricity, it is critical in meeting the world's ever-increasing demand for electricity."

Overall, solar power remains a relatively small part of the global energy system. It made up almost 7% of the world's electricity last year, according to Ember, while wind power made up just over 8% of the global power system. The fast-growing technologies remain dwarfed by hydro power, which has remained relatively steady in recent years, and made up 14% of the world's electricity in 2024.

Data Storage

Micron To Impose Tariff-Related Surcharge on SSDs, Other Products (reuters.com)

Micron has informed US customers it will implement surcharges on memory modules and solid-state drives starting Wednesday to offset President Trump's new tariffs, according to Reuters. While semiconductors received exemptions in Trump's recent trade action, memory storage products didn't escape the new duties.

Micron, which manufactures primarily in Asian countries including China and Taiwan, had previously signaled during a March earnings call that tariff costs would be passed to customers.
The Internet

Scientists Debate Actual Weight of the Internet (wired.com)

The internet's physical mass remains contested among scientists, with estimates ranging from the weight of a strawberry to something almost unimaginably small. In 2006, Harvard physicist Russell Seitz calculated that the internet weighed roughly 50 grams based on the energy of its servers, a figure that, given the internet's growth since then, would now be closer to the weight of a potato.

Christopher White, president of NEC Laboratories America, has dismissed this calculation as "just wrong." White suggests a more accurate method that accounts for the energy needed to encode all internet data in one place, yielding approximately 53 quadrillionths of a gram at room temperature. Alternatively, if the internet's projected 175 zettabytes of data were stored in DNA -- a storage medium scientists are actively exploring -- it would weigh 960,947 grams, equivalent to 10.6 American males. Though scientists debate measurement methods, White asserts the web's true complexity makes it "essentially unknowable."
Linux

An Interactive-Speed Linux Computer Made of Only 3 8-Pin Chips (dmitry.gr)

Software engineer and longtime Slashdot reader Dmitry Grinberg (dmitrygr) shares a recent project they've been working on: "an interactive-speed Linux on a tiny board you can easily build with only 3 8-pin chips": There was a time when one could order a kit and assemble a computer at home. It would do just about what a contemporary store-bought computer could do. That time is long gone. Modern computers are made of hundreds of huge complex chips with no public datasheets and many hundreds of watts of power supplied to them over complex power delivery topologies. It does not help that modern operating systems require gigabytes of RAM, terabytes of storage, and always-on internet connectivity to properly spy on you. But what if one tried to fit a modern computer into a kit that could be easily assembled at home? What if the kit only had three chips, each with only 8 pins? Can it be done? Yes. The system runs a custom MIPS emulator written in ARMv6 assembly and includes a custom bootloader that supports firmware updates via FAT16-formatted SD cards. Clever pin-sharing hacks allow all components (RAM, SD card, serial I/O) to work despite only 6 usable I/O pins. Overclocked to as much as 150MHz, the board boots into a full Linux shell in about a minute and runs at a MIPS-equivalent speed of roughly 1.65MHz.

It's not fast, writes Dmitry, but it's fully functional -- you can edit files, compile code, and even install Debian packages. A kit may be made available if a partner is found.
AI

DeepMind Details All the Ways AGI Could Wreck the World (arstechnica.com)

An anonymous reader quotes a report from Ars Technica, written by Ryan Whitwam: Researchers at DeepMind have ... released a new technical paper (PDF) that explains how to develop AGI safely, which you can download at your convenience. It contains a huge amount of detail, clocking in at 108 pages before references. While some in the AI field believe AGI is a pipe dream, the authors of the DeepMind paper project that it could happen by 2030. With that in mind, they aimed to understand the risks of a human-like synthetic intelligence, which they acknowledge could lead to "severe harm." This work has identified four possible types of AGI risk, along with suggestions on how we might ameliorate said risks. The DeepMind team, led by company co-founder Shane Legg, categorized the negative AGI outcomes as misuse, misalignment, mistakes, and structural risks.

The first possible issue, misuse, is fundamentally similar to current AI risks. However, because AGI will be more powerful by definition, the damage it could do is much greater. A ne'er-do-well with access to AGI could misuse the system to do harm, for example, by asking the system to identify and exploit zero-day vulnerabilities or create a designer virus that could be used as a bioweapon. DeepMind says companies developing AGI will have to conduct extensive testing and create robust post-training safety protocols. Essentially, AI guardrails on steroids. They also suggest devising a method to suppress dangerous capabilities entirely, sometimes called "unlearning," but it's unclear if this is possible without substantially limiting models. Misalignment is largely not something we have to worry about with generative AI as it currently exists. This type of AGI harm is envisioned as a rogue machine that has shaken off the limits imposed by its designers. Terminators, anyone? More specifically, the AI takes actions it knows the developer did not intend. DeepMind says its standard for misalignment here is more advanced than simple deception or scheming as seen in the current literature.

To avoid that, DeepMind suggests developers use techniques like amplified oversight, in which two copies of an AI check each other's output, to create robust systems that aren't likely to go rogue. If that fails, DeepMind suggests intensive stress testing and monitoring to watch for any hint that an AI might be turning against us. Keeping AGIs in virtual sandboxes with strict security and direct human oversight could help mitigate issues arising from misalignment. Basically, make sure there's an "off" switch. If, on the other hand, an AI didn't know that its output would be harmful and the human operator didn't intend for it to be, that's a mistake. We get plenty of those with current AI systems -- remember when Google said to put glue on pizza? The "glue" for AGI could be much stickier, though. DeepMind notes that militaries may deploy AGI due to "competitive pressure," but such systems could make serious mistakes as they will be tasked with much more elaborate functions than today's AI. The paper doesn't have a great solution for mitigating mistakes. It boils down to not letting AGI get too powerful in the first place. DeepMind calls for deploying slowly and limiting AGI authority. The study also suggests passing AGI commands through a "shield" system that ensures they are safe before implementation.
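As a rough illustration only (not anything from DeepMind's paper), the "shield" idea can be pictured as a wrapper that routes every action an agent proposes through an independent safety check before it is executed. Everything below is a hypothetical stand-in written as a toy Python sketch:

```python
from typing import Callable

def shielded_execute(propose: Callable[[], str],
                     is_safe: Callable[[str], bool],
                     execute: Callable[[str], None]) -> None:
    """Toy sketch of a 'shield': vet each proposed command before running it."""
    command = propose()                        # the agent suggests an action
    if is_safe(command):                       # a separate checker vets it first
        execute(command)
    else:
        print(f"Blocked unsafe command: {command!r}")

# Illustrative use with trivial stand-ins for the agent, the checker, and the executor:
shielded_execute(
    propose=lambda: "rm -rf /",                # deliberately alarming suggestion
    is_safe=lambda cmd: "rm -rf" not in cmd,   # naive keyword check as a placeholder
    execute=lambda cmd: print(f"Running {cmd!r}"),
)
```

In a real system the checker would itself be a model or policy engine rather than a keyword filter, but the structural point is the same: nothing the agent proposes reaches the outside world without passing through the shield.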

Lastly, there are structural risks, which DeepMind defines as the unintended but real consequences of multi-agent systems contributing to our already complex human existence. For example, AGI could create false information that is so believable that we no longer know who or what to trust. The paper also raises the possibility that AGI could accumulate more and more control over economic and political systems, perhaps by devising heavy-handed tariff schemes. Then one day, we look up and realize the machines are in charge instead of us. This category of risk is also the hardest to guard against because it would depend on how people, infrastructure, and institutions operate in the future.

Microsoft

Microsoft Urges Businesses To Abandon Office Perpetual Licenses

Microsoft is pushing businesses to shift away from perpetual Office licenses to Microsoft 365 subscriptions, citing collaboration limitations and rising IT costs associated with standalone software. "You may have started noticing limitations," Microsoft says in a post. "Your apps are stuck on your desktop, limiting productivity anytime you're away from your office. You can't easily access your files or collaborate when working remotely."

In its pitch, the Windows-maker says Microsoft 365 includes Office applications as well as security features, AI tools, and cloud storage. The post cites a Microsoft-commissioned Forrester study that claims the subscription model delivers "223% ROI over three years, with a payback period of less than six months" and "over $500,000 in benefits over three years."
Nintendo

Nintendo Switch 2 Arrives on June 5, Priced at $450 (engadget.com)

Nintendo's Switch 2, priced at $450, launches June 5 with a 7.9-inch LCD screen offering 1080p resolution, HDR support, and 120Hz refresh capability. The device maintains the original Switch's 13.99mm thickness while increasing internal storage to 256GB from the previous 32GB.

The console outputs at 4K/60fps when docked, with the dock featuring a built-in cooling fan. Two USB-C ports handle accessories and charging. The system supports microSD Express cards but not original Switch microSD cards. Joy-Con controllers now attach via magnets instead of sliding rails and feature mouse-like functionality with compatible games. Both Joy-Cons and the new Pro Controller include a "C" button that activates a chat menu for the new "Game Chat" feature.

Game cards for Switch 2 will be red rather than black. The system maintains backward compatibility with original Switch cartridges and introduces a "Game Share" feature for local game sharing between consoles.
AI

MCP: the New 'USB-C For AI' That's Bringing Fierce Rivals Together (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: What does it take to get OpenAI and Anthropic -- two competitors in the AI assistant market -- to get along? Despite a fundamental difference in direction that led Anthropic's founders to quit OpenAI in 2020 and later create the Claude AI assistant, a shared technical hurdle has now brought them together: How to easily connect their AI models to external data sources. The solution comes from Anthropic, which developed and released an open specification called Model Context Protocol (MCP) in November 2024. MCP establishes a royalty-free protocol that allows AI models to connect with outside data sources and services without requiring unique integrations for each service.

"Think of MCP as a USB-C port for AI applications," wrote Anthropic in MCP's documentation. The analogy is imperfect, but it represents the idea that, similar to how USB-C unified various cables and ports (with admittedly a debatable level of success), MCP aims to standardize how AI models connect to the infoscape around them. So far, MCP has also garnered interest from multiple tech companies in a rare show of cross-platform collaboration. For example, Microsoft has integrated MCP into its Azure OpenAI service, and as we mentioned above, Anthropic competitor OpenAI is on board. Last week, OpenAI acknowledged MCP in its Agents API documentation, with vocal support from the boss upstairs. "People love MCP and we are excited to add support across our products," wrote OpenAI CEO Sam Altman on X last Wednesday.

MCP has also rapidly begun to gain community support in recent months. For example, just browsing this list of over 300 open source servers shared on GitHub reveals growing interest in standardizing AI-to-tool connections. The collection spans diverse domains, including database connectors like PostgreSQL, MySQL, and vector databases; development tools that integrate with Git repositories and code editors; file system access for various storage platforms; knowledge retrieval systems for documents and websites; and specialized tools for finance, health care, and creative applications. Other notable examples include servers that connect AI models to home automation systems, real-time weather data, e-commerce platforms, and music streaming services. Some implementations allow AI assistants to interact with gaming engines, 3D modeling software, and IoT devices.
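For a concrete sense of what the standard buys developers, here is a minimal sketch of an MCP server written against the official Python SDK's FastMCP helper (package name mcp). The server name and tool are invented for illustration, and exact APIs may vary between SDK releases:

```python
from mcp.server.fastmcp import FastMCP

# Name shown to AI clients that connect to this server (illustrative).
mcp = FastMCP("weather-demo")

@mcp.tool()
def current_temperature(city: str) -> str:
    """Return a canned temperature reading for the given city (placeholder data)."""
    return f"It is 21 degrees C in {city}."

if __name__ == "__main__":
    # Serve over stdio so any MCP-aware client (Claude Desktop, an Agents API
    # integration, etc.) can discover and call the tool without a bespoke connector.
    mcp.run()
```

The appeal of the protocol is that a small server like this is, in principle, usable from any MCP-aware assistant, rather than needing a separate integration per vendor.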

Privacy

Nearly 1.5 Million Private Photos from Five Dating Apps Were Exposed Online (bbc.com)

"Researchers have discovered nearly 1.5 million pictures from specialist dating apps — many of which are explicit — being stored online without password protection," reports the BBC, "leaving them vulnerable to hackers and extortionists."

And the images weren't limited to those from profiles, the BBC learned from the ethical hacker who discovered the issue. "They included pictures which had been sent privately in messages, and even some which had been removed by moderators..." Anyone with the link was able to view the private photos from five platforms developed by M.A.D Mobile [including two kink/BDSM sites and two LGBT apps]... These services are used by an estimated 800,000 to 900,000 people.

M.A.D Mobile was first warned about the security flaw on 20th January but didn't take action until the BBC emailed on Friday. They have since fixed it but not said how it happened or why they failed to protect the sensitive images. Ethical hacker Aras Nazarovas from Cybernews first alerted the firm about the security hole after finding the location of the online storage used by the apps by analysing the code that powers the services...

None of the text content of private messages was found to be stored in this way and the images are not labelled with user names or real names, which would make crafting targeted attacks at users more complex.

In an email M.A.D Mobile said it was grateful to the researcher for uncovering the vulnerability in the apps to prevent a data breach from occurring. But there's no guarantee that Mr Nazarovas was the only hacker to have found the image stash.

"Mr Nazarovas and his team decided to raise the alarm on Thursday while the issue was still live as they were concerned the company was not doing anything to fix it..."
Biotech

Scientists Create New Heavy-Metal Molecule: 'Berkelocene' (mercurynews.com)

An anonymous reader shared this report from the Mercury News: After a year of fastidious planning, a microscopic sample of the ultra-rare radioactive element berkelium arrived at a Berkeley Lab. With just 48 hours to experiment before it would become unusable, a group of nearly 20 researchers focused intently on creating a brand-new molecule. Using a chemical glove box, a polycarbonate glass box with protruding gloves that shields substances from oxygen and moisture, scientists combined the berkelium metal with an organic molecule containing only carbon and hydrogen to create a chemical reaction... [Post-doc researcher Dominic] Russo, researcher Stefan Minasian, and 17 other scientists at Lawrence Berkeley National Laboratory had created berkelocene, a new molecule that upends theorists' expectations about how carbon bonds with heavy-metal elements.

In the future, berkelocene may help humanity safely dispose of nuclear waste, according to a study published in the academic journal Science... The new molecular structure is, in the nomenclature of researchers, a "sandwich." In this formation, a berkelium atom, serving as the filling, lies between two 8-membered carbon rings — the "bread" — and resembles an atomic foot-long sub. "It has this very symmetric geometry, and it's the first time that that's been observed," Minasian said.

The researchers believe more accurate models for how actinide elements like uranium behave will help solve problems related to long-term nuclear waste storage.
Science

A New Image File Format Efficiently Stores Invisible Light Data (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Imagine working with special cameras that capture light your eyes can't even see -- ultraviolet rays that cause sunburn, infrared heat signatures that reveal hidden writing, or specific wavelengths that plants use for photosynthesis. Or perhaps using a special camera designed to distinguish the subtle visible differences that make paint colors appear just right under specific lighting. Scientists and engineers do this every day, and they're drowning in the resulting data. A new compression format called Spectral JPEG XL might finally solve this growing problem in scientific visualization and computer graphics. Researchers Alban Fichet and Christoph Peters of Intel Corporation detailed the format in a recent paper published in the Journal of Computer Graphics Techniques (JCGT). It tackles a serious bottleneck for industries working with these specialized images. These spectral files can contain 30, 100, or more data points per pixel, causing file sizes to balloon into multi-gigabyte territory -- making them unwieldy to store and analyze.

[...] The current standard format for storing this kind of data, OpenEXR, wasn't designed with these massive spectral requirements in mind. Even with built-in lossless compression methods like ZIP, the files remain unwieldy for practical work as these methods struggle with the large number of spectral channels. Spectral JPEG XL utilizes a technique used with human-visible images, a math trick called a discrete cosine transform (DCT), to make these massive files smaller. Instead of storing the exact light intensity at every single wavelength (which creates huge files), it transforms this information into a different form. [...]
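As a rough sketch of the underlying idea (not the paper's actual codec), a discrete cosine transform concentrates most of a smooth spectral curve's energy into a few leading coefficients, so the rest can be discarded or coarsely quantized. The wavelength grid and signal below are made up for illustration:

```python
import numpy as np
from scipy.fft import dct, idct

# One pixel's spectrum sampled at 64 wavelengths (illustrative numbers only).
rng = np.random.default_rng(0)
wavelengths = np.linspace(380, 780, 64)                  # nm
spectrum = np.exp(-((wavelengths - 550) / 60) ** 2) + 0.02 * rng.standard_normal(64)

coeffs = dct(spectrum, norm="ortho")                     # to a frequency-like domain
kept = coeffs.copy()
kept[16:] = 0.0                                          # drop high-order coefficients (lossy)
reconstructed = idct(kept, norm="ortho")

print(f"kept {np.count_nonzero(kept)} of {spectrum.size} coefficients, "
      f"max error ~ {np.max(np.abs(reconstructed - spectrum)):.3f}")
```

Because natural spectra are usually smooth, the discarded high-order coefficients carry little information, which is why the transform-then-truncate approach shrinks files so dramatically.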

According to the researchers, the massive file sizes of spectral images have reportedly been a real barrier to adoption in industries that would benefit from their accuracy. Smaller files mean faster transfer times, reduced storage costs, and the ability to work with these images more interactively without specialized hardware. The results reported by the researchers seem impressive -- with their technique, spectral image files shrink by 10 to 60 times compared to standard OpenEXR lossless compression, bringing them down to sizes comparable to regular high-quality photos. They also preserve key OpenEXR features like metadata and high dynamic range support.
The report notes that broader adoption "hinges on the continued development and refinement of the software tools that handle JPEG XL encoding and decoding."

Some scientific applications may also see JPEG XL's lossy approach as a drawback. "Some researchers working with spectral data might readily accept the trade-off for the practical benefits of smaller files and faster processing," reports Ars. "Others handling particularly sensitive measurements might need to seek alternative methods of storage."
Operating Systems

Linux Kernel 6.14 Is a Big Leap Forward In Performance, Windows Compatibility (zdnet.com)

An anonymous reader quotes a report from ZDNet, written by Steven Vaughan-Nichols: Despite the minor delay, Linux 6.14 arrives packed with cutting-edge features and improvements to power upcoming Linux distributions, such as the forthcoming Ubuntu 25.04 and Fedora 42. The big news for desktop users, especially those who like to play Windows games or run Windows programs on Linux, is the improved NTSYNC driver. This driver is designed to emulate Windows NT synchronization primitives. What that feature means for you and me is that it will significantly improve the performance of Windows programs running on Wine and Steam Play. [...] Gamers always want the best possible graphics performance, so they'll also be happy to see that Linux now supports recently launched AMD RDNA 4 graphics cards. That includes support for the AMD Radeon RX 9070 XT and RX 9070 graphics cards. Combine this support with the recently improved open-source RADV driver, and AMD gamers should see the best speed yet on their gaming rigs.

Of course, the release is not just for gamers. Linux 6.14 also includes several AMD and Intel processor enhancements. These boosts focus on power management, thermal control, and compute performance optimizations. These updates are expected to improve overall system efficiency and performance. This release also comes with the AMDXDNA driver, which provides official support for AMD's neural processing units based on the XDNA architecture. This integration enables efficient execution of AI workloads, such as convolutional neural networks and large language models, directly on supported AMD hardware. While Rust has faced some difficulties in recent months in Linux, more Rust programming language abstractions have been integrated into the kernel, laying the groundwork for future drivers written in Rust. [...] Besides drivers, Miguel Ojeda, Rust for Linux's lead developer, said recently that the introduction of the derive(CoercePointee) smart-pointer macro in Rust 1.84 is an "important milestone on the way to building a kernel that only uses stable Rust functions." This approach will also make integrating C and Rust code easier. We're getting much closer to Rust being grafted into Linux's tree.

In addition, Linux 6.14 supports Qualcomm's latest Snapdragon 8 Elite mobile processor, enhancing performance and stability for devices powered by this chipset. That support means you can expect to see much faster Android-based smartphones later this year. This release includes a patch for the so-called GhostWrite vulnerability, which can be used to root some RISC-V processors. This fix will block such attacks. Additionally, Linux 6.14 includes improvements for the copy-on-write Btrfs file system/logical volume manager. These are primarily read-balancing methods that offer flexibility for different RAID hardware configurations and workloads. Finally, support for uncached buffered I/O optimizes memory usage on systems with fast storage devices.
Linux 6.14 is available for download here.
Android

Google Will Develop the Android OS Fully In Private

An anonymous reader quotes a report from Android Authority: No matter the manufacturer, every Android phone has one thing in common: its software base. Manufacturers can heavily customize the look and feel of the Android OS they ship on their Android devices, but under the hood, the core system functionality is derived from the same open-source foundation: the Android Open Source Project. After over 16 years, Google is making big changes to how it develops the open source version of Android in an effort to streamline its development. [...] Beginning next week, all Android development will occur within Google's internal branches, and the source code for changes will only be released when Google publishes a new branch containing those changes. As this is already the practice for most Android component changes, Google is simply consolidating its development efforts into a single branch.

This change will have minimal impact on regular users. While it streamlines Android OS development for Google, potentially affecting the speed of new version development and bug reduction, the overall effect will likely be imperceptible. Therefore, don't expect this change to accelerate OS updates for your phone. This change will also have minimal impact on most developers. App developers are unaffected, as it pertains only to platform development. Platform developers, including those who build custom ROMs, will largely also see little change, since they typically base their work on specific tags or release branches, not the main AOSP branch. Similarly, companies that release forked AOSP products rarely use the main AOSP branch due to its inherent instability.

External developers who enjoy reading or contributing to AOSP will likely be dismayed by this news, as it reduces their insight into Google's development efforts. Without a GMS license, contributing to Android OS development becomes more challenging, as the available code will consistently lag behind by weeks or months. This news will also make it more challenging for some developers to keep up with new Android platform changes, as they'll no longer be able to track changes in AOSP. For reporters, this change means less access to potentially revealing information, as AOSP patches often provide insights into Google's development plans. [...] Google will share more details about this change when it announces it later this week. If you're interested in learning more, be sure to keep an eye out for the announcement and new documentation on source.android.com.
Android Authority's Mishaal Rahman says Google is "committed to publishing Android's source code, so this change doesn't mean that Android is becoming closed-source."

"What will change is the frequency of public source code releases for specific Android components," says Rahman. "Some components like the build system, update engine, Bluetooth stack, Virtualization framework, and SELinux configuration are currently AOSP-first, meaning they're developed fully in public. Most Android components like the core OS framework are primarily developed internally, although some features, such as the unlocked-only storage area API, are still developed within AOSP."
AI

DeepSeek-V3 Now Runs At 20 Tokens Per Second On Mac Studio

An anonymous reader quotes a report from VentureBeat: Chinese AI startup DeepSeek has quietly released a new large language model that's already sending ripples through the artificial intelligence industry -- not just for its capabilities, but for how it's being deployed. The 641-gigabyte model, dubbed DeepSeek-V3-0324, appeared on AI repository Hugging Face today with virtually no announcement (just an empty README file), continuing the company's pattern of low-key but impactful releases. What makes this launch particularly notable is the model's MIT license -- making it freely available for commercial use -- and early reports that it can run directly on consumer-grade hardware, specifically Apple's Mac Studio with M3 Ultra chip.

"The new DeepSeek-V3-0324 in 4-bit runs at > 20 tokens/second on a 512GB M3 Ultra with mlx-lm!" wrote AI researcher Awni Hannun on social media. While the $9,499 Mac Studio might stretch the definition of "consumer hardware," the ability to run such a massive model locally is a major departure from the data center requirements typically associated with state-of-the-art AI. [...] Simon Willison, a developer tools creator, noted in a blog post that a 4-bit quantized version reduces the storage footprint to 352GB, making it feasible to run on high-end consumer hardware like the Mac Studio with M3 Ultra chip. This represents a potentially significant shift in AI deployment. While traditional AI infrastructure typically relies on multiple Nvidia GPUs consuming several kilowatts of power, the Mac Studio draws less than 200 watts during inference. This efficiency gap suggests the AI industry may need to rethink assumptions about infrastructure requirements for top-tier model performance.
"The implications of an advanced open-source reasoning model cannot be overstated," reports VentureBeat. "Current reasoning models like OpenAI's o1 and DeepSeek's R1 represent the cutting edge of AI capabilities, demonstrating unprecedented problem-solving abilities in domains from mathematics to coding. Making this technology freely available would democratize access to AI systems currently limited to those with substantial budgets."

"If DeepSeek-R2 follows the trajectory set by R1, it could present a direct challenge to GPT-5, OpenAI's next flagship model rumored for release in coming months. The contrast between OpenAI's closed, heavily-funded approach and DeepSeek's open, resource-efficient strategy represents two competing visions for AI's future."
Google

Google Says It Might Have Deleted Your Maps Timeline Data (arstechnica.com)

Google has confirmed that a technical issue has permanently deleted location history data for numerous users of its Maps application, with no recovery possible for most affected customers. The problem emerged after Google transitioned its Timeline feature from cloud to on-device storage in 2024 to enhance privacy protections. Users began reporting missing historical location data on support forums and social media platforms in recent weeks. "This is the result of a technical issue and not user error or an intentional change," said a Google spokesperson. Only users who manually enabled encrypted cloud backups before the incident can recover their data, according to Google. The company began shifting location storage policies in 2023, initially stopping collection of sensitive location data including visits to abortion clinics and domestic violence shelters.
GNU is Not Unix

FSF Holds Live Auction of 'Historically Important' Free Software Memorabilia

In 30 minutes, the Free Software Foundation holds a live auction of memorabilia to celebrate their upcoming 40th anniversary. "By moving out of the FSF office, we got to sort through all the fun and historically important memorabilia and selected the best ones," they announced earlier — and 25 items will be up for bids. (To participate in the live auction, you must register in advance.)

"This is your chance to get your very own personal souvenir of the FSF," explains an 11-page auction booklet, "from original GNU art to a famous katana and the Internet Hall of Fame medal of the FSF's founder." That's right... a katana. Once upon a time, this 41-inch blade turned heads at the FSF's tech team office. Donated by FSF friends and fans of the XKCD webcomic #225, it became a lighthearted "weapon" in the war for user freedom. As RMS himself is anti-violence, he made a silly joke by examining the katana closely instead of brandishing it, symbolizing that software freedom can be defended with wit. In a legendary photo, this was perceived as if he sniffed the blade. Between the etched dragon on the scabbard and the wavy hamon on the blade, it's as flashy as it is symbolic — especially if you like taking on proprietary software with style (and a dash of humor).
The auction is intended "to entrust some of the historically important free software memorabilia that were in the FSF's office and archive to the free software community instead of locking them away in a storage unit where no one can enjoy them.

"Hopefully, this way some of these unique items will be displayed in galleries or on the walls of free software enthusiasts. All auction proceeds will go towards the FSF's mission to promote computer user freedom."

And speaking of user freedom, here's how they described the Internet Hall of Fame medal: When Richard M. Stallman, the founder of the FSF, was inducted into the Internet Hall of Fame, it was the ultimate nod to free software's immense impact on the Internet... The medal is shiny, and the frame is fancy, but the real radiance is the recognition that the Internet might look much more locked down and dull without those original free software seeds. Hang it on your wall, and you'll be reminded that hacking for user freedom can change the world.
AI

New iOS Update Re-Enables Apple Intelligence For Users Who Had Turned It Off

Apple's latest iOS 18.3.2 update is automatically re-enabling its Apple Intelligence feature even for users who previously disabled it, adding to mounting concerns about the company's AI strategy.

The update presents a splash screen with no option except to tap "Continue," which activates the feature. Users must then manually disable it through settings, with the AI consuming up to 7GB of storage space. This forced activation comes amid broader troubles with Apple's AI initiatives.
Businesses

Software Startup Rippling Sues Competitor Deel, Claiming a Spy Carried Out 'Corporate Espionage' (cnbc.com)

HR software startup Rippling has sued competitor Deel, alleging that Deel orchestrated corporate espionage by recruiting an employee within Rippling to steal trade secrets, including customer data, sales strategies, and internal records. The lawsuit (PDF) claims the spy shared confidential information with Deel executives and a reporter, leading to legal action under the Racketeer Influenced and Corrupt Organizations (RICO) Act. Deel denies wrongdoing and plans to counter the claims. CNBC reports: The two startups are among the world's most valuable. Investors valued Rippling at $13.5 billion in a funding round announced last year, while Deel told media outlets in 2023 that it was worth $12 billion. Deel ranked No. 28 on CNBC's 2024 Disruptor 50 list. "Weeks after Rippling is accused of violating sanctions law in Russia and seeding falsehoods about Deel, Rippling is trying to shift the narrative with these sensationalized claims," a Deel spokesperson told CNBC in an email. "We deny all legal wrongdoing and look forward to asserting our counterclaims."

Rippling confirmed its findings earlier this month. The company's general counsel sent a letter to three Deel executives that referred to a new Slack channel, and the Deel spy quickly looked for it. Rippling subsequently served a court order to the spy at its office in Dublin, Ireland requiring him to preserve information on his mobile phone. "Deel's spy lied to the court-appointed solicitor about the location of his phone, and then locked himself in a bathroom -- seemingly in order to delete evidence from his phone -- all while the independent solicitor repeatedly warned him not to delete materials from his device and that his non-compliance was breaching a court order with penal endorsement," Rippling said in Monday's filing. "The spy responded: 'I'm willing to take that risk.' He then fled the premises."
"We always prefer to win by building the best products and we don't turn to the legal system lightly," Parker Conrad, Rippling's co-founder and CEO, said in a Monday X post. "But we are taking this extraordinary step to send a clear message that this type of misconduct has no place in our industry."
Data Storage

Google Is Switching Legacy G Suite Users To Pooled Workspace Storage (theverge.com)

According to The Verge, legacy G Suite accounts will soon lose their individual storage allotment perks and be transitioned to pooled storage, which will be "shared across all users within your organization." The changes will come into effect starting May 1st. From the report: G Suite was rebranded as Workspace in 2020. G Suite legacy free edition, which Google stopped offering in 2012, provides each user with 15GB of free allocated storage and was offered for personal use -- making it ideal for families or groups that need to share a collective domain. Existing users have been permitted to access Workspace services at no additional charge, but Google says it's now making this change because pooled storage provides a "simpler and more flexible way to manage storage." "Google Workspace customers have had the benefit of pooled storage for years, and now we're rolling it out to users with this legacy offering," Google spokesperson Jenny Thomson told The Verge.

No action is required for the switch according to Google, and users cannot opt out of the pooled storage transition. The total amount of storage allocated to the entire G Suite account won't be reduced, but if more storage is required then it can be purchased "at a discount" starting at increments of 100GB, which typically costs $15. Google hasn't specified how large this discount will be. Storage limitations can still be set for each user within the G Suite account after the transition to prevent the collective storage pool from being hogged by individual users. These limits will have to be manually assigned by an account admin, however.
