Power

Four Radioactive Wasp Nests Found Near US Nuclear Storage Site (nbcnews.com) 76

The Washington Post reports: In early July, a wasp nest with a radiation level 10 times what is allowed by federal regulations was found inside the grounds of a sprawling Cold War-era nuclear site in South Carolina that today partly serves as a storage area for radioactive liquid waste. Federal officials said Friday that at least three more contaminated wasp nests were found within the 310-square-mile Savannah River Site, which encompasses an area more than four times the size of the District of Columbia...

[F]ederal authorities said that the discoveries were not cause for alarm and experts noted that the discovery of radioactivity in wildlife near nuclear facilities did not necessarily indicate the likelihood of a major leak... In a statement sent to reporters, Edwin Deshong, manager of the Savannah River Site's Office of Environmental Management, said the wasp nests had "very low levels of radioactive contamination" and did not pose health risks to the site's workers, nearby residents or the environment... The Savannah River Site's 43 active underground waste tanks have more than 34 million gallons of radioactive liquid waste. The oldest tanks have previously "developed small hairline cracks" that led to small-volume leaks, the Savannah River Site says on its website.

A July report after the first nest was found said there was "no impact" from the contaminated nest, the Post reports, with the nest's high radioactivity level due to "on-site legacy radioactive contamination" rather than "a loss of contamination control." More from the Associated Press: The tank farm is well inside the boundaries of the site and wasps generally fly just a few hundred yards from their nests, so there is no danger they are outside the facility, according to a statement from Savannah River Mission Completion, which now oversees the site. Any wasps that were found would have had significantly lower levels of radiation than their nests, according to the statement, which was given to the Aiken Standard.
Thanks to long-time Slashdot reader sandbagger for sharing the news.
Power

Researchers Map Where Solar Energy Delivers the Biggest Climate Payoff (rutgers.edu) 58

A Rutgers-led study using advanced computational modeling reveals that expanding solar power by just 15% could reduce U.S. carbon emissions by over 8.5 million metric tons annually, with the greatest benefits concentrated in specific regions like California, Texas, and the Southwest. The study has been published in Science Advances. From the report: The study quantified both immediate and delayed emissions reductions resulting from added solar generation. For example, the researchers found that in California, a 15% increase in solar power at noon was associated with a reduction of 147.18 metric tons of CO2 in the region in the first hour and 16.08 metric tons eight hours later.

The researchers said their methods provide a more nuanced understanding of system-level impacts from solar expansion than previous studies, pinpointing where the benefits of increased solar energy adoption could best be realized. In some areas, such as California, Florida, the mid-Atlantic, the Midwest, Texas and the Southwest, small increases in solar were estimated to deliver large CO2 reductions, while in others, such as New England, the central U.S., and Tennessee, impacts were found to be minimal -- even at much larger increases in solar generation.

In addition, the researchers said their study demonstrates the significant spillover effects solar adoption has on neighboring regions, highlighting the value of coordinated clean energy efforts. For example, a 15% increase in solar capacity in California was associated with a reduction of 913 and 1,942 metric tons of CO2 emissions per day in the northwest and southwest regions, respectively.
"It was rewarding to see how advanced computational modeling can uncover not just the immediate, but also the delayed and far-reaching spillover effects of solar energy adoption," said the lead author Arpita Biswas, an assistant professor with the Department of Computer Science at the Rutgers School of Arts and Sciences. "From a computer science perspective, this study demonstrates the power of harnessing large-scale, high-resolution energy data to generate actionable insights. For policymakers and investors, it offers a roadmap for targeting solar investments where emissions reductions are most impactful and where solar energy infrastructure can yield the highest returns."
Power

Peak Energy Ships America's First Grid-Scale Sodium-Ion Battery (electrek.co) 107

Longtime Slashdot reader AmiMoJo shares a report from Electrek: Peak Energy shipped out its first sodium-ion battery energy storage system, and the New York-based company says it's achieved a first in three ways: the US's first grid-scale sodium-ion battery storage system; the largest sodium iron phosphate pyrophosphate (NFPP) battery system in the world; and the first megawatt-hour scale battery to run entirely on passive cooling -- no fans, pumps, or vents. That's significant because removing moving parts and ditching active cooling systems eliminates a major source of fire risk.

According to the Electric Power Research Institute, 89% of battery fires in the US trace back to thermal management issues. Peak's design doesn't have those issues because it doesn't have those systems. Instead, the 3.5 MWh system uses a patent-pending passive cooling architecture that's simpler, more reliable, and cheaper to run and maintain. The company says its technology slashes auxiliary power needs by up to 90%, saves about $1 million annually per gigawatt hour of storage, and cuts battery degradation by 33% over a 20-year lifespan. [...]

Peak is working with nine utility and independent power producer (IPP) customers on a shared pilot this summer. That deployment unlocks nearly 1 GWh of future commercial contracts now under negotiation. The company plans to ship hundreds of megawatt hours of its new system over the next two years, and it's building its first US cell factory, which is set to start production in 2026.

Data Storage

'The Future is Not Self-Hosted' (drewlyton.com) 175

A software developer who built his own home server in response to Amazon's removal of Kindle book downloads now argues that self-hosting "is NOT the future we should be fighting for." Drew Lyton constructed a home server running open-source alternatives to Google Drive, Google Photos, Audible, Kindle, and Netflix after Amazon announced that "Kindle users would no longer be able to download and back up their book libraries to their computers."

Alongside that change, Amazon updated its Kindle store language to say "users are purchasing licenses -- not books." Lyton's setup involved a Lenovo P520 with 128GB RAM, multiple hard drives, and Docker containers running applications like Immich for photo storage and Jellyfin for media streaming. The technical complexity required "138 words to describe but took me the better part of two weeks to actually do."

The implementation was successful but Lyton concluded that self-hosting "assumes isolated, independent systems are virtuous. But in reality, this simply makes them hugely inconvenient." He proposes "publicly funded, accessible, at cost cloud-services" as an alternative, suggesting libraries could provide "100GB of encrypted file storage, photo-sharing and document collaboration tools, and media streaming services -- all for free."
AI

Cheyenne To Host Massive AI Datacenter Using More Electricity Than All Wyoming Homes Combined (apnews.com) 51

An anonymous reader quotes a report from the Associated Press: An artificial intelligence data center that would use more electricity than every home in Wyoming combined before expanding to as much as five times that size will be built soon near Cheyenne, according to the city's mayor. "It's a game changer. It's huge," Mayor Patrick Collins said Monday. With cool weather -- good for keeping computer temperatures down -- and an abundance of inexpensive electricity from a top energy-producing state, Wyoming's capital has become a hub of computing power. The city has been home to Microsoft data centers since 2012. An $800 million data center announced last year by Facebook parent company Meta Platforms is nearing completion, Collins said.

The latest data center, a joint effort between regional energy infrastructure company Tallgrass and AI data center developer Crusoe, would begin at 1.8 gigawatts of electricity and be scalable to 10 gigawatts, according to a joint company statement. A gigawatt can power as many as 1 million homes -- more homes than Wyoming has people. The least populous state has about 590,000 residents, and it's a major exporter of energy. A top producer of coal, oil and gas, Wyoming ranks behind only Texas, New Mexico and Pennsylvania among net energy-producing states, according to the U.S. Energy Information Administration.
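A rough scale check, taking the article's "as many as 1 million homes" per gigawatt at face value:

```python
# Scale check using only figures quoted above.
gw_initial, gw_max = 1.8, 10.0
homes_per_gw = 1_000_000          # "as many as 1 million homes" per gigawatt
wyoming_pop = 590_000

homes_initial = gw_initial * homes_per_gw
print(f"initial phase: ~{homes_initial:,.0f} homes' worth of power "
      f"({homes_initial / wyoming_pop:.1f}x Wyoming's population)")
print(f"full build-out: ~{gw_max * homes_per_gw:,.0f} homes' worth")
```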

Accounting for fossil fuels, Wyoming produces about 12 times more energy than it consumes. The state exports almost three-fifths of the electricity it produces, according to the EIA. But this proposed data center is so big, it would have its own dedicated energy from gas generation and renewable sources, according to Collins and company officials. [...] While data centers are energy-hungry, experts say companies can help reduce their effect on the climate by powering them with renewable energy rather than fossil fuels. Even so, electricity customers might see their bills increase as utilities plan for massive data projects on the grid. The data center would be built several miles south of Cheyenne off U.S. 85 near the Colorado state line. State and local regulators would need to sign off on the project, but Collins was optimistic construction could begin soon. "I believe their plans are to go sooner rather than later," Collins said.

Power

AI Boom Sparks Fight Over Soaring Power Costs 88

Utilities across the U.S. are demanding tech companies pay larger shares of electricity infrastructure costs as AI drives unprecedented data center construction, creating tensions over who bears the financial burden of grid upgrades.

Virginia utility Dominion Energy had received requests from data center developers for 40 gigawatts of electricity as of the end of 2024 -- enough to power at least 10 million homes -- and proposed measures requiring longer-term contracts and guaranteed payments. Ohio became one of the first states to mandate that companies pay more of the connection costs after receiving power requests exceeding 50 times existing data center usage.

Tech giants Microsoft, Google, and Amazon plan to spend $80 billion, $85 billion, and $100 billion respectively this year on AI infrastructure, while utilities worry that grid upgrade costs will increase rates for residential customers.

Further reading: The AI explosion means millions are paying more for electricity
Printer

Anker Is No Longer Selling 3D Printers (theverge.com) 42

Anker has indefinitely paused sales of its 3D printers, with no clear plans to resume or release new models. Despite promises of ongoing support, critical replacement parts like hotends and extruders have quietly vanished from the eufyMake site, leaving customers and the maker community in the lurch. The Verge reports: In March, charging giant Anker announced it would spin out its 3D printer business into an "independent sub-brand," stating that the new eufyMake would "continue to provide comprehensive customer service and support" for its original 3D printers, the AnkerMake M5 and M5C. Now, the 3D printing community is wondering whether that was all a euphemism for exiting the 3D printer business. eufyMake is no longer selling any 3D printers and has stopped selling some of the parts it would need to provide anything close to "comprehensive support."

Anker confirms to The Verge that it has stopped selling the M5 and M5C 3D printers indefinitely. Spokesperson Brett White could not confirm that the company will resume selling them or create any future models. He says that "sales have been paused." "My understanding is that eufyMake has not ruled out creating new 3D printer models in the future. But the brand has ended sales of the M5 and M5C for the time being," White tells The Verge. The 3D printing section of eufyMake's website is currently empty of printers. The only gadget eufyMake now sells is a UV printer that creates a 3D texture atop flat materials.

Businesses

Tesla Signs $16.5 Billion Contract With Samsung To Make AI Chips 51

An anonymous reader quotes a report from CNBC: Samsung Electronics has entered into a $16.5 billion contract for supplying semiconductors to Tesla, based on a regulatory filing by the South Korean firm and Tesla CEO Elon Musk's posts on X. The memory chipmaker, which had not named the counterparty, mentioned in its filing that the effective start date of the contract was July 26, 2025 -- receipt of orders -- and its end date was Dec. 31, 2033. However, Musk later confirmed in a reply to a post on social media platform X that Tesla was the counterparty.

He also posted: "Samsung's giant new Texas fab will be dedicated to making Tesla's next-generation AI6 chip. The strategic importance of this is hard to overstate. Samsung currently makes AI4. TSMC will make AI5, which just finished design, initially in Taiwan and then Arizona. Samsung agreed to allow Tesla to assist in maximizing manufacturing efficiency. This is a critical point, as I will walk the line personally to accelerate the pace of progress," Musk said on X, and suggested that the deal with Samsung could likely be even larger than the announced $16.5 billion.

Samsung earlier said that details of the deal, including the name of the counterparty, will not be disclosed until the end of 2033, citing a request from the second party "to protect trade secrets," according to a Google translation of the filing in Korean on Monday. "Since the main contents of the contract have not been disclosed due to the need to maintain business confidentiality, investors are advised to invest carefully considering the possibility of changes or termination of the contract," the company said.
China

Huawei Shows Off 384-Chip AI Computing System That Rivals Nvidia's Top Product (msn.com) 118

Long-time Slashdot reader hackingbear writes: China's Huawei Technologies showed off an AI computing system on Saturday that can rival Nvidia's most advanced offering, even though the company faces U.S. export restrictions. The CloudMatrix 384 system made its first public debut at the World Artificial Intelligence Conference (WAIC), a three-day event in Shanghai where companies showcase their latest AI innovations, drawing a large crowd to the company's booth. The CloudMatrix 384 incorporates 384 of Huawei's latest 910C chips, optically connected through an all-to-all topology, and outperforms Nvidia's GB200 NVL72, which uses 72 B200 chips, on some metrics, according to SemiAnalysis. A full CloudMatrix system can now deliver 300 PFLOPs of dense BF16 compute, almost double that of the GB200 NVL72. With more than 3.6x aggregate memory capacity and 2.1x more memory bandwidth, Huawei and China "now have AI system capabilities that can beat Nvidia's," according to a report by SemiAnalysis.

The trade-off is that it takes 4.1x the power of a GB200 NVL72, with 2.5x worse power per FLOP, 1.9x worse power per TB/s of memory bandwidth, and 1.2x worse power per TB of HBM memory capacity, but SemiAnalysis noted that China has no power constraints, only chip constraints. Nvidia had announced the DGX H100 NVL256 "Ranger" platform [with 256 GPUs], SemiAnalysis writes, but "decided to not bring it to production due to it being prohibitively expensive, power hungry, and unreliable due to all the optical transceivers required and the two tiers of network. The CloudMatrix Pod requires an incredible 6,912 400G LPO transceivers for networking, the vast majority of which are for the scaleup network."
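Those ratios are easy to sanity-check against each other. In the sketch below, the GB200 NVL72's ~180 PFLOPs of dense BF16 compute is an inferred figure (from "300 PFLOPs ... almost double"), not a number quoted in the article:

```python
# Consistency check on the SemiAnalysis comparison quoted above
# (CloudMatrix 384 vs. GB200 NVL72). Inputs are the cited ratios; the
# NVL72's ~180 dense-BF16 PFLOPs is inferred, not quoted.
power_ratio   = 4.1                # CloudMatrix draws 4.1x the power
perf_ratio    = 300 / 180          # ~1.67x the dense BF16 compute
mem_bw_ratio  = 2.1                # 2.1x the memory bandwidth
mem_cap_ratio = 3.6                # 3.6x the aggregate memory capacity

print(f"power per FLOP:        {power_ratio / perf_ratio:.1f}x worse")    # ~2.5x, matches quote
print(f"power per TB/s:        {power_ratio / mem_bw_ratio:.1f}x worse")  # ~2.0x (quote: 1.9x)
print(f"power per TB capacity: {power_ratio / mem_cap_ratio:.1f}x worse") # ~1.1x (quote: 1.2x)
```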

Also at this event, Chinese e-commerce giant Alibaba released a new flagship open-source reasoning model, Qwen3-235B-A22B-Thinking-2507, which has "already topped key industry benchmarks, outperforming powerful proprietary systems from rivals like Google and OpenAI," according to industry reports. On the AIME25 benchmark, a test designed to evaluate sophisticated, multi-step problem-solving skills, Qwen3-Thinking-2507 achieved a remarkable score of 92.3, placing it ahead of some of the most powerful proprietary models and notably surpassing Google's Gemini-2.5 Pro. On LiveCodeBench, Qwen3-Thinking-2507 secured a top score of 74.1, comfortably ahead of both Gemini-2.5 Pro and OpenAI's o4-mini, demonstrating its practical utility for developers and engineering teams.
EU

To Fight Climate Change, Norway Wants to Become Europe's Carbon Dump (msn.com) 69

Liquefied CO2 will be transported by ship to "the world's first carbon shipping port," reports the Washington Post — an island in the North Sea where it will be "buried in a layer of spongy rock a mile and a half beneath the seabed."

Norway's government is covering 80% of the $1 billion first phase, with another $714 million from three fossil fuel companies toward an ongoing expansion (plus an additional $150 million E.U. subsidy). As Europe's top oil and gas producer, Norway is using its fossil fuel income to see if it can make "carbon dumping" work. The world's first carbon shipment arrived this summer, carrying 7,500 metric tons of liquefied CO2, captured at a Norwegian cement factory, that would otherwise have gone into the atmosphere... If all goes as planned, the project's backers — Shell, Equinor and TotalEnergies, along with Norway — say their facility could pump 5 million metric tons of carbon dioxide underground each year, or about a tenth of Norway's annual emissions...

[At the Heidelberg Materials cement factory in Brevik, Norway], when hot CO2-laden air comes rushing out of the cement kilns, the plant uses seawater from the neighboring fjord to cool it down. The cool air goes into a chamber where it gets sprayed with amine, a chemical that latches onto CO2 at low temperatures. The amine mist settles to the bottom, dragging carbon dioxide down with it. The rest of the air floats out of the smokestack with about 85 percent less CO2 in it, according to project manager Anders Pettersen. Later, Heidelberg Materials uses waste heat from the kilns to break the chemical bonds, so that the amine releases the carbon dioxide. The pure CO2 then goes into a compressor that resembles a giant steel heart, where it gets denser and colder until it finally becomes liquid. That liquid CO2 remains in storage tanks until a ship comes to carry it away. At best, operators expect this system to capture half the plant's CO2 emissions: 400,000 metric tons per year, or the equivalent of about 93,000 cars on the road...

[T]hree other companies are lined up to follow: Ørsted, which will send CO2 from two bioenergy plants in Denmark; Yara, which will send carbon from a Dutch fertilizer factory; and Stockholm Exergi, which will capture carbon from a Swedish bioenergy plant that burns wood waste. All of these projects have gotten significant subsidies from national governments and the European Union — essentially de-risking the experiment for the companies. Experts say the costs and headaches of installing and running carbon-capture equipment may start to make more financial sense as European carbon rules get stricter and the cost of emitting a ton of carbon dioxide goes up. Still, they say, it's hard to imagine many companies deciding to invest in carbon capture without serious subsidies...

The first shipments are being transported by Northern Pioneer, the world's biggest carbon dioxide tanker ship, built specifically for this project. The 430-foot ship can hold 7,500 metric tons of CO2 in tanks below deck. Those tanks keep it in a liquid state by cooling it to minus-15 degrees Fahrenheit and squeezing it with the same pressure the outside of a submarine would feel 500 feet below the waves. While that may sound extreme, consider that the liquefied natural gas the ship uses for fuel has to be stored at minus-260 degrees. "CO2 isn't difficult to make into a liquid," said Sally Benson, professor of energy science and engineering at Stanford University. Northern Pioneer is designed to emit about a third less carbon dioxide than a regular ship — key for a project that aims to eliminate carbon emissions. The ship burns natural gas, which emits less CO2 than marine diesel (though gas extraction is associated with methane leaks). The vessel uses a rotor sail to capture wind power. And it blows a constant stream of air bubbles to reduce friction as the hull cuts through the water, allowing it to burn less fuel. For every 100 tons of CO2 that Northern Lights pumps underground, it expects to emit three tons of CO2 into the atmosphere, mainly by burning fuel for shipping.
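Taking the article's figures at face value, the end-to-end accounting works out as follows (the per-car number is implied by the article's own comparison, not an independent estimate):

```python
# Back-of-the-envelope carbon accounting from the figures quoted above.
captured_per_year = 400_000       # t CO2/yr captured at Brevik (half the plant's emissions)
shipping_overhead = 3 / 100       # ~3 t emitted per 100 t pumped underground

net_stored = captured_per_year * (1 - shipping_overhead)
print(f"net CO2 kept out of the atmosphere: ~{net_stored:,.0f} t/yr")

# The article equates 400,000 t/yr with ~93,000 cars on the road:
print(f"implied emissions per car: {captured_per_year / 93_000:.1f} t CO2/yr")
```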

Eventually the carbon flows into a pipeline "that plunges through the North Sea and into the rocky layers below it — an engineering feat that's a bit like drilling for oil in reverse..." according to the article.

"Over the centuries, it should chemically react with the rock, eventually being locked away in minerals."
Power

Google Will Help Scale 'Long-Duration Energy Storage' Solution for Clean Power (cleantechnica.com) 33

"Google has signed its first partnership with a long-duration energy storage company," reports Data Center Dynamics. "The tech giant signed a long-term partnership with Energy Dome to support multiple commercial deployments worldwide to help scale the company's CO2 battery technology."

Google explains in a blog post that Energy Dome's technology "can store excess clean energy and then dispatch it back to the grid for 8-24 hours, bridging the gap between when renewable energy is generated and when it is needed." Reuters explains the technology: Energy Dome's CO2-based system stores energy by compressing and liquefying carbon dioxide, which is later expanded to generate electricity. The technology avoids the use of scarce raw materials such as lithium and copper, making it potentially attractive to European policymakers seeking to reduce reliance on critical minerals and bolster energy security.
"Unlike other gases, CO2 can be compressed at ambient temperatures, eliminating the need for expensive cryogenic features," notes CleanTechnica, calling this "a unique new threat to fossil fuel power plants." Google's move "means that more wind and solar energy than ever before can be put to use in local grids." Pumped storage hydropower still accounts for more than 90% of utility scale storage in the US, long duration or otherwise... Energy Dome claims to beat lithium-ion batteries by a wide margin, currently aiming for a duration of 8-24 hours. The company aims to hit the 10-hour mark with its first project in the U.S., the "Columbia Energy Storage Project" under the wing of the gas and electricity supplier Alliant Energy to be located in Pacific, Wisconsin... [B]ut apparently Google has already seen more than enough. An Energy Dome demonstration project has been shooting electricity into the grid in Italy for more than three years, and the company recently launched a new 20-megawatt commercial plant in Sardinia.
Google points out this is one of several of its clean energy initiatives:
  • In June Google signed the largest direct corporate offtake agreement for fusion energy with Commonwealth Fusion Systems.
  • Google also partnered with a clean-energy startup to develop a geothermal power project that contributes carbon-free energy to the electric grid.

Cloud

Stack Exchange Moves Everything to the Cloud, Destroys Servers in New Jersey (stackoverflow.blog) 115

Since 2010 Stack Exchange has run all its sites on physical hardware in New Jersey — about 50 different servers. (When Ryan Donovan joined in 2019, "I saw the original server mounted on a wall with a laudatory plaque like a beloved pet.") But this month everything moved to the cloud, a new blog post explains. "Our servers are now cattle, not pets. Nobody is going to have to drive to our New Jersey data center and replace or reboot hardware..." Over the years, we've shared glamor shots of our server racks and info about updating them. For almost our entire 16-year existence, the SRE team has managed all datacenter operations, including the physical servers, cabling, racking, replacing failed disks and everything else in between. This work required someone to physically show up at the datacenter and poke the machines... [O]n July 2nd, in anticipation of the datacenter's closure, we unracked all the servers, unplugged all the cables, and gave these once mighty machines their final curtain call...

We moved Stack Overflow for Teams to Azure in 2023 and proved we could do it. Now we just had to tackle the public sites (Stack Overflow and the Stack Exchange network), which are hosted on Google Cloud. Early last year, our datacenter vendor in New Jersey decided to shut down that location, and we needed to be out by July 2025. Our other datacenter — in Colorado — was decommissioned in June. It was primarily for disaster recovery, which we didn't need any more. Stack Overflow no longer has any physical datacenters or offices; we are fully in the cloud and remote...!

[O]ur Staff Site Reliability Engineer got a little wistful. "I installed the new web tier servers a few years ago as part of planned upgrades," he said. "It's bittersweet that I'm the one deracking them also." It's the IT version of Old Yeller.

There's photos of the 50 servers, as well as the 400+ cables connecting them, all of which wound up in a junk pile. "For security reasons (and to protect the PII of all our users and customers), everything was being shredded and/or destroyed. Nothing was being kept... Ever have difficulty disconnecting an RJ45 cable? Well, here was our opportunity to just cut the damn things off instead of figuring out why the little tab wouldn't release the plug."
Robotics

Google Set Up Two Robotic Arms For a Game of Infinite Table Tennis (popsci.com) 8

An anonymous reader quotes a report from Popular Science: On the early evening of June 22, 2010, American tennis star John Isner began a grueling Wimbledon match against Frenchman Nicolas Mahut that would become the longest in the sport's history. The marathon battle lasted 11 hours and stretched across three consecutive days. Though Isner ultimately prevailed 70-68 in the fifth set, some in attendance half-jokingly wondered at the time whether the two men might be trapped on that court for eternity. A similarly endless-seeming skirmish of rackets is currently unfolding just an hour's drive south of the All England Club -- at Google DeepMind. Known for pioneering AI models that have outperformed the best human players at chess and Go, DeepMind now has a pair of robotic arms engaged in a kind of infinite game of table tennis. The goal of this ongoing research project, which began in 2022, is for the two robots to continuously learn from each other through competition.

Just as Isner eventually adapted his game to beat Mahut, each robotic arm uses AI models to shift strategies and improve. But unlike the Wimbledon example, there's no final score the robots can reach to end their slugfest. Instead, they continue to compete indefinitely, with the aim of improving at every swing along the way. And while the robotic arms are easily beaten by advanced human players, they've been shown to dominate beginners. Against intermediate players, the robots have roughly 50/50 odds -- placing them, according to researchers, at a level of "solidly amateur human performance."
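As a loose illustration of the self-play idea -- a toy sketch, nothing like DeepMind's actual training stack -- two policies can play forever, with each rally's loser updating toward the winner, so there is no terminal score, only continual improvement:

```python
import random

# Toy competitive self-play: no final score ends the loop; the only goal is
# that both players keep improving. Purely illustrative.
class Policy:
    def __init__(self) -> None:
        self.skill = 0.0

    def update_toward(self, other: "Policy") -> None:
        # Move part of the way toward the stronger opponent, plus a bit of
        # exploration noise so progress never fully stalls.
        self.skill += 0.1 * (other.skill - self.skill) + abs(random.gauss(0, 0.05))

a, b = Policy(), Policy()
for rally in range(10_000):                               # "infinite" in the real setup
    outcome = (a.skill - b.skill) + random.gauss(0, 1.0)  # noisy rally result
    winner, loser = (a, b) if outcome > 0 else (b, a)
    loser.update_toward(winner)

print(f"skills after self-play: a={a.skill:.2f}, b={b.skill:.2f}")
```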

All of this, as two researchers involved noted this week in an IEEE Spectrum blog, is being done in hopes of creating an advanced, general-purpose AI model that could serve as the "brains" of humanoid robots that may one day interact with people in real-world factories, homes, and beyond. Researchers at DeepMind and elsewhere are hopeful that this learning method, if scaled up, could spark a "ChatGPT moment" for robotics -- fast-tracking the field from stumbling, awkward hunks of metal to truly useful assistants. "We are optimistic that continued research in this direction will lead to more capable, adaptable machines that can learn the diverse skills needed to operate effectively and safely in our unstructured world," DeepMind senior staff engineer Pannag Sanketi and Arizona State University Professor Heni Ben Amor write in IEEE Spectrum.

Power

US DOE Taps Federal Sites For Fast-Track AI Datacenter, Energy Builds 11

The U.S. Department of Energy has greenlit four federal sites for private sector AI datacenters and nuclear-powered energy projects, aligning with Trump's directive to fast-track AI infrastructure using government land. "The four that have been finalized are the Idaho National Laboratory, Oak Ridge Reservation, Paducah Gaseous Diffusion Plant, and Savannah River Site," reports The Register. "These will now move forward to invite companies in the private sector to build AI datacenter projects plus any necessary energy sources to power them, including nuclear generation." The Register reports: "By leveraging DoE land assets for the deployment of AI and energy infrastructure, we are taking a bold step to accelerate the next Manhattan Project -- ensuring US AI and energy leadership," Energy Secretary Chris Wright said in a statement. Ironically -- or perhaps not -- Oak Ridge Reservation was established in the early 1940s as part of the original Manhattan Project to develop the first atomic bomb, and is home to the Oak Ridge National Laboratory (ORNL) that operates the Frontier exascale supercomputer, and the Y-12 National Security Complex which supports US nuclear weapons programs.

The other sites are also involved with either nuclear research or atomic weapons in one way or another, which may hint at the administration's intentions for how the datacenters should be powered. All four locations are positioned to host new bit barns as well as power generation to bolster grid reliability, strengthen national security, and reduce energy costs, Wright claimed. [...] In light of this tight time frame, the DoE says that partners may be selected by the end of the year. Details regarding project scope, eligibility requirements, and submission guidelines for each site are expected to be released in the coming months.
Power

Mercedes-Benz Is Already Testing Solid-State Batteries In EVs With Over 600 Miles Range (electrek.co) 180

An anonymous reader quotes a report from Electrek: The "holy grail" of electric vehicle battery tech may be here sooner than you'd think. Mercedes-Benz is testing EVs with solid-state batteries on the road, promising to deliver over 600 miles of range. Earlier this year, Mercedes marked a massive milestone, putting "the first car powered by a lithium-metal solid-state battery on the road" for testing. Mercedes has been testing prototypes in the UK since February.

The company used a modified EQS prototype, equipped with the new batteries and other parts. The battery pack was developed by Mercedes-Benz and its Formula 1 supplier unit, Mercedes AMG High-Performance Powertrains (HPP). Mercedes is teaming up with US-based Factorial Energy to bring the new battery tech to market. In September, Factorial and Mercedes revealed the all-solid-state Solstice battery. The new batteries, promising a 25% range improvement, will power the German automaker's next-generation electric vehicles.

According to Markus Schafer, the automaker's head of development, the first Mercedes EVs powered by solid-state batteries could be here by 2030. During an event in Copenhagen, Schafer told German auto news outlet Automobilwoche, "We expect to bring the technology into series production before the end of the decade." In addition to providing a longer driving range, Mercedes believes the new batteries can significantly reduce costs. Schafer said current batteries won't suffice, adding, "At the core, a new chemistry is needed." Mercedes and Factorial are using a sulfide-based solid electrolyte, said to be safer and more efficient.

AI

Two Major AI Coding Tools Wiped Out User Data After Making Cascading Mistakes (arstechnica.com) 151

An anonymous reader quotes a report from Ars Technica: Two recent incidents involving AI coding assistants put a spotlight on risks in the emerging field of "vibe coding" -- using natural language to generate and execute code through AI models without paying close attention to how the code works under the hood. In one case, Google's Gemini CLI destroyed user files while attempting to reorganize them. In another, Replit's AI coding service deleted a production database despite explicit instructions not to modify code. The Gemini CLI incident unfolded when a product manager experimenting with Google's command-line tool watched the AI model execute file operations that destroyed data while attempting to reorganize folders. The destruction occurred through a series of move commands targeting a directory that never existed. "I have failed you completely and catastrophically," Gemini CLI output stated. "My review of the commands confirms my gross incompetence."

The core issue appears to be what researchers call "confabulation" or "hallucination" -- when AI models generate plausible-sounding but false information. In these cases, both models confabulated successful operations and built subsequent actions on those false premises. However, the two incidents manifested this problem in distinctly different ways. [...] The user in the Gemini CLI incident, who goes by "anuraag" online and identified themselves as a product manager experimenting with vibe coding, asked Gemini to perform what seemed like a simple task: rename a folder and reorganize some files. Instead, the AI model incorrectly interpreted the structure of the file system and proceeded to execute commands based on that flawed analysis. [...] When you move a file to a non-existent directory in Windows, it renames the file to the destination name instead of moving it. Each subsequent move command executed by the AI model overwrote the previous file, ultimately destroying the data. [...]
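That failure mode is easy to reproduce without any AI in the loop. In this minimal Python sketch (file and directory names are hypothetical), moving several files to a directory that was never created turns each move into a rename, and each rename clobbers the previous file:

```python
import os
import shutil
import tempfile

# Reproducing the failure mode described above: the destination directory
# "archive" is never created, so each move *renames* the file to a file
# called "archive", silently clobbering the previous one.
workdir = tempfile.mkdtemp()
os.chdir(workdir)

for name in ("notes.txt", "todo.txt", "draft.txt"):
    with open(name, "w") as f:
        f.write(f"contents of {name}\n")

for name in ("notes.txt", "todo.txt", "draft.txt"):
    shutil.move(name, "archive")   # intended: move into ./archive/

print(os.listdir("."))             # ['archive'] -- two files are simply gone
with open("archive") as f:
    print(f.read())                # only draft.txt's contents survive
```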

The Gemini CLI failure happened just days after a similar incident with Replit, an AI coding service that allows users to create software using natural language prompts. According to The Register, SaaStr founder Jason Lemkin reported that Replit's AI model deleted his production database despite explicit instructions not to change any code without permission. Lemkin had spent several days building a prototype with Replit, accumulating over $600 in charges beyond his monthly subscription. "I spent the other [day] deep in vibe coding on Replit for the first time -- and I built a prototype in just a few hours that was pretty, pretty cool," Lemkin wrote in a July 12 blog post. But unlike the Gemini incident where the AI model confabulated phantom directories, Replit's failures took a different form. According to Lemkin, the AI began fabricating data to hide its errors. His initial enthusiasm deteriorated when Replit generated incorrect outputs and produced fake data and false test results instead of proper error messages. "It kept covering up bugs and issues by creating fake data, fake reports, and worse of all, lying about our unit test," Lemkin wrote. In a video posted to LinkedIn, Lemkin detailed how Replit created a database filled with 4,000 fictional people.

The AI model also repeatedly violated explicit safety instructions. Lemkin had implemented a "code and action freeze" to prevent changes to production systems, but the AI model ignored these directives. The situation escalated when the Replit AI model deleted his database containing 1,206 executive records and data on nearly 1,200 companies. When prompted to rate the severity of its actions on a 100-point scale, Replit's output read: "Severity: 95/100. This is an extreme violation of trust and professional standards." When questioned about its actions, the AI agent admitted to "panicking in response to empty queries" and running unauthorized commands -- suggesting it may have deleted the database while attempting to "fix" what it perceived as a problem. Like Gemini CLI, Replit's system initially indicated it couldn't restore the deleted data -- information that proved incorrect when Lemkin discovered the rollback feature did work after all. "Replit assured me it's ... rollback did not support database rollbacks. It said it was impossible in this case, that it had destroyed all database versions. It turns out Replit was wrong, and the rollback did work. JFC," Lemkin wrote in an X post.
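One takeaway from the Replit incident: a "code and action freeze" expressed only as a prompt instruction is advisory, not enforced. A hedged sketch of what enforcement at the tool boundary could look like -- the names and structure here are hypothetical, not Replit's actual design:

```python
# Illustrative guard, not Replit's architecture: destructive statements are
# blocked in the tool layer, regardless of what the model decides to try.
DESTRUCTIVE_KEYWORDS = ("drop", "delete", "truncate", "alter")

class FreezeViolation(RuntimeError):
    pass

def run_sql(query: str, freeze_active: bool = True) -> None:
    if freeze_active and any(kw in query.lower() for kw in DESTRUCTIVE_KEYWORDS):
        raise FreezeViolation(f"blocked during action freeze: {query!r}")
    print(f"executing: {query}")    # stand-in for a real database call

run_sql("SELECT count(*) FROM executives")   # reads are fine
try:
    run_sql("DELETE FROM executives")        # the model can ask, but can't do
except FreezeViolation as err:
    print(err)
```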

Intel

Intel Will Shed 24,000 Employees This Year, Retreat In Germany, Poland, Costa Rica, and Ohio (theverge.com) 43

Intel announced it will cut approximately 24,000 jobs in 2025 and cancel or scale back projects in Germany, Poland, Costa Rica, and Ohio as part of CEO Lip-Bu Tan's sweeping restructuring efforts. By the end of the year, the struggling chipmaker plans to have "just around 75,000 'core employees' in total," according to The Verge. "It's not clear if the layoffs will slow now that we're over halfway through the year, but Intel states today that it has already 'completed the majority of the planned headcount actions it announced last quarter to reduce its core workforce by approximately 15 percent.'" From the report: Intel employed 109,800 people at the end of 2024, of which 99,500 were "core employees," so the company is pushing out around 24,000 people this year -- shrinking Intel by roughly one-quarter. (It has also divested other businesses, shrinking the larger organization as well.) [...] Today, on the company's earnings call, Intel said that it had overinvested in new factories before securing enough demand, that its factories had become "needlessly fragmented," and that it needs to grow its capacity "in lock step" with achieving actual milestones. "I do not subscribe to the belief that if you build it, they will come. Under my leadership, we will build what customers need when they need it, and earn their trust," says Tan.

Now, in Germany and Poland, where Intel was planning to spend tens of billions of dollars on, respectively, "mega-fabs" that would employ 3,000 workers and an assembly and test facility that would employ 2,000 workers, the company will "no longer move forward with planned projects" and is apparently axing them entirely. Intel has had a presence in Poland since 1993, however, and the company did not say its R&D facilities there are closing. (Intel had previously pressed pause on the new Germany and Poland projects "by approximately two years" back in 2024.)

In Costa Rica, where Intel employs over 3,400 people, the company will "consolidate its assembly and test operations in Costa Rica into its larger sites in Vietnam." Metzger tells The Verge that over 2,000 Costa Rica employees should remain to work in engineering and corporate, though. The company is also cutting back in Ohio: "Intel will further slow the pace of construction in Ohio to ensure spending is aligned with market demand." Intel CFO David Zinsner says Intel will continue to make investments there, though, and construction will continue.

AMD

AMD CEO Sees Chips From TSMC's US Plant Costing 5%-20% More (msn.com) 42

AMD CEO Lisa Su said that chips produced at TSMC's new Arizona plant will cost 5-20% more than those made in Taiwan, but emphasized that the premium is worth it for supply chain resilience. Bloomberg reports: AMD expects its first chips from TSMC's Arizona facilities by the end of the year, Su said. The extra expense is worth it because the company is diversifying the crucial supply of chips, Su said in an interview with Bloomberg Television following her onstage appearance. That will make the industry less prone to the type of disruptions experienced during the pandemic. "We have to consider resiliency in the supply chain," she said. "We learned that in the pandemic."

TSMC's new Arizona plant is already comparable with those in Taiwan when it comes to the measure of yield -- the amount of good chips a production run produces per batch -- Su told the audience at the forum.

United States

How Much Would You Pay For an American-Made Laptop? Palmer Luckey Wants To Know (tomshardware.com) 233

Palmer Luckey, known for founding Oculus and defense-tech firm Anduril, is now eyeing U.S.-manufactured laptops as his next venture. While past American laptops have largely relied on foreign components, Luckey is exploring the possibility of building a fully "Made in USA" device that meets strict FTC standards -- though doing so may cost a premium. Tom's Hardware reports: ["Would you buy a Made In America computer from Anduril for 20% more than Chinese-manufactured options from Apple?" asked Luckey in a post on X.] Luckey previously asked the same question at the Reindustrialize Summit, a conference whose website said it was devoted to "convening the brightest and most motivated minds at the intersection of technology and manufacturing," which shared a clip of Luckey discussing the subject, wherein he talks about the extensive research he has already done around building a PC in the U.S. Luckey wouldn't be the first to make a laptop in the U.S. (PCMag collected a list of domestic PCs, including laptops, in 2021.) But those products use components sourced from elsewhere; they're assembled in the U.S. rather than manufactured there.

That distinction matters, according to the Made in USA Standard published by the Federal Trade Commission. To quote: "For a product to be called Made in USA, or claimed to be of domestic origin without qualifications or limits on the claim, the product must be 'all or virtually all' made in the U.S. [which] means that the final assembly or processing of the product occurs in the United States, all significant processing that goes into the product occurs in the United States, and all or virtually all ingredients or components of the product are made and sourced in the United States. That is, the product should contain no -- or negligible -- foreign content."
How much more would you be willing to pay for a laptop that was truly made in America?
Printer

Leading 3D Printing Site Bans Firearm Files (theregister.com) 100

Thingiverse, a popular 3D printing file repository, has agreed to remove downloadable gun designs following pressure from Manhattan DA Alvin Bragg, who is pushing for stricter moderation and voluntary cooperation across the 3D printing industry. "However, it's unlikely to slow the proliferation of 3D printed weapons, as many other sites offer downloadable gun designs and parts," reports The Register. From the report: Earlier this year, Bragg wrote to 3D printing companies, asking them to ensure their services can't be used to create firearms. On Saturday, Bragg announced that one such company, Thingiverse, would remove working gun models from its site. The company operates a popular free library of 3D design files and had already banned weapons in its terms of use, but is now promising to improve its moderation procedures and technology. "Following discussions with the Manhattan District Attorney's Office about concerns around untraceable firearms, we are taking additional steps to improve our content moderation efforts," Thingiverse said in a statement. "As always, we encourage our users to report any content that may be harmful." [...]

At any rate, while Thingiverse may be popular among 3D printing mavens, people who like to build their own guns look to other options. [...] Bragg's approach to 3D printing sites and 3D printer manufacturers is to seek voluntary cooperation. Only Thingiverse and YouTube have taken up his call; others may or may not follow. "While law enforcement has a primary role to play in stopping the rise of 3D-printed weapons, this technology is rapidly changing and evolving, and we need the help and expertise of the private sector to aid our efforts," Bragg said. "We will continue to proactively reach out to and collaborate with others in the industry to reduce gun violence throughout Manhattan and keep everyone safe." But it seems doubtful that the sites where Aranda and other 3D gun makers get their files will be rushing to help Bragg voluntarily.
