
How Reliable Are Modern CPUs? (theregister.com) 64

Slashdot reader ochinko (user #19,311) shares The Register's report about a recent presentation by Google engineer Peter Hochschild. His team discovered machines with higher-than-expected hardware errors that "showed themselves sporadically, long after installation, and on specific, individual CPU cores rather than entire chips or a family of parts." The Google researchers examining these silent corrupt execution errors (CEEs) concluded "mercurial cores" were to blame: CPUs that miscalculated occasionally, under different circumstances, in a way that defied prediction... The errors were not the result of chip architecture design missteps, and they're not detected during manufacturing tests. Rather, Google engineers theorize, the errors have arisen because we've pushed semiconductor manufacturing to a point where failures have become more frequent and we lack the tools to identify them in advance.

In a paper titled "Cores that don't count" [PDF], Hochschild and colleagues Paul Turner, Jeffrey Mogul, Rama Govindaraju, Parthasarathy Ranganathan, David Culler, and Amin Vahdat cite several plausible reasons why the unreliability of computer cores is only now receiving attention, including larger server fleets that make rare problems more visible, increased attention to overall reliability, and software development improvements that reduce the rate of software bugs. "But we believe there is a more fundamental cause: ever-smaller feature sizes that push closer to the limits of CMOS scaling, coupled with ever-increasing complexity in architectural design," the researchers state, noting that existing verification methods are ill-suited for spotting flaws that occur sporadically or as a result of physical deterioration after deployment.

Facebook has noticed the errors, too. In February, the social ad biz published a related paper, "Silent Data Corruption at Scale," that states, "Silent data corruptions are becoming a more common phenomena in data centers than previously observed...."

The risks posed by misbehaving cores include not only crashes, which the existing fail-stop model for error handling can accommodate, but also incorrect calculations and data loss, which may go unnoticed and pose a particular risk at scale. Hochschild recounted an instance where Google's errant hardware conducted what might be described as an auto-erratic ransomware attack. "One of our mercurial cores corrupted encryption," he explained. "It did it in such a way that only it could decrypt what it had wrongly encrypted."
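One standard defense against this failure mode is to turn silent corruption into a detectable fail-stop error by immediately verifying the result. The sketch below is purely illustrative (a toy XOR "cipher", not anything Google describes); note that in Google's anecdote the corruption was self-consistent, so the verification pass would have to run on a different core or machine to catch it:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_with_verify(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt, then round-trip decrypt and compare.

    A core that miscomputes the encryption will usually also fail this
    check, converting a silent corruption into a loud error. Against a
    self-consistent mercurial core, the check must run elsewhere.
    """
    ciphertext = xor_cipher(plaintext, key)
    if xor_cipher(ciphertext, key) != plaintext:
        raise RuntimeError("round-trip mismatch: possible CPU corruption")
    return ciphertext

ciphertext = encrypt_with_verify(b"backup block", b"k3y")
assert xor_cipher(ciphertext, b"k3y") == b"backup block"
```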

How common is the problem? The Register notes that Google's researchers shared a ballpark figure "on the order of a few mercurial cores per several thousand machines, similar to the rate reported by Facebook."
Hardware

Apple Working On iPad Pro With Wireless Charging, New iPad Mini (bloomberg.com) 11

An anonymous reader quotes a report from Bloomberg: Apple is working on a new iPad Pro with wireless charging and the first iPad mini redesign in six years, seeking to continue momentum for a category that saw rejuvenated sales during the pandemic. The Cupertino, California-based company is planning to release the new iPad Pro in 2022 and the iPad mini later this year [...]. The main design change in testing for the iPad Pro is a switch to a glass back from the current aluminum enclosure. The updated iPad mini is planned to have narrower screen borders while the removal of its home button has also been tested.

For the new Pro model, the switch to a glass back is being tested, in part, to enable wireless charging for the first time. Making the change in material would bring iPads closer to iPhones, which Apple has transitioned from aluminum to glass backs in recent years. Apple's development work on the new iPad Pro is still early, and the company's plans could change or be canceled before next year's launch [...]. Wireless charging replaces the usual power cable with an inductive mat, which makes it easier for users to top up their device's battery. It has grown into a common feature in smartphones but is a rarity among tablets. Apple added wireless charging to iPhones in 2017 and last year updated it with a magnet-based MagSafe system that ensured more consistent charging speeds.

The company is testing a similar MagSafe system for the iPad Pro. Wireless charging will likely be slower than directly plugging in a charger to the iPad's Thunderbolt port, which will remain as part of the next models. As part of its development of the next iPad Pro, Apple is also trying out technology called reverse wireless charging. That would allow users to charge their iPhone or other gadgets by laying them on the back of the tablet. Apple had previously been working on making this possible for the iPhone to charge AirPods and Apple Watches. In addition to the next-generation iPad Pro and iPad mini, Apple is also working on a thinner version of its entry-level iPad geared toward students. That product is planned to be released as early as the end of this year, about the same time as the new iPad mini.
Apple is still reportedly working on a technology similar to its failed AirPower, a charging mat designed to simultaneously charge an iPhone, Apple Watch and AirPods. People familiar with the matter said it's also internally investigating alternative wireless charging methods that can work over greater distances than an inductive connection.
Power

7-11 Is Opening 500 EV Charging Stations By the End of 2022 (cnet.com) 168

7-11 announced Tuesday that it will be placing 500 EV chargers at 250 stores in the U.S. and Canada by the end of 2022. CNET reports: OK, but if they can't keep the Slurpee machine up and running, what kind of charging can users expect? Well, we don't know, and 7-11 isn't saying, but we do know that they will be DC fast-chargers, and it looks like they'll be supplied by ChargePoint, so we'd bet on anything from 60-ish kilowatts to 125 kilowatts. These new chargers will join 7-11's small network of 22 charging stations at 14 stores in four states, and the whole thing is a part of 7-11's ongoing work to reduce its carbon footprint.
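For a sense of what that charger range means in practice, here is a rough charge-time estimate; the 60 kWh pack size, 10-to-80% window, and 90% efficiency are assumptions for illustration, not figures from the article:

```python
def charge_time_hours(battery_kwh, charger_kw,
                      soc_from=0.1, soc_to=0.8, efficiency=0.9):
    """Rough charge time, ignoring the power taper near full charge."""
    energy_needed_kwh = battery_kwh * (soc_to - soc_from)
    return energy_needed_kwh / (charger_kw * efficiency)

# A 60 kWh pack from 10% to 80%:
print(f"{charge_time_hours(60, 60) * 60:.0f} min at 60 kW")    # ~47 min
print(f"{charge_time_hours(60, 125) * 60:.0f} min at 125 kW")  # ~22 min
```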
Wireless Networking

Samsung Will Shut Down the v1 SmartThings Hub This Month (arstechnica.com) 86

Samsung is killing the first-generation SmartThings Hub at the end of the month, kicking off phase two of its plan to shut down the SmartThings ecosystem and force users over to in-house Samsung infrastructure. "Phase one was in October, when Samsung killed the Classic SmartThings app and replaced it with a byzantine disaster of an app that it developed in house," writes Ars Technica's Ron Amadeo. "Phase three will see the shutdown of the SmartThings Groovy IDE, an excellent feature that lets members of the community develop SmartThings device handlers and complicated automation apps." From the report: The SmartThings Hub is basically a Wi-Fi access point -- but for your smart home stuff instead of your phones and laptops. Instead of Wi-Fi, SmartThings is the access point for a Zigbee and Z-Wave network, two ultra low-power mesh networks used by smart home devices. [...] The Hub connects your smart home network to the Internet, giving you access to a control app and connecting to other services like your favorite voice assistant. You might think that killing the old Hub could be a ploy to sell more hardware, but Samsung -- a hardware company -- is actually no longer interested in making SmartThings hardware. The company passed manufacturing for the latest "SmartThings Hub (v3)" to German Internet-of-things company Aeotec. The new Hub is normally $125, but Samsung is offering existing users a dirt-cheap $35 upgrade price.

For users who have to buy a new hub, migrating between hubs in the SmartThings ecosystem is a nightmare. Samsung doesn't provide any kind of migration program, so you have to unpair every single individual smart device from your old hub to pair it to the new one. This means you'll need to perform some kind of task on every light switch, bulb, outlet, and sensor, and you'll have to do the same for any other smart thing you've bought over the years. Doing this on each device is a hassle that usually involves finding the manual to look up the secret "exclusion" input, which is often some arcane Konami code. Picture holding the top button on a paddle light for seven seconds until a status light starts blinking and then opening up the SmartThings app to unpair it. Samsung is also killing the "SmartThings Link for Nvidia Shield" dongle, which let users turn Android TV devices into SmartThings Hubs.

Power

Bill Gates' Next Generation Nuclear Reactor To Be Built In Wyoming (reuters.com) 334

Billionaire Bill Gates' advanced nuclear reactor company TerraPower LLC and PacifiCorp have selected Wyoming to launch the first Natrium reactor project on the site of a retiring coal plant, the state's governor said on Wednesday. Reuters reports: TerraPower, founded by Gates about 15 years ago, and power company PacifiCorp, owned by Warren Buffett's Berkshire Hathaway, said the exact site of the Natrium reactor demonstration plant is expected to be announced by the end of the year. Small advanced reactors, which run on different fuels than traditional reactors, are regarded by some as a critical carbon-free technology that can supplement intermittent power sources like wind and solar as states strive to cut emissions that cause climate change.

The project features a 345 megawatt sodium-cooled fast reactor with molten salt-based energy storage that could boost the system's power output to 500 MW during peak power demand. TerraPower said last year that the plants would cost about $1 billion. Late last year the U.S. Department of Energy awarded TerraPower $80 million in initial funding to demonstrate Natrium technology, and the department has committed additional funding in coming years subject to congressional appropriations.
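A quick back-of-envelope check on what that boost mode implies for storage capacity. The article gives only the 345 MW base and 500 MW peak figures; the 5.5-hour boost duration below is an assumption (TerraPower has cited it elsewhere):

```python
# Implied molten-salt storage discharge for Natrium's boost mode.
BASE_MW, PEAK_MW = 345, 500
BOOST_HOURS = 5.5                       # assumed duration, not in the article
storage_mwh = (PEAK_MW - BASE_MW) * BOOST_HOURS
print(f"~{storage_mwh:.0f} MWh of storage")  # roughly 850 MWh
```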

AMD

AMD Unveils Radeon RX 6000M Mobile GPUs For New Breed of All-AMD Gaming Laptops (hothardware.com) 15

MojoKid writes: AMD just took the wraps off its new line of Radeon RX 6000M GPUs for gaming laptops. Combined with its Ryzen 5000 series processors, the company claims all-AMD powered "AMD Advantage" machines will deliver new levels of performance, visual fidelity and value for gamers. AMD unveiled three new mobile GPUs. Sitting at the top is the Radeon RX 6800M, featuring 40 compute units, 40 ray accelerators, a 2,300MHz game clock and 12GB of GDDR6 memory. According to AMD, its flagship Radeon RX 6800M mobile GPU can deliver 120 frames per second at 1440p with a blend of raytracing, compute, and traditional effects.

Next, the new Radeon RX 6700M sports 36 compute units, 36 ray accelerators, a 2,300MHz game clock and 10GB of GDDR6 memory. Finally, the Radeon RX 6600M comes armed with 28 compute units and 28 ray accelerators, a 2,177MHz game clock and 8GB of GDDR6 memory. HotHardware has a deep dive review of a new ASUS ROG Strix G15 gaming laptop with the Radeon RX 6800M on board, as well as an 8-core Ryzen 9 5900HX processor. In the benchmarks, the Radeon RX 6800M-equipped machine puts up numbers that rival GeForce RTX 3070 and 3080 laptop GPUs in traditional rasterized game engines, though it trails a bit in ray tracing enhanced gaming. You can expect this new breed of all-AMD laptops to arrive in market sometime later this month.

Businesses

Instacart Bets on Robots To Shrink Ranks of 500,000 Gig Shoppers (bloomberg.com) 43

Instacart has an audacious plan to replace its army of gig shoppers with robots -- part of a long-term strategy to cut costs and put its relationship with supermarket chains on a sustainable footing. From a report: The plan, detailed in documents reviewed by Bloomberg, involves building automated fulfillment centers around the U.S., where hundreds of robots would fetch boxes of cereal and cans of soup while humans gather produce and deli products. Some facilities would be attached to existing grocery stores while larger standalone centers would process orders for several locations, according to the documents, which were dated July and December.

Despite working on the strategy for more than a year, however, the company has yet to sign up a single supermarket chain. Instacart had planned to begin testing the fulfillment centers later this year, the documents show. But the company has fallen behind schedule, according to people familiar with the situation. And though the documents mention asking several automation providers to build the technology, Instacart hasn't settled on any, said the people, who requested anonymity to discuss a private matter. In February, the Financial Times reported on elements of the strategy and said Instacart in early 2020 sent out requests for proposals to five robotics companies.

An Instacart spokeswoman said the company was busy buttressing its operations during the pandemic, when it signed up 300,000 new gig workers in a matter of weeks, bringing the current total to more than 500,000. But the delays in getting the automation strategy off the ground could potentially undermine plans to go public this year. Investors know robots will play a critical role in modernizing the $1.4 trillion U.S. grocery industry.

Hardware

The GeForce RTX 3080 Ti is Nvidia's 'New Gaming Flagship' (pcworld.com) 60

Nvidia officially announced the long-awaited GeForce RTX 3080 Ti during its Computex keynote late Monday night, and this $1,200 graphics card looks like an utter beast. The $600 GeForce RTX 3070 Ti also made its debut with faster GDDR6X memory. From a report: All eyes are on the RTX 3080 Ti, though. Nvidia dubbed it GeForce's "new gaming flagship" as the $1,500 RTX 3090 is built for work and play alike, but the new GPU is a 3090 in all but name (and memory capacity). While Nvidia didn't go into deep technical details during the keynote, the GeForce RTX 3080 Ti's specifications page shows it packing a whopping 10,240 CUDA cores -- just a couple hundred less than the 3090's 10,496 count, but massively more than the 8,704 found in the vanilla 3080.

Expect this card to chew through games on par with the best, especially in games that support real-time ray tracing and Nvidia's amazing DLSS feature. The memory system can handle the ride, as it's built using the RTX 3090's upgraded bones. The GeForce RTX 3080 Ti comes with a comfortable 12GB of blazing-fast GDDR6X memory over a wide 384-bit bus, which is half the ludicrous 24GB capacity found in the 3090, but more than enough to handle any gaming workload you throw at it. That's not true with the vanilla RTX 3080, which comes with 10GB of GDDR6X over a smaller bus, as rare titles (like Doom Eternal) can already use more than 10GB of memory when you're playing at 4K resolution with the eye candy cranked to the max. The extra two gigs make the RTX 3080 Ti feel much more future-proof.

Data Storage

Seagate 'Exploring' Possible New Line of Crypto-Specific Hard Drives (techradar.com) 47

In a Q&A with TechRadar, storage hardware giant Seagate revealed it is keeping a close eye on the crypto space, with a view to potentially launching a new line of purpose-built drives. From the report: Asked whether companies might develop storage products specifically for cryptocurrency use cases, Jason M. Feist, who heads up Seagate's emerging products arm, said it was a "possibility." Feist said he could offer no concrete information at this stage, but did suggest the company is "exploring this opportunity and imagines others may be as well."
Intel

Intel's latest 11th Gen Processor Brings 5.0GHz Speeds To Thin and Light Laptops (theverge.com) 51

Intel made a splash earlier in May with the launch of its first 11th Gen Tiger Lake H-series processors for more powerful laptops, but at Computex 2021, the company is also announcing a pair of new U-series chips -- one of which marks the first 5.0GHz clock speed for the company's U-series lineup of lower voltage chips. From a report: Specifically, Intel is announcing the Core i7-1195G7 -- its new top of the line chip in the U-series range -- and the Core i5-1155G7, which takes the crown of Intel's most powerful Core i5-level chip, too. Like the original 11th Gen U-series chips, the new chips operate in the 12W to 28W range. Both new chips are four core / eight thread configurations, and feature Intel's Iris Xe integrated graphics (the Core i7-1195G7 comes with 96 EUs, while the Core i5-1155G7 has 80 EUs.)

The Core i7-1195G7 features a base clock speed of 2.9GHz, but cranks up to a 5.0GHz maximum single core speed using Intel's Turbo Boost Max 3.0 technology. The Core i5-1155G7, on the other hand, has a base clock speed of 2.5GHz and a boosted speed of 4.5GHz. Getting to 5GHz out of the box is a fairly recent development for laptop CPUs, period: Intel's first laptop processor to cross the 5GHz mark arrived in 2019.

Supercomputing

World's Fastest AI Supercomputer Built from 6,159 NVIDIA A100 Tensor Core GPUs (nvidia.com) 57

Slashdot reader 4wdloop shared this report from NVIDIA's blog, joking that maybe this is where all NVIDIA's chips are going: It will help piece together a 3D map of the universe, probe subatomic interactions for green energy sources and much more. Perlmutter, officially dedicated Thursday at the National Energy Research Scientific Computing Center (NERSC), is a supercomputer that will deliver nearly four exaflops of AI performance for more than 7,000 researchers. That makes Perlmutter the fastest system on the planet on the 16- and 32-bit mixed-precision math AI uses. And that performance doesn't even include a second phase coming later this year to the system based at Lawrence Berkeley National Lab.
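The "nearly four exaflops" figure checks out if it is counted the way such AI-performance numbers usually are; the sketch below assumes NVIDIA is quoting A100 FP16/BF16 Tensor Core peak with structured sparsity (624 TFLOPS per GPU, versus 312 TFLOPS dense), which the blog post does not spell out:

```python
# Back-of-envelope check of the "nearly four exaflops" claim.
GPUS = 6159
TFLOPS_PER_GPU_SPARSE = 624   # assumed: FP16 tensor peak with 2:4 sparsity
total_exaflops = GPUS * TFLOPS_PER_GPU_SPARSE * 1e12 / 1e18
print(f"{total_exaflops:.2f} EFLOPS")  # ~3.84 EFLOPS, i.e. "nearly four"
```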

More than two dozen applications are getting ready to be among the first to ride the 6,159 NVIDIA A100 Tensor Core GPUs in Perlmutter, the largest A100-powered system in the world. They aim to advance science in astrophysics, climate science and more. In one project, the supercomputer will help assemble the largest 3D map of the visible universe to date. It will process data from the Dark Energy Spectroscopic Instrument (DESI), a kind of cosmic camera that can capture as many as 5,000 galaxies in a single exposure. Researchers need the speed of Perlmutter's GPUs to capture dozens of exposures from one night to know where to point DESI the next night. Preparing a year's worth of the data for publication would take weeks or months on prior systems, but Perlmutter should help them accomplish the task in as little as a few days.

"I'm really happy with the 20x speedups we've gotten on GPUs in our preparatory work," said Rollin Thomas, a data architect at NERSC who's helping researchers get their code ready for Perlmutter. DESI's map aims to shed light on dark energy, the mysterious physics behind the accelerating expansion of the universe.

A similar spirit fuels many projects that will run on NERSC's new supercomputer. For example, work in materials science aims to discover atomic interactions that could point the way to better batteries and biofuels. Traditional supercomputers can barely handle the math required to generate simulations of a few atoms over a few nanoseconds with programs such as Quantum Espresso. But by combining their highly accurate simulations with machine learning, scientists can study more atoms over longer stretches of time. "In the past it was impossible to do fully atomistic simulations of big systems like battery interfaces, but now scientists plan to use Perlmutter to do just that," said Brandon Cook, an applications performance specialist at NERSC who's helping researchers launch such projects. That's where Tensor Cores in the A100 play a unique role. They accelerate both the double-precision floating point math for simulations and the mixed-precision calculations required for deep learning.

Graphics

Resale Prices Triple for NVIDIA Chips as Gamers Compete with Bitcoin Miners (yahoo.com) 108

"In the niche world of customers for high-end semiconductors, a bitter feud is pitting bitcoin miners against hardcore gamers," reports Quartz: At issue is the latest line of NVIDIA graphics cards — powerful, cutting-edge chips with the computational might to display the most advanced video game graphics on the market. Gamers want the chips so they can experience ultra-realistic lighting effects in their favorite games. But they can't get their hands on NVIDIA cards, because miners are buying them up and adapting them to crunch cryptographic codes and harvest digital currency. The fierce competition to buy chips — combined with a global semiconductor shortage — has driven resale prices up as much as 300%, and led hundreds of thousands of desperate consumers to sign up for daily raffles for the right to buy chips at a significant mark-up.

To broker a peace between its warring customers, NVIDIA is, essentially, splitting its cutting-edge graphics chips into two dumbed-down products: GeForce for gamers and the Cryptocurrency Mining Processor (CMP) for miners. GeForce is the latest NVIDIA graphics card — except key parts of it have been slowed down to make it less valuable for miners racing to solve crypto puzzles. CMP is based on a slightly older version of NVIDIA's graphics card which has been stripped of all of its display outputs, so gamers can't use it to render graphics.

NVIDIA's goal in splitting its product offerings is to incentivize miners to only buy CMP chips, and leave the GeForce chips for the gamers. "What we hope is that the CMPs will satisfy the miners...[and] steer our GeForce supply to gamers," said CEO Jensen Huang on a May 26 conference call with investors and analysts... It won't be easy to keep the miners at bay, however. NVIDIA tried releasing slowed-down graphics chips in February in an effort to deter miners from buying them, but it didn't work. The miners quickly figured out how to hack the chips and make them perform at full-speed again.

Power

Is Natural Gas (Mostly) Good for Global Warming? (ieee.org) 139

Natural gas "creates less carbon emissions than the coal it replaces, but we have to find ways to minimize the leakage of methane."

That's the opinion of Vaclav Smil, a distinguished professor emeritus at the University of Manitoba and a Fellow of the Royal Society of Canada, writing in IEEE's Spectrum (in an article shared by Slashdot reader schwit1): Natural gas is abundant, low-cost, convenient, and reliably transported, with low emissions and high combustion efficiency. Natural-gas-fired heating furnaces have maximum efficiencies of 95 to 97 percent, and combined-cycle gas turbines now achieve overall efficiency slightly in excess of 60 percent. Of course, burning gas generates carbon dioxide, but the ratio of energy to carbon is excellent: Burning a gigajoule of natural gas produces 56 kilograms of carbon dioxide, about 40 percent less than the 95 kg emitted by bituminous coal.

This makes gas the obvious replacement for coal. In the United States, this transition has been unfolding for two decades. Gas-fueled capacity increased by 192 gigawatts from 2000 to 2005 and by an additional 69 GW from 2006 through the end of 2020. Meanwhile, the 82 GW of coal-fired capacity that U.S. utilities removed from 2012 to 2020 is projected to be augmented by another 34 GW by 2030, totaling 116 GW — more than a third of the former peak rating.

So far, so green. But methane is itself a very potent greenhouse gas, packing from 84 to 87 times as much global warming potential as an equal quantity of carbon dioxide when measured over 20 years (and 28 to 36 times as much over 100 years). And some of it leaks out. In 2018, a study of the U.S. oil and natural-gas supply chain found that those emissions were about 60 percent higher than the Environmental Protection Agency had estimated. Such fugitive emissions, as they are called, are thought to be equivalent to 2.3 percent of gross U.S. gas production...

Without doubt, methane leakages during extraction, processing, and transportation do diminish the overall beneficial impact of using more natural gas, but they do not erase it, and they can be substantially reduced.
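The article's figures let us roughly quantify how much that 2.3% leakage diminishes the benefit. The methane energy content (~50 MJ/kg) is an assumption not stated in the piece; the GWP values are midpoints of the ranges quoted above:

```python
# Rough CO2-equivalent of fugitive methane per gigajoule of gas delivered.
CH4_KG_PER_GJ = 1000 / 50   # ~20 kg of methane burned per GJ (assumed LHV)
LEAK_RATE = 0.023           # 2.3% of gross production
GWP_20, GWP_100 = 86, 30    # midpoints of the quoted 84-87 and 28-36 ranges

leaked_kg = CH4_KG_PER_GJ * LEAK_RATE / (1 - LEAK_RATE)
print(f"leak CO2e, 20-yr horizon:  {leaked_kg * GWP_20:.0f} kg/GJ")   # ~40 kg
print(f"leak CO2e, 100-yr horizon: {leaked_kg * GWP_100:.0f} kg/GJ")  # ~14 kg
# Versus 56 kg of direct CO2 per GJ: on the standard 100-year basis the
# advantage over coal clearly survives; on a 20-year basis leakage eats
# much of it, which is why reducing fugitive emissions matters so much.
```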

China

China's 'Artificial Sun' Fusion Reactor Just Set a New World Record (scmp.com) 90

The South China Morning Post reports that China "has reached another milestone in its quest for a fusion reactor, with one of its 'artificial suns' sustaining extreme temperatures for several times longer than its previous benchmark, according to state media." State news agency Xinhua reported that the Experimental Advanced Superconducting Tokamak in a facility in the eastern city of Hefei registered a plasma temperature of 120 million degrees Celsius for 101 seconds on Friday. It also maintained a temperature of 160 million degrees Celsius for 20 seconds, the report said...

The facilities are part of China's quest for fusion reactors, which hold out hope of unlimited clean energy. But there are many challenges to overcome in what has already been a decades-long quest for the world's scientists. Similar endeavours are under way in the United States, Europe, Russia, and South Korea. China is also among 35 countries involved in the International Thermonuclear Experimental Reactor (ITER) megaproject in France...

Despite the progress made, fusion reactors are still a long way from reality. Song Yuntao, director of the Institute of Plasma Physics of the Chinese Academy of Sciences, said the latest results were a major achievement for physics and engineering in China. "The experiment's success lays the foundation for China to build its own nuclear fusion energy station," Song was quoted as saying.

NASA notes that the core of the Sun is only about 15 million degrees Celsius.

So for 20 seconds, China's fusion reactor ran at more than 10 times the temperature of the Sun's core (and at eight times that temperature for the full 101 seconds).
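The arithmetic behind that comparison, as a quick sanity check:

```python
# Comparing EAST's plasma temperatures with the Sun's core (~15 million C).
SUN_CORE_C = 15e6
assert 160e6 / SUN_CORE_C > 10   # 160 M degrees C for 20 s: >10x the core
assert 120e6 / SUN_CORE_C == 8   # 120 M degrees C for 101 s: exactly 8x
```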
Australia

Robots and AI Will Guide Australia's First Fully Automated Farm (abc.net.au) 41

"Robots and artificial intelligence will replace workers on Australia's first fully automated farm," reports Australia's national public broadcaster ABC.

The total cost of the farm's upgrade? $20 million. Charles Sturt University in Wagga Wagga will create the "hands-free farm" on a 1,900-hectare property to demonstrate what robots and artificial intelligence can do without workers in the paddock... The farm will use robotic tractors, harvesters, survey equipment and drones, artificial intelligence that will handle sowing, dressing and harvesting, new sensors to measure plants, soils and animals and carbon management tools to minimise the carbon footprint.

The farm is already operated commercially and grows a range of broadacre crops, including wheat, canola, and barley, as well as a vineyard, cattle and sheep.

Power

Could Zinc Batteries Replace Lithium-Ion Batteries on the Power Grid? (sciencemag.org) 120

Slashdot reader sciencehabit shares Science magazine's look at efforts to transform zinc batteries "from small, throwaway cells often used in hearing aids into rechargeable behemoths that could be attached to the power grid, storing solar or wind power for nighttime or when the wind is calm." With startups proliferating and lab studies coming thick and fast, "Zinc batteries are a very hot field," says Chunsheng Wang, a battery expert at the University of Maryland, College Park. Lithium-ion batteries — giant versions of those found in electric vehicles — are the current front-runners for storing renewable energy, but their components can be expensive. Zinc batteries are easier on the wallet and the planet — and lab experiments are now pointing to ways around their primary drawback: They can't be recharged over and over for decades.

For power storage, "Lithium-ion is the 800-pound gorilla," says Michael Burz, CEO of EnZinc, a zinc battery startup. But lithium, a relatively rare metal that's only mined in a handful of countries, is too scarce and expensive to back up the world's utility grids. (It's also in demand from automakers for electric vehicles.) Lithium-ion batteries also typically use a flammable liquid electrolyte. That means megawatt-scale batteries must have pricey cooling and fire-suppression technology. "We need an alternative to lithium," says Debra Rolison, who heads advanced electrochemical materials research at the Naval Research Laboratory. Enter zinc, a silvery, nontoxic, cheap, abundant metal. Nonrechargeable zinc batteries have been on the market for decades. More recently, some zinc rechargeables have also been commercialized, but they tend to have limited energy storage capacity. Another technology — zinc flow cell batteries — is also making strides. But it requires more complex valves, pumps, and tanks to operate. So, researchers are now working to improve another variety, zinc-air cells...

Advances are injecting new hope that rechargeable zinc-air batteries will one day be able to take on lithium. Because of the low cost of their materials, grid-scale zinc-air batteries could cost $100 per kilowatt-hour, less than half the cost of today's cheapest lithium-ion versions. "There is a lot of promise here," Burz says. But researchers still need to scale up their production from small button cells and cellphone-size pouches to shipping container-size systems, all while maintaining their performance, a process that will likely take years.
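The cost gap is easiest to see at grid scale. In the sketch below, the $100/kWh zinc-air figure comes from the article; the lithium-ion figure and the battery sizing are assumptions chosen to be consistent with "less than half the cost":

```python
# Cost comparison for a hypothetical 100 MW / 4-hour grid battery.
ZINC_AIR_USD_PER_KWH = 100   # projected figure from the article
LION_USD_PER_KWH = 220       # assumed; article says zinc would be < half
energy_kwh = 100_000 * 4     # 100 MW for 4 hours = 400 MWh

print(f"zinc-air: ${ZINC_AIR_USD_PER_KWH * energy_kwh / 1e6:.0f}M")  # $40M
print(f"li-ion:   ${LION_USD_PER_KWH * energy_kwh / 1e6:.0f}M")      # $88M
```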

Hardware

Apps Reportedly Limited To Maximum of 5GB RAM In iPadOS, Even With 16GB M1 iPad Pro (macrumors.com) 159

Despite Apple offering the M1 iPad Pro in configurations with 8GB and 16GB of RAM, developers are now indicating that apps are limited to just 5GB of RAM usage, regardless of the configuration the app is running on. MacRumors reports: The M1 iPad Pro comes in two memory configurations; the 128GB, 256GB, and 512GB models feature 8GB of RAM, while the 1TB and 2TB variants offer 16GB of memory, the highest ever in an iPad. Even with the unprecedented amount of RAM on the iPad, developers are reportedly severely limited in the amount they can actually use. According to a post on the Procreate Forum by the developer behind the graphics and design app Artstudio Pro, apps can only use 5GB of RAM on the new M1 iPad Pros. According to the developer, attempting to use any more will cause the app to crash: "There is a big problem with M1 iPad Pro. After making stress test and other tests on new M1 iPad Pro with 16GB or RAM, it turned out that app can use ONLY 5GB or RAM! If we allocate more, app crashes. It is only 0.5GB more that in old iPads with 6GB of RAM! I suppose it isn't better on iPad with 8GB." Following the release of its M1-optimized app, Procreate also noted on Twitter that with either 8GB or 16GB of available RAM, the app is limited by the amount of RAM it can use.
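The stress test the developer describes amounts to allocating memory in chunks until the system refuses. A generic sketch of that probe (illustrative only; on iPadOS the app is killed outright rather than raising an error, so a real probe there would log its progress persistently instead of catching an exception):

```python
def probe_memory_limit(chunk_mb=64, max_chunks=256):
    """Allocate fixed-size chunks until allocation fails; report the total.

    Returns the number of megabytes successfully allocated before either
    hitting MemoryError or reaching the max_chunks safety cap.
    """
    chunks, allocated_mb = [], 0
    try:
        for _ in range(max_chunks):
            chunks.append(bytearray(chunk_mb * 1024 * 1024))
            allocated_mb += chunk_mb
    except MemoryError:
        pass
    finally:
        chunks.clear()  # release everything we grabbed
    return allocated_mb
```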
Hardware

More People Are Buying Wearables Than Ever Before (arstechnica.com) 76

An anonymous reader quotes a report from Ars Technica: The wearables category of consumer devices -- which includes smartwatches, fitness trackers, and augmented reality glasses -- shipped more than 100 million units in the first quarter for the first time, according to research firm IDC. Q1 2021 saw a 34.4 percent increase in sales over the same quarter in 2020. To be clear: wearables have sold that many (and more) units in a quarter before, but never in the first quarter, which tends to be a slow period following a spree of holiday-related buying in Q4.

According to IDC's data, Apple leads the market by a significant margin, presumably thanks to the Apple Watch. In Q1 2021, Apple had a market share of 28.8 percent. Samsung sat in a distant second at 11.3 percent, followed by Xiaomi at 9.7 percent and Huawei at 8.2. From there, it's a steep drop to the smaller players -- like BoAt, which has a market share of just 2.9 percent. However, analysts say upstarts or smaller companies like BoAt are driving the significant year-over-year growth for wearables. IDC's report says that the fastest growth comes from form factors besides smartwatches, such as digitally connected rings, audio glasses, and wearable patches. This grab-bag subcategory within wearables, which the IDC simply classifies as "other," actually grew 55 percent year-over-year.

Power

USB-C Power Upgrade Delivers a Whopping 240W for Gaming Laptops and Other Devices (cnet.com) 110

AmiMoJo writes: The USB-C standard will let you plug in power-hungry devices like gaming laptops, docking stations, 4K monitors and printers with an upgrade that accommodates up to 240 watts starting this year. The jump in maximum power is more than double today's 100-watt top capacity. The USB Implementers Forum, the industry group that develops the technology, revealed the new power levels in the version 2.1 update to its USB Type-C specification on Tuesday. The new 240-watt option is called Extended Power Range, or EPR. "We expect devices supporting higher wattages in the second half of 2021," USB-IF said in a statement.

USB began as a useful but limited port for plugging keyboards, mice and printers into PCs. It later swept aside Firewire and other ports as faster speeds let it tackle more demanding tasks. It proved useful for charging phones as the mobile revolution began, paving the way for its use delivering power, not just data. The 240W Extended Power Range option means USB likely will expand its turf yet again. Cables supporting 240 watts will have additional requirements to accommodate the new levels. And USB-IF will require the cables to bear specific icons "so that end users will be able to confirm visually that the cable supports up to...240W," USB-IF said in the specification document.
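The jump from 100W to 240W comes from raising voltage rather than current: cable current stays capped at 5A, and the Extended Power Range adds higher fixed voltages topping out at 48V:

```python
# How USB PD reaches the new maximum: higher voltage, same 5 A current cap.
SPR_MAX_W = 20 * 5   # Standard Power Range: 20 V x 5 A = 100 W
EPR_MAX_W = 48 * 5   # Extended Power Range: 48 V x 5 A = 240 W
assert SPR_MAX_W == 100 and EPR_MAX_W == 240
```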

Power

Joe Biden Opens Up California Coast To Offshore Wind (theverge.com) 232

An anonymous reader quotes a report from The Verge: Offshore wind is headed west. The Biden administration announced today that it will open up parts of the Pacific coast to commercial-scale offshore renewable energy development for the first time. The geography of the West Coast poses huge technical challenges for wind energy. But rising to meet those challenges is a big opportunity for both President Joe Biden and California Governor Gavin Newsom to meet their clean energy goals. There are two areas now slated for development off the coast of Central and Northern California -- one at Morro Bay and another near Humboldt County. Together, these areas could generate up to 4.6GW of energy, enough power for 1.6 million homes over the next decade, according to a White House fact sheet.
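The homes-powered claim is roughly self-consistent if one assumes a typical offshore-wind capacity factor, which the fact sheet does not state; the 40% figure below is an assumption for illustration:

```python
# What "4.6 GW ... power for 1.6 million homes" implies per household.
CAPACITY_GW = 4.6
HOMES = 1.6e6
CAPACITY_FACTOR = 0.40   # assumed; typical for modern offshore wind

avg_kw_per_home = CAPACITY_GW * 1e6 * CAPACITY_FACTOR / HOMES
annual_mwh = avg_kw_per_home * 8760 / 1000
print(f"{avg_kw_per_home:.2f} kW average, ~{annual_mwh:.0f} MWh/yr per home")
# ~1.15 kW average, ~10 MWh/yr: close to typical U.S. household consumption.
```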

Compared to the East Coast, waters off the West Coast get deeper much faster. That has stymied offshore wind development. So the White House says it's looking into deploying pretty futuristic technology there: floating wind farms. Until now, technical constraints have generally prevented companies from installing turbines that are fixed to the seafloor in waters more than 60 meters deep. That's left nearly 60 percent of offshore wind resources out of reach, according to the National Renewable Energy Laboratory (NREL). With the development of new technologies that could let wind turbines float in deeper waters, it looks like those resources might finally be within reach.

The Department of Energy says that it has funneled more than $100 million into moving floating offshore wind technology forward. There are only a handful of floating turbines in operation today, and no commercial-scale wind farms yet anywhere in the world. The Bureau of Ocean Energy Management still needs to officially designate the areas off the California coast as Wind Energy Areas for development and complete an environmental analysis. The plan is to auction off leases for the area to developers in mid-2022. It's also working with the Department of Defense to make sure the projects don't interfere with its ongoing "testing, training, and operations" off California's coast.
