United Kingdom

UK To Spend $127M in Global Race To Produce AI Chips (theguardian.com) 24

The UK government will spend $127m to try to win a toehold for the nation in the global race to produce computer chips used to power artificial intelligence. From a report: Taxpayer money will be used as part of a drive to build a national AI resource in Britain, similar to those under development in the US and elsewhere. It is understood that the funds will be used to order key components from major chipmakers Nvidia, AMD and Intel. But an official briefed on the plans told the Guardian that the $127m offered by the government is far too low relative to investment by peers in the EU, US and China. The official confirmed a detail first reported by the Telegraph, which also revealed the investment: the government is in the advanced stages of an order of up to 5,000 graphics processing units (GPUs) from Nvidia. The company, which started out building processing capacity for computer games, has seen a sharp increase in its value as the AI race has heated up. Its chips can run large language models such as the one behind ChatGPT.
Windows

Lenovo's Handheld 'Legion Go' Gaming Computer: Detachable Controls and AR Glasses? (arstechnica.com) 6

To one-up Valve's Steam Deck, Lenovo's handheld gaming device, the "Legion Go," will have "Switch-style detachable controllers," reports Ars Technica: The Legion Go wouldn't be the very first portable PC gaming device with removable controllers; the crowd-funded OneXplayer sported a similar design last year, for instance. But few other PC-based portables have similarly mimicked the Switch Joy-Cons in their ability to slide smoothly off from the main screen of the system for detached play.

Combined with a nice, wide kickstand shown in the leaked images, you should be able to give your arms a rest by setting the bulky-looking Legion Go's screen on a tabletop. The slide-off controls also mean you don't need to purchase and/or drag out a separate controller when docking the device to a TV or monitor (which we assume will be a main use case of the device's two USB-C ports). And completely detachable controls for each hand mean you can keep your hands as far apart as you want while you hold each "half-controller" separately (one of our favorite unique use cases on the Switch)... The Legion Go also reportedly sports an 8-inch diagonal screen, one inch larger than the screens on Valve's and ASUS ROG's devices.

The Legion Go leaks come just months after Lenovo abandoned its button- and cooler-packed Legion line of Android-based gaming phones as part of what it said was a "gaming portfolio consolidation." The Windows 11-based Legion Go — which Windows Central says will be based on AMD's Phoenix processors — should have the high-end PC gaming support that the Legion phones lacked, as well as a more market-proven form factor.

Windows Report believes Lenovo "is preparing to launch an entire gaming ecosystem alongside the Legion Go."

"Among the accessories is a new pair of Legion AR glasses specifically tweaked for gaming." Based on the images we have, the glasses should be small enough to wear through long gaming sessions, with only one USB cable connecting them to any device (most likely for power, which means no standalone battery). The Legion AR Glasess could also feature a high refresh rate and other gaming-specific features, as the Legion branding implies they're made specifically for that...
Firefox

Does Desktop Linux Have a Firefox Problem? (osnews.com) 164

OS News' managing editor calls Firefox "the single most important desktop Linux application," shipping in most distros (with some users later opting for a post-installation download of Chrome).

But "I'm genuinely worried about the state of browsers on Linux, and the future of Firefox on Linux in particular..." While both GNOME and KDE nominally invest in their own two browsers, GNOME Web and Falkon, their uptake is limited and releases few and far between. For instance, none of the major Linux distributions ship GNOME Web as their default browser, and it lacks many of the features users come to expect from a browser. Falkon, meanwhile, is updated only sporadically, often going years between releases. Worse yet, Falkon uses Chromium through QtWebEngine, and GNOME Web uses WebKit (which are updated separately from the browser, so browser releases are not always a solid metric!), so both are dependent on the goodwill of two of the most ruthless corporations in the world, Google and Apple respectively.

Even Firefox itself, even though it's clearly the browser of choice of distributions and Linux users alike, does not consider Linux a first-tier platform. Firefox is first and foremost a Windows browser, followed by macOS second, and Linux third. The love the Linux world has for Firefox is not reciprocated by Mozilla in the same way, and this shows in various places where issues fixed and addressed on the Windows side are ignored on the Linux side for years or longer. The best and most visible example of that is hardware video acceleration. This feature has been a default part of the Windows version since forever, but it wasn't enabled by default for Linux until Firefox 115, released only in early July 2023. Even then, the feature is only enabled by default for users of Intel graphics — AMD and Nvidia users need not apply. This lack of video acceleration was — and for AMD and Nvidia users, still is — a major contributing factor to Linux battery life on laptops taking a serious hit compared to their Windows counterparts... It's not just hardware accelerated video decoding. Gesture support has taken much longer to arrive on the Linux version than it did on the Windows version — things like using swipes to go back and forward, or pinch to zoom on images...
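For readers who want to see where their own setup stands, here is a rough sketch of how Linux users typically check the VA-API pieces Firefox relies on (the about:config pref names are from memory of the Firefox ~115 era and may differ by version, so verify locally):

```shell
# Check that the VA-API stack itself works (vainfo ships with libva-utils):
vainfo 2>/dev/null | grep -c 'VAProfile'   # a count > 0 means decode profiles exist

# Inside Firefox, about:support -> "Media" reports whether decoding is
# actually happening in hardware. Relevant about:config switches
# (names assumed, not guaranteed for every release):
#   media.ffmpeg.vaapi.enabled                    older manual opt-in pref
#   media.hardware-video-decoding.force-enabled   override for unsupported GPUs
```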

I don't see anyone talking about this problem, or planning for the eventual possible demise of Firefox, what that would mean for the Linux desktop, and how it can be avoided or mitigated. In an ideal world, the major stakeholders of the Linux desktop — KDE, GNOME, the various major distributions — would get together and seriously consider a plan of action. The best possible solution, in my view, would be to fork one of the major browser engines (or pick one and significantly invest in it), and modify this engine and tailor it specifically for the Linux desktop. Stop living off the scraps and leftovers thrown across the fence from Windows and macOS browser makers, and focus entirely on making a browser engine that is optimised fully for Linux, its graphics stack, and its desktops. Have the major stakeholders work together on a Linux-first — or even Linux-only — browser engine, leaving the graphical front-end to the various toolkits and desktop environments....

I think it's highly irresponsible of the various prominent players in the desktop Linux community, from GNOME to KDE, from Ubuntu to Fedora, to seemingly have absolutely zero contingency plans for when Firefox enshittifies or dies...

Linux

Should There Be an 'Official' Version of Linux? (zdnet.com) 283

Why aren't more people using Linux on the desktop? Slashdot reader technology_dude shares one solution: Jack Wallen at ZDNet says establishing an "official" version of Linux may (or may not) help Linux on the desktop increase the number of users, mostly as someplace to point new users. It makes sense to me. What does Slashdot think and what would be the challenges, other than acceptance of a particular flavor?
Wallen argues this would also create a standard for hardware and software vendors to target, which "could equate to even more software and hardware being made available to Linux." (And an "official" Linux might also be more appealing to business users.) Wallen suggests it be "maintained and controlled by a collective of people from users, developers, and corporations (such as Intel and AMD) with a vested interest in the success of this project... There would also be corporate backing for things like marketing (such as TV commercials)." He also suggests basing it on Debian, and supporting both Snap and Flatpak...

In comments on the original submission, long-time Slashdot reader bobbomo points instead to kernel.org, arguing "There already is an official version of Linux called mainline. Everything else is backports." And jd (Slashdot user #1,658) believes that the official Linux is the Linux Standard Base. "All distributions, more-or-less, conform to the LSB, which gives you a pseudo 'official' Linux. About the one variable is the package manager. And there are ways to work around that."

Unfortunately, according to Wikipedia... The LSB standard stopped being updated in 2015 and current Linux distributions do not adhere to or offer it; however, the lsb_release command is sometimes still available. On February 7, 2023, a former maintainer of the LSB wrote, "The LSB project is essentially abandoned."
That post (on the lsb-discuss mailing list) argues the LSB approach was "partially superseded" by Snaps and Flatpaks (for application portability and stability). And of course, long-time Slashdot user menkhaura shares the obligatory XKCD comic...
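With the LSB effectively dead, the closest thing to a portable "which distro is this?" interface today is /etc/os-release, the flat KEY=value file defined by the systemd/freedesktop spec and shipped by virtually every distro. A minimal sketch of reading it (the sample data below is illustrative, not taken from any real machine):

```shell
# An os-release file is a series of KEY=value lines, e.g.:
sample='NAME="Debian GNU/Linux"
ID=debian
VERSION_ID="12"'

# Extracting the distro ID the way scripts typically do (strip optional quotes):
distro=$(printf '%s\n' "$sample" | sed -n 's/^ID=//p' | tr -d '"')
echo "$distro"        # -> debian

# On a real system:  sed -n 's/^ID=//p' /etc/os-release
# (or lsb_release -si, where the legacy tool happens to still be installed)
```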

It's not exactly the same thing, but days after ZDNet's article, CIQ, Oracle, and SUSE announced the Open Enterprise Linux Association, a new collaborative trade association to foster "the development of distributions compatible with Red Hat Enterprise Linux."

So where does that leave us? Share your own thoughts in the comments.

And should there be an "official" version of Linux?
Intel

Intel's GPU Drivers Now Collect Telemetry, Including 'How You Use Your Computer' (extremetech.com) 44

An anonymous reader quotes a report from ExtremeTech: Intel has introduced a telemetry collection service by default in the latest beta driver for its Arc GPUs. You can opt out of it, but we all know most people just click "yes" to everything during a software installation. Intel's release notes for the drivers don't mention this change to how its drivers work, which is a curious omission. Intel adding telemetry collection is a significant change to how its GPU drivers work. Intel has even given this new collection routine a cute name -- the Intel Computing Improvement Program. Gee, that sounds pretty wonderful. We want to improve our computing, so let's dive into the details briefly.

According to TechPowerUp, which discovered the change, Intel has created a landing page for the program that explains what is collected and what isn't. At a high level, it states, "This program uses information about your computer's performance to make product improvements that may benefit you in the future." Though that sounds innocuous, Intel provides a long list of the types of data it collects, much of it unrelated to your computer's performance. That includes the types of websites you visit -- which Intel says are dumped into 30 categories and logged without URLs or other identifying information -- along with how long and how often you visit certain types of sites. It also collects information on "how you use your computer" but offers no details. It will also identify "Other devices in your computing environment." Numerous performance-related data points are also captured, such as your CPU model, display resolution, how much memory you have, and, oddly, your laptop's average battery life.
The good news is that Intel allows you to opt out of this program, which is not the case with Nvidia. According to TechPowerUp, they don't even ask for permission! As for AMD, they not only give you a choice to opt out but they also explain what data they're collecting.
AMD

AMD Announces Radeon Pro W7600 and W7500 (anandtech.com) 6

As AMD continues to launch their full graphics product stacks based on their latest RDNA 3 architecture GPUs, the company is now preparing their next wave of professional cards under the Radeon Pro lineup. Following the launch of their high-end Radeon Pro W7900 and W7800 graphics cards back in the second quarter of this year, today the company is announcing the low-to-mid-range members of the Radeon Pro W7000 series: the Radeon Pro W7500 and Radeon Pro W7600. From a report: Both based on AMD's monolithic Navi 33 silicon, the latest Radeon Pro parts will hit the shelves a bit later this quarter. The two cards, as a whole, will make up what AMD defines as the mid-range segment of their professional video card market. And like their flagship counterparts, AMD is counting on a combination of RDNA 3's advanced features, including AV1 encoding support, improved compute and ray tracing throughput, and DisplayPort 2.1 outputs to help drive sales of the new video cards. That, and as is tradition, significantly undercutting NVIDIA's competing professional cards.

Not unlike their high-end counterparts, for this generation AMD has decided to expand the size of their mid-range pro graphics lineup. Whereas the previous generation had the sole W6600 (and W6400 at entry-level), the W7000 series gets both a W7600 card and a W7500 card. Besides the obvious performance difference, the other big feature separating the two cards is power consumption. The Radeon Pro W7600 is a full-height video card running at 130W, while the W7500 is explicitly designed as a sub-75W card that can be powered entirely by a PCIe slot, coming in at a cool 70 Watts.
The Radeon Pro W7600 is priced at $599 -- $50 cheaper than its predecessor -- whereas the W7500 will bring up the rear of the W7000 product stack at $429.
Linux

Steam On Linux Spikes To Nearly 2% In July, Larger Marketshare Than Apple macOS (phoronix.com) 99

The Steam Survey results for July 2023 were just published, and they point to a large and unexpected jump in the Linux gaming marketshare. Phoronix reports: According to these new numbers from Valve, the Linux customer base is up to 1.96%, or a 0.52% jump over June! That's a huge jump, with the figure normally moving just 0.1% or so in either direction most months... It's also near an all-time high on a percentage basis, going back to the early days of Steam on Linux when it had around a 2% marketshare -- but the Steam customer base a decade ago was much smaller in absolute numbers than it is now. So if the percentage numbers are accurate, this is likely the largest the Linux gaming marketshare has ever been in absolute terms.

When looking at the Steam Linux breakdown, the SteamOS Holo that powers the Steam Deck now accounts for around 42% of all Linux gamers on Steam. Meanwhile, AMD CPU marketshare among Linux gamers has reached 69%. The Steam Survey results for July show Windows 10 64-bit losing 1.56% marketshare and Linux gaining a healthy 0.52% of that. This is also the first time the Linux gaming marketshare has surpassed Apple macOS on Steam!
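The headline numbers imply a couple of derived figures worth spelling out. This is just a quick sanity check of the article's own percentages, not new data:

```shell
# Linux share: 1.96%, up 0.52% from June; SteamOS Holo is 42% of Linux gamers.
awk 'BEGIN {
    linux = 1.96; jump = 0.52; holo_pct = 42
    printf "June Linux share: %.2f%%\n", linux - jump            # -> 1.44%
    printf "Steam Deck share of all Steam: %.2f%%\n", linux * holo_pct / 100  # -> 0.82%
}'
```

So roughly four of every five Steam users on Linux in July were new to the platform's share growth since June, and the Steam Deck alone accounts for close to one percent of all Steam users.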

Windows

Lenovo Is Working On a Windows PC Gaming Handheld Called the 'Legion Go' (windowscentral.com) 17

According to Windows Central, Lenovo is working on a handheld gaming PC dubbed "Legion Go," featuring Windows 11 and Ryzen chips. From the report: While details are scant right now, we understand this will sport AMD's new Phoenix processors, which the chip firm describes as ultra-thin, focused on gaming, AI, and graphics for ultrabooks. The fact the Legion Go will sport Ryzen chips pretty much guarantees that this is a Windows PC gaming handheld, as part of Lenovo's popular gaming "Legion" brand. As of writing, there's no information on exactly when this device could become available, or if, indeed, it'll become available at all.

According to our information, the Legion Go could sport an 8-inch screen, making it larger than the ASUS ROG Ally or the Steam Deck, both of which have a 7-inch display. PC and console games ported to PC are often designed for larger monitors or even TVs, and on smaller screens, UI elements can be difficult to see, especially if the game doesn't have a UI scaling option. A larger display could give the Legion Go a decent advantage over its competitors if it remains lightweight and balanced, which of course remains to be seen. The AMD Phoenix 7040 series chips are described by the firm as "ultra-thin" for powerful but elegant ultrabook-style devices. They should lend themselves well to a device like the Legion Go, supporting 15W low-power states for lightweight games and maximized battery life, similar to the Steam Deck and ROG Ally. The Z1 Extreme in the ASUS ROG Ally can perform with a TDP below 15W, however, which could give the ROG Ally some advantages there. There's every chance the Legion Go could have other configurations we're unaware of yet, though; we'll just have to wait and see.

AMD

AMD 'Zenbleed' Bug Leaks Data From Zen 2 Ryzen, EPYC CPUs (tomshardware.com) 40

Monday a researcher with Google Information Security posted about a new vulnerability he independently found in AMD's Zen 2 processors. Tom's Hardware reports: The 'Zenbleed' vulnerability spans the entire Zen 2 product stack, including AMD's EPYC data center processors and the Ryzen 3000/4000/5000 CPUs, allowing the theft of protected information from the CPU, such as encryption keys and user logins. The attack does not require physical access to the computer or server and can even be executed via JavaScript on a webpage...

AMD added the AMD-SB-7008 Bulletin several hours later. AMD has patches ready for its EPYC 7002 'Rome' processors now, but it will not patch its consumer Zen 2 Ryzen 3000, 4000, and some 5000-series chips until November and December of this year... AMD hasn't given specific details of any performance impacts but did issue the following statement to Tom's Hardware: "Any performance impact will vary depending on workload and system configuration. AMD is not aware of any known exploit of the described vulnerability outside the research environment..."

AMD describes the exploit much more simply, saying, "Under specific microarchitectural circumstances, a register in 'Zen 2' CPUs may not be written to 0 correctly. This may cause data from another process and/or thread to be stored in the YMM register, which may allow an attacker to potentially access sensitive information."

The article includes a list of the impacted processors with a schedule for the release of the updated firmware to OEMs.

The Google Information Security researcher who discovered the bug is sharing research on different CPU behaviors, and says the bug can be patched through software on multiple operating systems (e.g., "you can set the chicken bit DE_CFG[9]") — but this might result in a performance penalty.
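The "chicken bit" the researcher refers to is bit 9 of AMD's DE_CFG register, which lives at MSR address 0xC0011029. On Linux it is typically flipped with the rdmsr/wrmsr utilities from the msr-tools package (root required; this is the documented stopgap for chips that haven't received updated microcode). A sketch:

```shell
# Real-world commands (root required; needs msr-tools and the msr kernel module):
#   modprobe msr
#   cur=$(rdmsr -c 0xc0011029)              # read DE_CFG on CPU 0
#   wrmsr -a 0xc0011029 $(( cur | 512 ))    # set bit 9 on all CPUs
#
# The bitwise step itself, demonstrated with a hypothetical DE_CFG reading:
cur=$(( 0x40 ))               # placeholder current value, not a real reading
new=$(( cur | (1 << 9) ))     # set DE_CFG[9]; leave every other bit alone
printf '0x%x\n' "$new"        # -> 0x240
```

The OR-with-mask pattern matters here: writing a fixed value instead of read-modify-write could clobber other DE_CFG bits the firmware has set.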

Thanks to long-time Slashdot reader waspleg for sharing the news.
AMD

AMD CPU Use Among Linux Gamers Approaching 70% Marketshare (phoronix.com) 127

The June Steam Survey results show that AMD CPUs have gained significant popularity among Linux gamers, with a market share of 67% -- a remarkable 7% increase from the previous month. Phoronix reports: In part that's due to the Steam Deck being powered by an AMD SoC, but it's also a trend that has been building for some time: AMD's Ryzen CPUs have grown increasingly popular among Linux users thanks to the company's open-source driver work and its continued goodwill-building with the community.

In comparison, last June the AMD CPU Linux gaming marketshare came in at 45% while Intel was at 54%. Or at the start of 2023, AMD CPUs were at a 55% marketshare among Linux gamers. Or if going back six years, AMD CPU use among Linux gamers was a mere 18% during the early Ryzen days. It's also the direct opposite on the Windows side. When looking at the Steam Survey results for June limited to Windows, there Intel has a 68% marketshare to AMD at 32%.

Beyond the Steam Deck, it's looking like AMD's efforts around open-source drivers, AMD expanding their Linux client (Ryzen) development efforts over the past two years, promises around OpenSIL, and other efforts commonly covered on Phoronix are paying off for AMD in wooing over their Linux gaming customer base.

Open Source

Linux Foundation's Yocto Project Expands LTS to 4 Years (linuxfoundation.org) 4

Wikipedia defines the Yocto Project as "a Linux Foundation collaborative open source project whose goal is to produce tools and processes that enable the creation of Linux distributions for embedded and IoT software that are independent of the underlying architecture of the embedded hardware."

This week the Linux Foundation shared an update on the 12-year-old Yocto Project: In an effort to support the community, The Yocto Project announced the first Long Term Support (LTS) release in October 2020. Today, we are delighted to announce that we are expanding the LTS release and extending the lifecycle from 2 to 4 years as standard.

The continued growth of the Yocto Project coincides with the welcomed addition of Exein as a Platinum Member, joining AMD/Xilinx, Arm, AWS, BMW Group, Cisco, Comcast, Intel, Meta and WindRiver. As a Member, Exein brings its embedded security expertise across billions of devices to the core of the Yocto Project...

"The Yocto Project has been at the forefront of OS technologies for over a decade," said Andrew Wafaa, Yocto Project Chairperson. "The adaptability and variety of the tooling provided are clearly making a difference to the community. We are delighted to welcome Exein as a member as their knowledge and experience in providing secure Yocto Project based builds to customers will enable us to adapt to the modern landscape being set by the US Digital Strategy and the EU Cyber Resilience Act."

"We're extremely excited to become a Platinum Partner of the Yocto Project," said Gianni Cuozzo, founder and CEO of Exein. "The Yocto Project is the most important project in the embedded Linux space, powering billions of devices every year. We take great pride in contributing our extensive knowledge and expertise in embedded security to foster a future that is both enhanced and secure for Yocto-powered devices. We are dedicated to supporting the growth of the Yocto Project as a whole, aiming to improve its support for modern languages like Rust, and assist developers and OEMs in aligning with the goals outlined in the EU Cyber Resilience Act."

AMD

Could AMD's AI Chips Match the Performance of Nvidia's Chips? (reuters.com) 37

An anonymous reader shared this report from Reuters: Artificial intelligence chips from Advanced Micro Devices are about 80% as fast as those from Nvidia Corp, with a future path to matching their performance, according to a Friday report by an AI software firm.

Nvidia dominates the market for the powerful chips that are used to create ChatGPT and other AI services that have swept through the technology industry in recent months. The popularity of those services has pushed Nvidia's value past $1 trillion and led to a shortage of its chips that Nvidia says it is working to resolve. But in the meantime, tech companies are looking for alternatives, with hopes that AMD will be a strong challenger. That prompted MosaicML, an AI startup acquired for $1.3 billion earlier this week, to conduct a test comparing AI chips from AMD and Nvidia.

MosaicML evaluated the AMD MI250 and the Nvidia A100, both of which are one generation behind each company's flagship chips but are still in high demand. MosaicML found AMD's chip could get 80% of the performance of Nvidia's chip, thanks largely to a new version of AMD software released late last year and a new version of open-source software backed by Meta Platforms called PyTorch that was released in March.

Hardware

VMware, AMD, Samsung and RISC-V Push For Confidential Computing Standards (theregister.com) 15

VMware has joined AMD, Samsung, and members of the RISC-V community to work on an open and cross-platform framework for the development and operation of applications using confidential computing hardware. The Register reports: Revealing the effort at the Confidential Computing Summit 2023 in San Francisco, the companies say they aim to bring about an industry transition to practical confidential computing by developing the open source Certifier Framework for Confidential Computing project. Among other goals, the project aims to standardize on a set of platform-independent developer APIs that can be used to develop or adapt application code to run in a confidential computing environment, with a Certifier Service overseeing them in operation. VMware claims to have researched, developed and open sourced the Certifier Framework, but with AMD on board, plus Samsung (which develops its own smartphone chips), the group has the x86 and Arm worlds covered. Also on board is the Keystone project, which is developing an enclave framework to support confidential computing on RISC-V processors.

Confidential computing is designed to protect applications and their data from theft or tampering by protecting them inside a secure enclave, or trusted execution environment (TEE). This uses hardware-based security mechanisms to prevent access from everything outside the enclave, including the host operating system and any other application code. Such security protections are likely to be increasingly important in the context of applications running in multi-cloud environments, VMware reckons.

Another scenario for confidential computing, put forward by Microsoft -- which believes confidential computing will become the norm -- is multi-party computation and analytics. This sees several users each contribute their own private data to an enclave, where it can be analyzed securely to produce results much richer than each would have got purely from their own data set. This is described as an emerging class of machine learning and "data economy" workloads that are based on sensitive data and models aggregated from multiple sources, which will be enabled by confidential computing. However, VMware points out that, like many useful hardware features, it will not be widely adopted until it becomes easier to develop applications in the new paradigm.

AI

Oracle Spending 'Billions' on Nvidia Chips This Year, Ellison Says (reuters.com) 27

Oracle is spending "billions" of dollars on chips from Nvidia as it expands a cloud computing service targeting a new wave of artificial intelligence companies, Oracle founder and Chairman Larry Ellison said. From a report: Oracle's cloud division is working to gain ground against larger rivals such as Amazon Web Services and Microsoft. To get an edge, Oracle has focused on building fast networks that can shuffle around the huge amount of data needed to create AI systems similar to ChatGPT.

Oracle is also buying huge numbers of GPUs designed to crunch that data for AI work. The company is spending "billions" of dollars on Nvidia chips, but even more on CPUs from Ampere Computing, a chip startup it has invested in, and AMD, Ellison said at an Ampere event.

Chrome

Google's New Standard For ChromeOS: 'Chromebook X' (9to5google.com) 27

Google is launching the "Chromebook X" program, aiming to differentiate high-quality laptops and tablets from standard Chromebooks by improving hardware specifications and adding exclusive features such as enhanced video conferencing capabilities and unique wallpapers. Chromebook X devices, expected to be priced between $350 and $500, will provide users with an elevated experience beyond the basic functionality of traditional Chromebooks. The devices are anticipated to be available in stores by the end of the year, coinciding with the release of ChromeOS version 115 or newer. 9to5Google reports: For the past few months, Google has been preparing new branding for above average devices from various Chromebook makers. Notably, we haven't yet seen any signs of Google making a Chromebook X device of its own, which is honestly a shame considering how long it's been since a Pixelbook has been released. The Chromebook X brand, which could change before launch, will appear somewhere on a laptop/tablet's chassis, with a mark that could be as simple as an "X" next to the usual "Chromebook" logo. There should also be a special boot screen instead of the standard "chromeOS" logo that's shown on all machines today.

Aside from the added "X," what actually sets a Chromebook X apart from other devices is the hardware inside. Specifically, Google appears to require a certain amount of RAM, a good-quality camera for video conferencing, and a (presumably) higher-end display. Beyond that, Google has so far made specific preparations for Chromebook X models to be built on four types of processors from Intel and AMD (though newer generations will likely also be included): AMD Zen 2+ (Skyrim), AMD Zen 3 (Guybrush), and Intel Core 12th Gen (Brya & Nissa).

To further differentiate Chromebook X models from low-end Chromebooks, Google is also preparing an exclusive set of features. As mentioned, one of the key focuses of Chromebook X is video conferencing, with Google requiring an up-to-spec camera. Complementing that hardware, Google is bringing unique features like Live Caption (adding generated captions to video calls), a built-in portrait blur effect, and "voice isolation." Earlier this year, we reported that ChromeOS was readying a set of "Time Of Day" wallpapers and screen savers that would change in appearance throughout the day, particularly to match the sunrise and sunset. We now know that these are going to be exclusive to Chromebook X devices. To ensure that those wallpapers only appear on Chromebook X and can't be forcibly enabled, Google is preparing a system it calls "feature management." At the moment, feature management is only used to check whether to enable Chromebook X exclusives. Based on that, some other exclusive features of Chromebook X include support for up to 16 virtual desks, "pinned" (available offline) files from Google Drive, and a revamped retail demo mode.

Security

Latest SUSE Linux Enterprise Goes All in With Confidential Computing 7

SUSE's latest release of SUSE Linux Enterprise 15 Service Pack 5 (SLE 15 SP5) has a focus on security, with the company claiming it is the first distro to offer full support for confidential computing to protect data. From a report: According to SUSE, the latest version of its enterprise platform is designed to deliver high-performance computing capabilities, with an inevitable mention of AI/ML workloads, plus it claims to have extended its live-patching capabilities. The release also comes just weeks after the community release openSUSE Leap 15.5 was made available, with the two sharing a common core. The Reg's resident open source guru noted that Leap 15.6 has now been confirmed as under development, which implies that a future SLE 15 SP6 should also be in the pipeline.

SUSE announced the latest version at its SUSECON event in Munich, along with a new report on cloud security issues claiming that more than 88 percent of IT teams have reported at least one cloud security incident over the past year. This appears to be the justification for the claim that SLE 15 SP5 is the first Linux distro to support "the entire spectrum" of confidential computing, allowing customers to run fully encrypted virtual machines on their infrastructure to protect applications and their associated data. Confidential computing relies on hardware-based security mechanisms in the processor to provide this protection, so enterprises hoping to take advantage of this will need to ensure their servers have the necessary support, such as AMD's Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP) and Intel's Trust Domain Extensions (TDX).
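On the hardware side, a quick way to see whether a host advertises the AMD features is the kernel's CPU flags in /proc/cpuinfo, where recent kernels expose sev, sev_es, and sev_snp on capable EPYC parts. The sketch below runs against a sample flags line rather than a live machine:

```shell
# Illustrative sample of a cpuinfo "flags" line on an SEV-SNP-capable host
# (not real output from any machine):
flags="fpu vme msr pae sev sev_es sev_snp"

case " $flags " in
    *" sev_snp "*) echo "SEV-SNP capable" ;;
    *" sev "*)     echo "SEV only" ;;
    *)             echo "no SEV support" ;;
esac                               # -> SEV-SNP capable

# On a real host:  grep -wo 'sev_snp' /proc/cpuinfo | head -1
```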
Intel

Intel To Spend $33 Billion in Germany in Landmark Expansion (reuters.com) 12

Intel will invest more than 30 billion euros ($33 billion) in Germany as part of its expansion push in Europe, the U.S. company said on Monday, marking the biggest investment by a foreign company in Europe's top economy. From a report: The deal to build two leading-edge semiconductor facilities in the eastern city of Magdeburg involves 10 billion euros in German subsidies, a person familiar with the matter said. Intel CEO Pat Gelsinger said he was grateful to the German government and the state of Saxony-Anhalt, where Magdeburg is located, for "fulfilling the vision of a vibrant, sustainable, leading-edge semiconductor industry in Germany and the EU." Under Gelsinger, Intel has been investing billions in building factories across three continents to restore its dominance in chipmaking and better compete with rivals AMD, Nvidia and Samsung.
Bug

Dev Boots Linux 292,612 Times to Find Kernel Bug (tomshardware.com) 32

Long-time Slashdot reader waspleg shared this story from Hot Hardware: Red Hat Linux developer Richard WM Jones has shared an eyebrow-raising tale of Linux bug hunting. Jones noticed that Linux 6.4 has a bug that causes it to hang on boot about 1 in 1,000 times. Jones set out to pinpoint the bug and prove he had caught it red-handed. However, his headlining travail, involving booting Linux 292,612 times (and another 1,000 times to confirm the bug), apparently "only took 21 hours." It also seems that the bug is less common on Intel hardware than on AMD-based machines.
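A quick back-of-envelope calculation shows why so many boots are needed: with a hang probability of roughly 1 in 1,000, each boot is an independent trial, so the number of boots required to observe the bug with high confidence follows from 1 - (1 - p)^n. A sketch of the arithmetic (not Jones's actual test harness):

```python
import math

def boots_for_confidence(p_hang, confidence):
    # Smallest n such that P(at least one hang in n boots) >= confidence,
    # i.e. 1 - (1 - p)**n >= confidence.
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_hang))

p = 1 / 1000
print(boots_for_confidence(p, 0.999))  # ~6,905 boots to be 99.9% sure of one hang
print(292_612 * p)                     # ~293 expected hangs over 292,612 boots
```

A few thousand boots would surface the bug; the nearly 300,000 boots in the story were about bisecting and confirming the fix, where each candidate kernel needs enough clean boots to distinguish "fixed" from "merely lucky."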
AI

AWS is Considering AMD's New AI Chips (reuters.com) 12

Amazon Web Services, the world's largest cloud computing provider, is considering using new artificial intelligence chips from AMD, though it has not made a final decision, an AWS executive told Reuters. From the report: The remarks came during an AMD event where the chip company outlined its strategy for the AI market, which is dominated by rival Nvidia. In interviews with Reuters, AMD Chief Executive Lisa Su outlined an approach to winning over major cloud computing customers by offering a menu of all the pieces needed to build the kinds of systems that power services similar to ChatGPT, while letting customers pick and choose which they want, using industry-standard connections.

While AWS has not made any public commitments to use AMD's new MI300 chips in its cloud services, Dave Brown, vice president of Elastic Compute Cloud at Amazon, said AWS is considering them. "We're still working together on where exactly that will land between AWS and AMD, but it's something that our teams are working together on," Brown said. "That's where we've benefited from some of the work that they've done around the design that plugs into existing systems."

Encryption

Hackers Can Steal Cryptographic Keys By Video-Recording Power LEDs 60 Feet Away (arstechnica.com) 26

An anonymous reader quotes a report from Ars Technica: Researchers have devised a novel attack that recovers the secret encryption keys stored in smart cards and smartphones by using cameras in iPhones or commercial surveillance systems to video-record power LEDs that show when the card reader or smartphone is turned on. The attacks enable a new way to exploit two previously disclosed side channels, a class of attack that measures physical effects that leak from a device as it performs a cryptographic operation. By carefully monitoring characteristics such as power consumption, sound, electromagnetic emissions, or the amount of time it takes for an operation to occur, attackers can assemble enough information to recover secret keys that underpin the security and confidentiality of a cryptographic algorithm. [...]

On Tuesday, academic researchers unveiled new research demonstrating attacks that provide a novel way to exploit these types of side channels. The first attack uses an Internet-connected surveillance camera to take a high-speed video of the power LED on a smart card reader -- or of an attached peripheral device -- during cryptographic operations. This technique allowed the researchers to pull a 256-bit ECDSA key off the same government-approved smart card used in Minerva. The other allowed the researchers to recover the private SIKE key of a Samsung Galaxy S8 phone by training the camera of an iPhone 13 on the power LED of a USB speaker connected to the handset, in a similar way to how Hertzbleed pulled SIKE keys off Intel and AMD CPUs. Power LEDs are designed to indicate when a device is turned on. They typically cast a blue or violet light that varies in brightness and color depending on the power consumption of the device they are connected to.

There are limitations to both attacks that make them unfeasible in many (but not all) real-world scenarios (more on that later). Despite this, the published research is groundbreaking because it provides an entirely new way to facilitate side-channel attacks. Not only that, but the new method removes the biggest barrier holding back previously existing methods from exploiting side channels: the need to have instruments such as an oscilloscope, electric probes, or other objects touching or being in proximity to the device being attacked. In Minerva's case, the device hosting the smart card reader had to be compromised for researchers to collect precise-enough measurements. Hertzbleed, by contrast, didn't rely on a compromised device but instead took 18 days of constant interaction with the vulnerable device to recover the private SIKE key. To attack many other side channels, such as the one in the World War II encrypted teletype terminal, attackers must have specialized and often expensive instruments attached or near the targeted device. The video-based attacks presented on Tuesday reduce or completely eliminate such requirements. All that's required to steal the private key stored on the smart card is an Internet-connected surveillance camera that can be as far as 62 feet away from the targeted reader. The side-channel attack on the Samsung Galaxy handset can be performed by an iPhone 13 camera that's already present in the same room.
Videos here and here show the video-capture process of a smart card reader and a Samsung Galaxy phone, respectively, as they perform cryptographic operations. "To the naked eye, the captured video looks unremarkable," adds Ars. "But by analyzing the video frames for different RGB values in the green channel, an attacker can identify the start and finish of a cryptographic operation."
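The analysis step described above can be illustrated with a toy version: compute the mean green-channel intensity per frame and look for the level shift that marks the start and end of a cryptographic operation. This is a simplified sketch on synthetic frames; the function names are hypothetical, and a real attack decodes actual video at much higher frame rates and with far subtler intensity changes:

```python
def mean_green(frame):
    # frame: list of (r, g, b) pixel tuples; average green intensity.
    return sum(g for _, g, _ in frame) / len(frame)

def operation_bounds(frames, threshold):
    """Return (first, last) indices of frames whose green level falls
    below the idle baseline -- taken here as the window where the
    device is busy with a cryptographic operation."""
    busy = [i for i, f in enumerate(frames) if mean_green(f) < threshold]
    return (busy[0], busy[-1]) if busy else None

# Synthetic trace: idle frames at green ~200, dipping to ~180 while "busy".
idle_frame = [(0, 200, 0)] * 4
busy_frame = [(0, 180, 0)] * 4
frames = [idle_frame] * 5 + [busy_frame] * 3 + [idle_frame] * 5
print(operation_bounds(frames, threshold=190))  # (5, 7)
```

With the operation window isolated, an attacker can time successive operations; the published attacks combine many such timing traces to recover key bits via the previously disclosed Minerva and Hertzbleed side channels.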
