Open Source

Developer Successfully Boots Up Linux on Google Drive (ersei.net) 42

It's FOSS writes: When it comes to Linux, we get to see some really cool, and sometimes quirky, projects (read: Hannah Montana Linux) that try to show off what's possible, and that's not a bad thing. One such quirky undertaking has recently surfaced: a sophomore trying to one-up their friend, who had booted Linux off NFS. With their work, they have been able to run Arch Linux on Google Drive.
Their ultimate idea included FUSE (which allows running filesystem code in userspace). The developer's blog post explains that when Linux boots, "the kernel unpacks a temporary filesystem into RAM which has the tools to mount the real filesystem... it's very helpful! We can mount a FUSE filesystem in that step and boot normally.... Thankfully, Dracut makes it easy enough to build a custom initramfs... I decided to build this on top of Arch Linux because it's relatively lightweight and I'm familiar with how it works."
Doing testing in an Amazon S3 bucket, they built an EFI image — then spent days trying to enable networking... And the adventure continues. ("Would it be possible to manually switch the root without a specialized system call? What if I just chroot?") After they'd made a few more tweaks, "I sit there, in front of my computer, staring. It can't have been that easy, can it? Surely, this is a profane act, and the spirit of Dennis Ritchie ought to have stopped me, right? Nobody stopped me, so I kept going... I build the unified EFI file, throw it on a USB drive under /BOOT/EFI, and stick it in my old server... This is my magnum opus. My Great Work. This is the mark I will leave on this planet long after I am gone: The Cloud Native Computer."
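The trick described above — mount a FUSE filesystem from the initramfs, then hand off to it as the real root — fits in a few lines of a boot-time hook. The fragment below is a hypothetical sketch only, using rclone's FUSE mount as a stand-in for the author's actual driver; the remote name `gdrive:` and all paths are illustrative, not taken from the blog post:

```shell
#!/bin/sh
# Hypothetical initramfs hook (not the blog's code). Assumes networking
# is already up and an rclone remote named "gdrive:" has been configured.

mkdir -p /sysroot

# Mount the cloud-hosted root filesystem via FUSE.
# --vfs-cache-mode full caches reads and writes locally, which matters
# when the "disk" is on the other side of an HTTP API.
rclone mount gdrive:rootfs /sysroot --daemon --vfs-cache-mode full

# Hand control to the real root: switch_root moves /sysroot to /,
# deletes the initramfs contents, and execs the real init.
exec switch_root /sysroot /sbin/init
```

This is also why the author's "what if I just chroot?" question comes up: `chroot` only changes the process's view of the filesystem, while `switch_root` is the sanctioned way to retire the initramfs entirely.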

Despite how silly this project is, there are a few less-silly uses I can think of, like booting Linux off of SSH, or perhaps booting Linux off of a Git repository and tracking every change in Git using gitfs. The possibilities are endless, despite the middling usefulness.

If there is anything I know about technology, it's that moving everything to The Cloud is the current trend. As such, I am prepared to commercialize this for any company wishing to leave their unreliable hardware storage behind and move entirely to The Cloud. Please request a quote if you are interested in True Cloud Native Computing.

Unfortunately, I don't know what to do next with this. Maybe I should install Nix?

Programming

Eclipse Foundation Releases Open-Source Theia IDE - Compatible with VS Code Extensions (adtmag.com) 25

"After approximately seven years in development, the Eclipse Foundation's Theia IDE project is now generally available," writes ADT magazine, "emerging from beta to challenge Microsoft's similar Visual Studio Code (VS Code) editor." The Eclipse Theia IDE is part of the Eclipse Cloud DevTools ecosystem. The Eclipse Foundation calls it "a true open-source alternative to VS Code," which was built on open source but includes proprietary elements, such as default telemetry, which collects usage data...

Theia was built on the same Monaco editor that powers VS Code, and it supports the same Language Server Protocol (LSP) and Debug Adapter Protocol (DAP) that provide IntelliSense code completions, error checking and other features. The Theia IDE also supports the same extensions as VS Code (via the Open VSX Registry instead of Microsoft's Visual Studio Code Marketplace), which are typically written in TypeScript and JavaScript. There are many, many more extensions available for VS Code in Microsoft's marketplace, while "Extensions for VS Code Compatible Editors" in the Open VSX Registry number 3,784 at the time of this writing...

The Eclipse Foundation emphasized another difference between its Theia IDE and VS Code: the surrounding ecosystem/community. "At the core of Theia IDE is its vibrant open source community hosted by the Eclipse Foundation," the organization said in a news release. "This ensures freedom for commercial use without proprietary constraints and fosters innovation and reliability through contributions from companies such as Ericsson, EclipseSource, STMicroelectronics, TypeFox, and more. The community-driven model encourages participation and adaptation according to user needs and feedback."

Indeed, the list of contributors to and adopters of the platform is extensive, also featuring Broadcom, Arm, IBM, Red Hat, SAP, Samsung, Google, Gitpod, Huawei and many others.

The It's FOSS blog has some screenshots and a detailed rundown.

ADT magazine stresses that there's also an entirely distinct (but related) project called the Eclipse Theia Platform (not IDE) which differs from VS Code by allowing developers "to create desktop and cloud IDEs using a single, open-source technology stack" [that can be used in open-source initiatives]. The Eclipse Theia platform "allows developers to customize every aspect of the IDE without forking or patching the code... fully tailored for the needs of internal company projects or for commercial resale as a branded product."
Windows

New Windows 11 Start Menu Annoyingly Hides Oft-Used Actions (pcworld.com) 100

An anonymous reader shares a report: A new test version of Windows 11 is available for Windows Insiders on the Dev Channel with Build 26120.961, which rolls out a significant change: a new Windows Start menu. You'll immediately notice that Microsoft has redesigned the Microsoft user account display, moving it to the center of the Start menu as soon as you click on the username or profile picture.

This new "account manager" feature gives you quicker access to your various Microsoft accounts and services, such as Microsoft 365, Xbox Game Pass, and OneDrive cloud storage. Unsurprisingly, Microsoft is using this prominent display to remind you of its own products and services, and the difference from the current Windows 11 Start menu is obvious.

AI

Amazon, Built by Retail, Invests in Its AI Future (wsj.com) 26

An anonymous reader shares a report: Amazon built a $2 trillion company through years of aggressive spending on its retail and logistics businesses. Its future gains will likely be determined by the billions designated to fund its artificial-intelligence push. Amazon is planning to spend more than $100 billion over the next decade on data centers, an impressive level of investment even for a company known for its spending ways. The Seattle company is now devoting more investment money to its cloud computing and AI infrastructure than to its sprawling network of e-commerce warehouses.

Amazon Web Services, the arm that manages Amazon's cloud business, has opened data centers for years, but executives said there is a surge in investment now to meet demand triggered by the excitement around AI. "We have to dive in. We have to figure it out," said John Felton, who took over as AWS's chief financial officer this year after spending most of his career in Amazon's retail fulfillment operations. The company's financial commitment reflects the importance and high costs of AI. Felton said building for AI today feels like building that massive delivery network in years past. "It's a little uncertain," he said. AWS is expanding in Virginia, Ohio and elsewhere.

Cloud

Could We Lower The Carbon Footprint of Data Centers By Launching Them Into Space? (cnbc.com) 114

The Wall Street Journal reports that a European initiative studying the feasibility of data centers in space "has found that the project could be economically viable" — while reducing the data center's carbon footprint.

And they add that according to coordinator Thales Alenia Space, the project "could also generate a return on investment of several billion euros between now and 2050." The study — dubbed Ascend, short for Advanced Space Cloud for European Net zero emission and Data sovereignty — was funded by the European Union and sought to compare the environmental impacts of space-based and Earth-based data centers, the company said. Moving forward, the company plans to consolidate and optimize its results. Space data centers would be powered by solar energy outside the Earth's atmosphere, aiming to contribute to the European Union's goal of achieving carbon neutrality by 2050, the project coordinator said... Space data centers wouldn't require water to cool them, the company said.
The 16-month study came to a "very encouraging" conclusion, project manager Damien Dumestier told CNBC. With some caveats... The facilities that the study explored launching into space would orbit at an altitude of around 1,400 kilometers (869.9 miles) — about three times the altitude of the International Space Station. Dumestier explained that ASCEND would aim to deploy 13 space data center building blocks with a total capacity of 10 megawatts in 2036, in order to achieve the starting point for cloud service commercialization... The study found that, in order to significantly reduce CO2 emissions, a new type of launcher that is 10 times less emissive would need to be developed. ArianeGroup, one of the 12 companies participating in the study, is working to speed up the development of such reusable and eco-friendly launchers. The target is to have the first eco-launcher ready by 2035 and then to allow for 15 years of deployment in order to have the huge capacity required to make the project feasible, said Dumestier...

Michael Winterson, managing director of the European Data Centre Association, acknowledges that a space data center would benefit from increased efficiency from solar power without the interruption of weather patterns — but the center would require significant amounts of rocket fuel to keep it in orbit. Winterson estimates that even a small 1 megawatt center in low earth orbit would need around 280,000 kilograms of rocket fuel per year at a cost of around $140 million in 2030 — a calculation based on a significant decrease in launch costs, which has yet to take place. "There will be specialist services that will be suited to this idea, but it will in no way be a market replacement," said Winterson. "Applications that might be well served would be very specific, such as military/surveillance, broadcasting, telecommunications and financial trading services. All other services would not competitively run from space," he added in emailed comments.

[Merima Dzanic, head of strategy and operations at the Danish Data Center Industry Association] also signaled some skepticism around security risks, noting, "Space is being increasingly politicised and weaponized amongst the different countries. So obviously, there is a security implications on what type of data you send out there."

It's not the only study looking at the potential of orbital data centers, notes CNBC. "Microsoft, which has previously trialed the use of a subsea data center that was positioned 117 feet deep on the seafloor, is collaborating with companies such as Loft Orbital to explore the challenges in executing AI and computing in space."

The article also points out that the total global electricity consumption from data centers could exceed 1,000 terawatt-hours in 2026. "That's roughly equivalent to the electricity consumption of Japan, according to the International Energy Agency."
Privacy

Amazon Is Investigating Perplexity Over Claims of Scraping Abuse (wired.com) 7

Amazon's cloud arm is investigating Perplexity AI for potential violations of its web services rules, the e-commerce giant told Wired. The startup, backed by Jeff Bezos' family fund and Nvidia, allegedly scraped websites that had explicitly forbidden such access.

Earlier this month, WIRED uncovered evidence of Perplexity using an unmarked IP address to bypass restrictions on major news sites. The company's CEO, Aravind Srinivas, claimed a third-party contractor was responsible but declined to name them.
Space

Phosphate In NASA's OSIRIS-REx Asteroid Sample Suggests Ocean World Origins (space.com) 19

Early analysis of the near-Earth asteroid Bennu has revealed unexpected evidence of magnesium-sodium phosphate, suggesting Bennu might have originated from a primitive ocean world. Space.com reports: On Earth, magnesium-sodium phosphate can be found in certain minerals and geological formations, as well as within living organisms, where it is present in various biochemical processes and is a component of bone and teeth. According to a NASA press release, however, its presence on Bennu surprised the research team because it wasn't seen in the OSIRIS-REx probe's remote sensing data prior to sample collection. The team says its presence "hints that the asteroid could have splintered off from a long-gone, tiny, primitive ocean world." "The presence and state of phosphates, along with other elements and compounds on Bennu, suggest a watery past for the asteroid," said Dante Lauretta, the mission's principal investigator. "Bennu potentially could have once been part of a wetter world. Although, this hypothesis requires further investigation."

The OSIRIS-REx spacecraft obtained a sample of Bennu's regolith on October 20, 2020 using its Touch-and-Go Sample Acquisition Mechanism (TAGSAM), which comprises a specialized sampler head situated on an articulated arm. Bennu is a small B-type asteroid, which are relatively uncommon carbonaceous asteroids. "[Bennu] was selected as the mission target in part because telescopic observations indicated a primitive, carbonaceous composition and water-bearing minerals," stated the team in their paper. [...] Further analysis on the samples revealed the prevailing component of the regolith sample is magnesium-bearing phyllosilicates, primarily serpentine and smectite -- types of rock typically found at mid-ocean ridges on Earth. A comparison of these serpentinites with their terrestrial counterparts provides possible insights into Bennu's geological past. "Offering clues about the aqueous environment in which they originated," wrote the team.

While Bennu's surface may have been altered by water over time, it still preserves some of the ancient characteristics scientists believe were present during the early solar system's days. Bennu's surface materials still contain some original features from the cloud of gas and dust from which our solar system's planets formed -- known as the protoplanetary disk. The team's study also confirmed the asteroid is rich in carbon, nitrogen and some organic compounds -- all of which, in addition to the magnesium phosphate, are essential components for life as we know it on Earth.

Patents

Microsoft's Canceled Xbox Cloud Console Gets Detailed In New Patent (windowscentral.com) 4

Microsoft's canceled Xbox cloud console, codenamed Keystone, has been detailed in a new patent spotted by Windows Central's Zac Bowden. From the report: Back in 2021, Microsoft announced that it was working on a dedicated streaming device for Xbox Game Pass. That device was later revealed to be codenamed Keystone, which took the form of a streaming box that would sit under your TV, cost a fraction of the price of a normal Xbox, and enable the ability to play Xbox games via the cloud. Unfortunately, it appears Microsoft has since scrapped plans to ship Xbox Keystone due to an inability to bring the price down to a level where it made sense for customers. Xbox CEO Phil Spencer is on record saying the device should have cost around $99 or $129, but the company was unable to achieve this.

Thanks to a patent discovered by Windows Central, we can finally take a closer look at the box Microsoft had conjured up internally. First up, the patent reveals that the console took the form of an even square with a circle shape on top, similar to the black circular vent on an Xbox Series S. The front of the box had the Xbox power button and a USB-A port. Around the back, there were three additional ports: HDMI, Ethernet, and power. On the right side of the console there appears to be an Xbox controller pairing button, and the underside featured a circular "Hello from Seattle" plate that the console sat on, similar to the Xbox Series X. This patent was filed in June 2022, which was around the time when the first details of Xbox Keystone were being revealed.

China

US Probing China Telecom, China Mobile Over Internet, Cloud Risks (reuters.com) 23

The Biden administration is investigating China Mobile, China Telecom and China Unicom over concerns the firms could exploit access to American data through their U.S. cloud and internet businesses by providing it to Beijing, Reuters reported Tuesday, citing sources familiar with the matter. From the report: The companies still have a small presence in the United States, for example, providing cloud services and routing wholesale U.S. internet traffic. That gives them access to Americans' data even after telecom regulators barred them from providing telephone and retail internet services in the United States.

Reuters found no evidence the companies intentionally provided sensitive U.S. data to the Chinese government or committed any other type of wrongdoing. The investigation is the latest effort by Washington to prevent Beijing from exploiting Chinese firms' access to U.S. data to harm companies, Americans or national security, as part of a deepening tech war between the geopolitical rivals. It shows the administration is trying to shut down all remaining avenues for Chinese companies already targeted by Washington to obtain U.S. data.

China

Chinese Rocket Seen Falling On a Village Spewing Highly Toxic Chemicals (gizmodo.com) 27

Passant Rabie reports via Gizmodo: A video circulating online appears to show debris from a Chinese rocket falling above a populated area, with residents running for cover as a heavy cloud of dark yellow smoke trails across the sky in a frightening scene. The suspected debris may have come from China's Long March 2C rocket, which launched on Saturday, June 22, carrying a joint mission by China and France to study Gamma-ray bursts. The launch was declared a success, but its aftermath was captured by videos posted to Chinese social media sites.

The videos show what appears to be the first stage rocket booster of the Long March 2C rocket tumbling uncontrollably over a village in southwest China, while local residents cover their ears and run for shelter from the falling debris. There are no reports of injuries or damage to property. That said, unverified video and images show a gigantic cloud erupting at the site of the crashed rocket, and the booster itself seemingly next to a roadway. The first stage of the rocket can be seen leaking fuel, the color of which is consistent with nitrogen tetroxide. The chemical compound is a strong oxidizing agent that is used for rocket propulsion, but it can be fatally toxic, according to Jonathan McDowell, astrophysicist at the Harvard-Smithsonian Center for Astrophysics.

"It's known in the rocket industry as BFRC, a big fucking red cloud," McDowell told Gizmodo. "And when you see a BFRC, you run for your life." Nitrogen tetroxide was accepted as the rocket propellant oxidizer of choice in the early 1950s by the U.S.S.R. and the United States, however it became less commonly used over the years because it is extremely toxic, according to NASA (PDF). If it comes in contact with skin, eyes, or respiratory system, it can destroy human tissue, and if inhaled through the lungs, it can lead to a build up of fluids or, in extreme cases, death. "It's pretty scary, but this is just how the Chinese do business," McDowell told Gizmodo. "They have a different level of acceptable public risk."
"I think over a 10 year period, we may see the older rockets phased out but they're not in any hurry to do so," added McDowell. "They're still launching one a week or something like that, and they are really quite dangerous."
Microsoft

Microsoft Ends 'Project Natick' Underwater Data Center Experiment Despite Success (techspot.com) 35

Microsoft has decided to end its Project Natick experiment, which involved submerging a datacenter capsule 120 miles off the coast of Scotland to explore the feasibility of deploying underwater datacenters. TechSpot's Rob Thubron reports: Project Natick's origins stretch all the way back to 2013. Following a three-month trial in the Pacific, a submersible data center capsule was deployed 120 miles off the coast of Scotland in 2018. It was brought back to the surface in 2020, offering what were said to be promising results. Microsoft lost six of the 855 servers that were in the capsule during its time underwater. In a comparison experiment being run simultaneously on dry land, it lost eight out of 135 servers. Microsoft noted that the constant temperature stability of the external seawater was a factor in the experiment's success. It also highlighted how the data center was filled with inert nitrogen gas that protected the servers, as opposed to the reactive oxygen gas in the land data center.
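Put as rates, the comparison is stark: roughly 0.7% of the underwater servers failed versus nearly 6% of the land-based control group, so the land deployment's failure rate was about eight times higher. A quick check of the arithmetic from the article's numbers:

```python
# Failure rates from the figures reported for Project Natick:
# 6 of 855 servers lost underwater vs. 8 of 135 in the land comparison.
underwater = 6 / 855 * 100   # percent failed underwater
on_land = 8 / 135 * 100      # percent failed on land

print(f"underwater: {underwater:.2f}%")       # -> underwater: 0.70%
print(f"on land:    {on_land:.2f}%")          # -> on land:    5.93%
print(f"ratio: {on_land / underwater:.1f}x")  # -> ratio: 8.4x
```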

Despite everything going so well, Microsoft is discontinuing Project Natick. "I'm not building subsea data centers anywhere in the world," Noelle Walsh, the head of the company's Cloud Operations + Innovation (CO+I) division, told DatacenterDynamics. "My team worked on it, and it worked. We learned a lot about operations below sea level and vibration and impacts on the server. So we'll apply those learnings to other cases," Walsh added.

Microsoft also patented a high-pressure data center in 2019 and an artificial reef data center in 2017, but it seems the company is putting resources into traditional builds for now. "I would say now we're getting more focused," Walsh said. "We like to do R&D and try things out, and you learn something here and it may fly over there. But I'd say now, it's very focused." "While we don't currently have data centers in the water, we will continue to use Project Natick as a research platform to explore, test, and validate new concepts around data center reliability and sustainability, for example with liquid immersion."

Space

Tuesday SpaceX Launches a NOAA Satellite to Improve Weather Forecasts for Earth and Space (space.com) 20

Tuesday a SpaceX Falcon Heavy rocket will launch a special satellite — a state-of-the-art weather-watcher from America's National Oceanic and Atmospheric Administration.

It will complete a series of four GOES-R satellite launches that began in 2016. Space.com drills down into how these satellites have changed weather forecasts: More than seven years later, with three of the four satellites in the series orbiting the Earth, scientists and researchers say they are pleased with the results and how the advanced technology has been a game changer. "I think it has really lived up to its hype in thunderstorm forecasting. Meteorologists can see the convection evolve in near real-time and this gives them enhanced insight on storm development and severity, making for better warnings," John Cintineo, a researcher from NOAA's National Severe Storms Laboratory, told Space.com in an email.

"Not only does the GOES-R series provide observations where radar coverage is lacking, but it often provides a robust signal before radar, such as when a storm is strengthening or weakening. I'm sure there have been many other improvements in forecasts and environmental monitoring over the last decade, but this is where I have most clearly seen improvement," Cintineo said. In addition to helping predict severe thunderstorms, each satellite has collected images and data on heavy rain events that could trigger flooding, detected low clouds and fog as it forms, and has made significant improvements to forecasts and services used during hurricane season. "GOES provides our hurricane forecasters with faster, more accurate and detailed data that is critical for estimating a storm's intensity, including cloud top cooling, convective structures, specific features of a hurricane's eye, upper-level wind speeds, and lightning activity," Ken Graham, director of NOAA's National Weather Service told Space.com in an email.

Instruments such as the Advanced Baseline Imager have three times more spectral channels, four times the image quality, and five times the imaging speed of the previous GOES satellites. The Geostationary Lightning Mapper, the first of its kind in orbit on the GOES-R series, allows scientists to view lightning 24/7, both strikes that make contact with the ground and flashes from cloud to cloud. "GOES-U and the GOES-R series of satellites provides scientists and forecasters weather surveillance of the entire western hemisphere, at unprecedented spatial and temporal scales," Cintineo said. "Data from these satellites are helping researchers develop new tools and methods to address problems such as lightning prediction, sea-spray identification (sea-spray is dangerous for mariners), severe weather warnings, and accurate cloud motion estimation. The instruments from GOES-R also help improve forecasts from global and regional numerical weather models, through improved data assimilation."

The final satellite, launching Tuesday, includes a new sensor — the Compact Coronagraph — "that will monitor weather outside of Earth's atmosphere, keeping an eye on what space weather events are happening that could impact our planet," according to the article.

"It will be the first near real time operational coronagraph that we have access to," Rob Steenburgh, a space scientist at NOAA's Space Weather Prediction Center, told Space.com on the phone. "That's a huge leap for us because up until now, we've always depended on a research coronagraph instrument on a spacecraft that was launched quite a long time ago."
Security

Linux Foundation's 'Open Source Security Foundation' Launches New Threat Intelligence Mailing List (openssf.org) 4

The Linux Foundation's "Open Source Security Foundation" (or OpenSSF) is a cross-industry forum to "secure the development, maintenance, and consumption of the open source software". And now the OpenSSF has launched a new mailing list "which aims to monitor the threat landscape of open-source project vulnerabilities," reports I Programmer, "in order to provide real time alerts to anyone subscribed."

The Record explains its origins: OpenSSF General Manager Omkhar Arasaratnam said that at a recent open source event, members of the community ran a tabletop exercise where they simulated a security incident involving the discovery of a zero-day vulnerability. They worked their way through the open source ecosystem — from cloud providers to maintainers to end users — clearly defining how the discovery of a vulnerability would be dealt with from top to bottom. But one of the places where they found a gap is in the dissemination of information widely.

"What we lack within the open source community is a place in which we can convene to distribute indicators of compromise (IOCs) and threats, tactics and procedures (TTPs) in a way that will allow the community to identify threats when our packages are under attack," Arasaratnam said... "[W]e're going to be standing up a mailing list for which we can share this information throughout the community and there can be discussion of things that are being seen. And that's one of the ways that we're responding to this gap that we saw...." The Siren mailing list will encourage public discussions on security flaws, concepts, and practices in the open source community with individuals who are not typically engaged in traditional upstream communication channels...

Members of the Siren email list will get real-time updates about emerging threats that may be relevant to their projects... OpenSSF has created a signup page for those interested and urged others to share the email list to other open source community members...

OpenSSF ecosystem strategist Christopher Robinson (also security communications director for Intel) told the site he expects government agencies and security researchers to be involved in the effort. And he issued this joint statement with OpenSSF ecosystem strategist Bennett Pursell: By leveraging the collective knowledge and expertise of the open source community and other security experts, the OpenSSF Siren empowers projects of all sizes to bolster their cybersecurity defenses and increase their overall awareness of malicious activities. Whether you're a developer, maintainer, or security enthusiast, your participation is vital in safeguarding the integrity of open source software.
In less than a month, the mailing list has already grown to over 800 members...
United Kingdom

Microsoft Admits No Guarantee of Sovereignty For UK Policing Data (computerweekly.com) 88

An anonymous reader shared this report from Computer Weekly: Microsoft has admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure, despite its systems being deployed throughout the criminal justice sector.

According to correspondence released by the Scottish Police Authority (SPA) under freedom of information (FOI) rules, Microsoft is unable to guarantee that data uploaded to a key Police Scotland IT system — the Digital Evidence Sharing Capability (DESC) — will remain in the UK as required by law. While the correspondence has not been released in full, the disclosure reveals that data hosted in Microsoft's hyperscale public cloud infrastructure is regularly transferred and processed overseas; that the data processing agreement in place for the DESC did not cover UK-specific data protection requirements; and that while the company has the ability to make technical changes to ensure data protection compliance, it is only making these changes for DESC partners and not other policing bodies because "no one else had asked".

The correspondence also contains acknowledgements from Microsoft that international data transfers are inherent to its public cloud architecture. As a result, the issues identified with the Scottish Police will equally apply to all UK government users, many of whom face similar regulatory limitations on the offshoring of data. The recipient of the FOI disclosures, Owen Sayers — an independent security consultant and enterprise architect with over 20 years' experience in delivering national policing systems — concluded it is now clear that UK policing data has been travelling overseas and "the statements from Microsoft make clear that they 100% cannot comply with UK data protection law".

Security

Hacker Claims To Have 30 Million Customer Records From Ticket Giant TEG (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: A hacker is advertising customer data allegedly stolen from the Australia-based live events and ticketing company TEG on a well-known hacking forum. On Thursday, a hacker put up for sale the alleged stolen data from TEG, claiming to have information of 30 million users, including the full name, gender, date of birth, username, hashed passwords, and email addresses. In late May, TEG-owned ticketing company Ticketek disclosed a data breach affecting Australian customers' data, "which is stored in a cloud-based platform, hosted by a reputable, global third party supplier."

The company said that "no Ticketek customer account has been compromised," thanks to the encryption methods used to store their passwords. TEG conceded, however, that "customer names, dates of birth and email addresses may have been impacted" -- data that would line up with that advertised on the hacking forum. The hacker included a sample of the alleged stolen data in their post. TechCrunch confirmed that at least some of the data published on the forum appears legitimate by attempting to sign up for new accounts using the published email addresses. In a number of cases, Ticketek's website gave an error, suggesting the email addresses are already in use.
There's evidence that the company's "cloud-based platform" provider is Snowflake, "which has been at the center of a recent series of data thefts affecting several of its customers, including Ticketmaster, Santander Bank, and others," notes TechCrunch.

"A now-deleted post on Snowflake's website from January 2023 was titled: 'TEG Personalizes Live Entertainment Experiences with Snowflake.' In 2022, consulting company Altis published a case study (PDF) detailing how the company, working with TEG, 'built a modern data platform for ingesting streaming data into Snowflake.'"
Businesses

Stability AI Appoints New CEO 4

British startup Stability AI has appointed Prem Akkaraju as its new CEO. The 51-year-old Akkaraju, former CEO of visual effects company Weta Digital, "is part of a group of investors including former Facebook President Sean Parker that has stepped in to save Stability with a cash infusion that could result in a lower valuation for the firm," reports The Information (paywalled). "The new funding will likely shrink the stakes of some existing investors, who have collectively contributed more than $100 million."

In March, Stability AI founder and CEO Emad Mostaque stepped down from the role to pursue decentralized AI. "In a series of posts on X, Mostaque opined that one can't beat 'centralized AI' with more 'centralized AI,' referring to the ownership structure of top AI startups such as OpenAI and Anthropic," reported TechCrunch at the time. The move followed a report in April that claimed the company ran out of cash to pay its bills for its rented cloud GPUs. Last year, the company raised millions at a $1 billion valuation.
SuSE

SUSE Upgrades Its Distros With 19 Years of Support (zdnet.com) 36

An anonymous reader quotes a report from ZDNet: At SUSECon in Berlin, SUSE, a global Linux and cloud-native software leader, announced significant enhancements across its entire Linux distribution family. These new capabilities focus on providing faster time-to-value and reduced operational costs, emphasizing the importance of choice in today's complex IT landscape. SUSE Linux Enterprise Server (SLES) 15 Service Pack (SP) 6 is at the heart of these upgrades. This update future-proofs IT workloads with a new Long Term Service Pack Support (LTSS) Core. How long is long-term? Would you believe 19 years? This gives SLES the longest-term support period in the enterprise Linux market. Even Ubuntu, for which Canonical recently extended its LTS to 12 years, doesn't come close.

You may ask yourself, "Why 19 years?" SUSE General Manager of Business Critical Linux (BCL) Rick Spencer explained in an interview that the reason is that at 03:14:08 Greenwich Mean Time (GMT, aka Coordinated Universal Time) on Tuesday, January 19, 2038, we reach the end of computing time. Well, not really, but Linux, and all the other Unix-based operating systems, including some versions of MacOS, hit what's called the Year 2038 problem. That's when the time-keeping code in 32-bit Unix-based operating systems exhausts the seconds it has been counting since the Unix Epoch -- 00:00:00 GMT on January 1, 1970, as far as Linux and Unix systems are concerned -- and wraps around to a negative number, which those systems read as a date in December 1901. Just like the Y2K bug, that means all unpatched 32-bit operating systems and software will have fits. The Linux kernel itself had the problem fixed in 2020's Linux 5.6 kernel, but many other programs haven't dealt with it. If you're still running SLES 15 SP6 then, you'll be covered. I strongly suggest upgrading before that date, but if you want to stick with that distro to the bitter end, you can.
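The 2038 rollover is easy to demonstrate. A minimal Python sketch (not from the article) showing where a signed 32-bit second counter runs out, and what date the wrapped value decodes to:

```python
import datetime
import struct

# A signed 32-bit time_t can count at most 2**31 - 1 seconds
# past the Unix Epoch (00:00:00 UTC, January 1, 1970).
max_t = 2**31 - 1
last = datetime.datetime.fromtimestamp(max_t, tz=datetime.timezone.utc)
print(last)  # 2038-01-19 03:14:07+00:00 -- the last representable second

# One second later the counter overflows: reinterpreting the unsigned
# 32-bit bit pattern as a signed 32-bit integer yields -2**31 ...
wrapped = struct.unpack("<i", struct.pack("<I", (max_t + 1) & 0xFFFFFFFF))[0]
print(datetime.datetime.fromtimestamp(wrapped, tz=datetime.timezone.utc))
# ... which a 32-bit system reads as 1901-12-13 20:45:52+00:00
```

The kernel-side fix in Linux 5.6 widened the relevant interfaces to 64-bit time, which pushes the overflow out by roughly 292 billion years; userspace programs that still store timestamps in 32-bit integers remain affected.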
The new SLES also boasts enhanced security features like confidential computing support with encryption in memory, utilizing Intel TDX and AMD SEV processors, along with remote attestation via SUSE Manager. Additionally, SLES for SAP Applications 15 SP6 offers a secure and reliable platform for running mission-critical SAP workloads, incorporating innovations from Trento to help system administrators avoid infrastructure issues.
Technology

Former Cisco CEO: Nvidia's AI Dominance Mirrors Cisco's Internet Boom, But Market Dynamics Differ (wsj.com) 24

Nvidia has become the U.S.'s most valuable listed company, riding the wave of the AI revolution in a way that brings back memories of a similar boom from earlier this century. The last time a big provider of computing infrastructure was the most valuable U.S. company was in March 2000, when networking-equipment company Cisco took that spot at the height of the dot-com boom.

Former Cisco CEO John Chambers, who led the company during the dot-com boom, said the implications of AI are larger than the internet and cloud computing combined, but the dynamics differ. "The implications in terms of the size of the market opportunity is that of the internet and cloud computing combined," he told WSJ. "The speed of change is different, the size of the market is different, the stage when the most valuable company was reached is different." The story adds: Chambers said [Nvidia CEO] Huang was working from a different playbook than Cisco but was facing some similar challenges. Nvidia has a dominant market share, much like Cisco did with its products as the internet grew, and is also fending off rising competition. Also like Nvidia, Cisco benefited from investments before the industry became profitable. "We were absolutely in the right spot at the right time, and we knew it, and we went for it," Chambers said.
IT

Asda IT Staff Shuffled Off To TCS Amid Messy Tech Divorce From Walmart (theregister.com) 22

An anonymous reader quotes a report from The Register: Asda is transferring more than 100 internal IT workers to Indian outsourcing company TCS as it labors to meet deadlines to move away from IT systems supported by previous owner Walmart by the end of the year. According to documents seen by The Register, a collective consultation for a staff transfer under TUPE -- an arrangement by which employment rights are protected under UK law -- begins today (June 17). The UK's third-largest supermarket expects affected staff to meet line managers from June 24, while the transfer date is set for September 16. Contractors will be let go at the end of their current contracts. Asda employs around 5,000 staff in its UK offices. Between 130 and 135 members of the IT team have entered the collective consultation to move to TCS.

The move came as private equity company TDR Capital gained majority ownership of the supermarket group. It was acquired from Walmart by the brothers Mohsin and Zuber Issa and TDR Capital in February 2021 at a value of 6.8 billion pounds. The US retail giant retained "an equity investment." Project Future is a massive shift in the retailer's IT function. It is upgrading a legacy ERP system from SAP ECC -- run on-prem by Walmart -- to the latest SAP S/4HANA in the Microsoft Azure cloud, changing the application software, infrastructure, and business processes at the same time. Other applications are also set to move to Azure, including ecommerce and store systems, while Asda is creating an IT security team for the first time -- the work had previously been carried out by its US owner.

Asda signed up to SAP's "RISE" program in a deal to lift, shift, and transform its ERP system -- a vital plank in the German vendor's strategy to get customers to the cloud -- in December 2021. But the project has already been beset by delays. The UK retailer had signed a three-year deal with Walmart in February 2021 to continue to support its existing system, but was forced to renegotiate to extend the arrangement, saying it planned to move away from the legacy systems before the end of 2024. Although one insider told El Reg that deadline was "totally unachievable," the Walmart deal extends to September 2025, giving the UK retailer room to accommodate further delays without renegotiating the contract.

Asda has yet to migrate a single store to the new infrastructure. The first -- Yorkshire's Otley -- is set to go live by the end of June. One insider pointed out that project managers were trying to book resources from the infrastructure team for later this year and into the next, but, as they were set to transfer to TCS, the infrastructure team did not know who would be doing the work or what resources would be available. "They have a thousand stores to migrate and they're going to be doing that with an infrastructure team who have their eyes on the door. They'll be very professional, but they're not going above and beyond and doing on-call they don't have to do," the insider said.

Supercomputing

$2.4 Million Texas Home Listing Boasts Built-In 5,786 sq ft Data Center (tomshardware.com) 34

A Zillow listing for a $2.4 million house in a Dallas suburb is grabbing attention for its 5,786-square-foot data center with immersion cooling tanks, massive server racks, and two separate power grids. Tom's Hardware reports: With a brick exterior, cute paving, and mini-McMansion arch stylings, the building certainly looks to be a residential home for the archetypal Texas family. Prospective home-buyers will thus be disappointed by the 0 bedroom, 1 bathroom setup, which becomes a warehouse-feeling office from the first step inside where you are met with a glass-shielded reception desk in a white-brick corridor. The "Crypto Collective" branding betrays the former life of the unit, which served admirably as a crypto mining base.

The purchase of the "upgraded turnkey Tier 2 Data Center" will include all of its cooling and power infrastructure. Three Engineered Fluids "SLICTanks," single-phase liquid immersion cooling tanks for use with dielectric coolant, will come with pumps and a 500kW dry cooler. The tanks are currently filled with at least 80 mining computers visible from the photos, though the SLICTanks can be configured to fit more machines. Also visible in proximity to the cooling array is a deep row of classic server racks and a staggering amount of networking.

The listing advertises a host of potential uses for future customers, from "AI services, cloud hosting, traditional data center, servers or even Bitcoin Mining". Also packed into the 5,786 square feet of real estate are two separate power grids, five HVAC units, four levels of warehouse-style storage aisles, a lounge/office space, and a fully-paved backyard. In other good news, its future corporate residents will not have an HOA to deal with, and will be only 20 minutes outside the heart of Dallas, sitting just out of earshot of two major highways.

Slashdot Top Deals