Programming

Stack Overflow Reveals Which Programming Languages Are Most Used At Night (stackoverflow.blog) 97

Stack Overflow data scientist David Robinson recently calculated when people visit the popular programming question-and-answer site, and then broke those results down by programming language. Quoting his results:
  • "C# programmers start and stop their day earlier, and tend to use the language less in the evenings. This might be because C# is often used at finance and enterprise software companies, which often start earlier and have rigid schedules."
  • "C programmers start the day a bit later, keep using the language in the evening, and stay up the longest. This suggests C may be particularly popular among hobbyist programmers who code during their free time (or perhaps among summer school students doing homework)."
  • "Python and Javascript are somewhere in between: Python and Javascript developers start and end the day a little later than C# users, and are a little less likely than C programmers to work in the evening."

The site also released an interactive app which lets users see how the results for other languages compare to C#, JavaScript, Python, and C, though of those four, "C# would count as the 'most nine-to-five,' and C as the least."

And they've also calculated the technologies used most between 9 and 5 (which "include many Microsoft technologies, such as SQL Server, Excel, VBA, and Internet Explorer, as well as technologies like SVN and Oracle that are frequently used at enterprise software companies.") Meanwhile, the technologies most often used outside the 9-5 workday "include web frameworks like Firebase, Meteor, and Express, as well as graphics libraries like OpenGL and Unity. The functional language Haskell is the tag most visited outside of the workday; only half of its visits happen between 9 and 5."
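The article's core metric -- the share of a tag's visits that falls inside the 9-to-5 window -- is easy to sketch. The records and function name below are hypothetical stand-ins for Stack Overflow's traffic logs, not Robinson's actual code:

```python
from collections import defaultdict

# Hypothetical (tag, local-hour-of-visit) records standing in for real traffic logs.
visits = [
    ("c#", 9), ("c#", 11), ("c#", 14), ("c#", 20),
    ("haskell", 10), ("haskell", 22), ("haskell", 23), ("haskell", 2),
]

def workday_share(records, start=9, end=17):
    """Fraction of each tag's visits occurring between `start` and `end` (local hour)."""
    totals, in_window = defaultdict(int), defaultdict(int)
    for tag, hour in records:
        totals[tag] += 1
        if start <= hour < end:
            in_window[tag] += 1
    return {tag: in_window[tag] / totals[tag] for tag in totals}

print(workday_share(visits))  # {'c#': 0.75, 'haskell': 0.25}
```

With this toy data, C# looks "nine-to-five" and Haskell does not -- the same shape as the article's findings, though the real analysis normalizes over far more traffic.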


The Internet

Newest Firefox Browser Bashes Crashes (cnet.com) 133

Nobody likes it when a web browser bombs instead of opening up a website. Mozilla is addressing that in the newly released v53 of its Firefox browser, which it claims crashes 10 percent fewer times. CNET adds: The improvement comes through the first big debut of a part of Project Quantum, an effort launched in 2016 to beef up and speed up Firefox. To improve stability, Firefox 53 on Windows machines isolates software called a compositor that's in charge of painting elements of a website onto your screen. That isolation into a separate computing process cuts down on trouble spots that can occur when Firefox employs computers' graphics chips, Mozilla said.
Android

Benchmarks Show Galaxy S8 With Snapdragon 835 Is a Much Faster Android Handset (hothardware.com) 81

MojoKid writes: Samsung recently launched the Galaxy S8 series of Android smartphones to much fanfare, but only recently did the handsets begin to arrive in the market for testing and review. Though the high-polish styling of the Galaxy S8 and Galaxy S8+ may or may not appeal to you, few would argue with Samsung's claims of significant performance gains and improved battery life. As it turns out, in deep-dive testing and benchmarking, the Galaxy S8 series is significantly faster than any other Android handset currently on the market, especially when it comes to graphics and gaming workloads. The Qualcomm Snapdragon 835 processor on board the GS8 is currently a Samsung exclusive, though it's expected to arrive in other handsets later this year. The Adreno 540 graphics engine on board the new Snapdragon chip is roughly 25% faster than the previous-generation 820/821 series, though the chip is only about 10 percent faster in standard CPU-intensive tasks. Regardless, these are appreciable gains, especially in light of the fact that the new Galaxy S8 also has much better battery life than the previous-generation Galaxy S7 series. The Samsung Galaxy S8 (5.8-inch) and Galaxy S8+ (6.2-inch) are expected to arrive at retail this week, and though pricing is carrier-dependent, they list for roughly $720 and $850, respectively, off contract.
Desktops (Apple)

StarCraft Is Now Free, Nearly 20 Years After Its Release (techcrunch.com) 237

An anonymous reader quotes a report from TechCrunch: Nearly two decades after its 1998 release, StarCraft is now free. Legally! Blizzard has just released the original game -- plus the Brood War expansion -- for free for both PC and Mac. You can find it here. Up until a few weeks ago, getting the game with its expansion would've cost $10 to $15. The company says they've also used this opportunity to improve the game's anti-cheat system, add "improved compatibility" with Windows 7, 8.1, and 10, and fix a few long-standing bugs. So why now? The company is about to release a remastered version of the game in just a few months, its graphics/audio overhauled for modern systems. Once that version hits, the original will probably look a bit ancient by comparison -- so they might as well use it to win over a few new fans, right?
AMD

AMD Launches Higher Performance Radeon RX 580 and RX 570 Polaris Graphics Cards (hothardware.com) 93

Reader MojoKid writes: In preparation for the impending launch of AMD's next-generation Vega GPU architecture, which will eventually reside at the top of the company's graphics product stack, the company unveiled a refresh of its mainstream graphics card line-up with more-powerful Polaris-based GPUs. The new AMD Radeon RX 580 and RX 570 are built around AMD's Polaris 20 GPU, which is an updated revision of Polaris 10. The Radeon RX 580 features 36 Compute Units, with a total of 2,304 shader processors and boost / base GPU clocks of 1340MHz and 1257MHz, respectively, along with 8GB of GDDR5 over a 256-bit interface. The Radeon RX 580 offers up a total of 6.17 TFLOPs of compute performance with up to 256GB/s of peak memory bandwidth. Though based on the same chip, the Radeon RX 570 has only 32 active CUs and 2048 shader processors. Boost and base reference clocks are 1244MHz and 1168MHz, respectively, with 4GB of GDDR5 memory also connected over a 256-bit interface. At reference clocks, the peak compute performance of the Radeon RX 570 is 5.1TFLOPs with 224GB/s of memory bandwidth. In the benchmarks, the AMD Radeon RX 580 clearly outpaced AMD's previous-gen Radeon RX 480, and was faster than an NVIDIA GeForce GTX 1060 Founders Edition card more often than not. It was more evenly matched with factory-overclocked OEM GeForce GTX 1060 cards, however. Expected retail price points are around $245 and $175 for 8GB Radeon RX 580 and 4GB RX 570 cards, though more affordable options will also be available.
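The compute and bandwidth figures quoted above follow from standard GPU arithmetic: peak FP32 throughput is shader count × 2 ops per clock (one fused multiply-add) × boost clock, and memory bandwidth is bus width × effective data rate. A quick sketch to check the article's numbers -- note the per-pin data rates (8 Gbps for the RX 580, 7 Gbps for the RX 570) are inferred from the quoted bandwidths, not stated in the summary:

```python
def peak_tflops(shaders, boost_mhz):
    # Each shader retires 2 FP32 ops per clock (one fused multiply-add).
    return shaders * 2 * boost_mhz * 1e6 / 1e12

def mem_bandwidth_gbs(bus_bits, gbps_per_pin):
    # Bus width in bits times per-pin rate, converted from bits to bytes.
    return bus_bits * gbps_per_pin / 8

print(peak_tflops(2304, 1340))    # RX 580: ~6.17 TFLOPS
print(peak_tflops(2048, 1244))    # RX 570: ~5.10 TFLOPS
print(mem_bandwidth_gbs(256, 8))  # RX 580: 256.0 GB/s
print(mem_bandwidth_gbs(256, 7))  # RX 570: 224.0 GB/s
```

All four results line up with the figures in the summary.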
United States

Steve Ballmer's New Project: Find Out How the Government Spends Your Money (theverge.com) 249

Former Microsoft CEO Steve Ballmer isn't satisfied with owning the Los Angeles Clippers and teaching at Stanford and USC. On Tuesday, the billionaire announced USAFacts, his new startup that aims to improve political discourse by making government financial data easier to access. A small "army" of economists, professors and other professionals will be looking into and publishing data structured similarly to the 10-K filings companies issue each year -- expenses, revenues and key metrics pulled from dozens of government data sources and compiled into a single massive collection of tables. From a report on The Verge: The nonpartisan site traces $5.4 trillion in government spending under four categories derived from language in the US Constitution. Defense spending, for example, is categorized under the header "provide for the common defense," while education spending is under "secure the blessing of liberty to ourselves and our posterity." Spending allocation and revenue sources are each mapped out in blue and pink graphics, with detailed breakdowns along federal, state and local lines. Users can also search for specific datasets, such as airport revenue or crime rates, and the site includes a report of "risk factors" that could inhibit economic growth. The New York Times has the story on how this startup came to be.
Movies

Slashdot Asks: What's Your Favorite Sci-Fi Movie? 1222

Many say it's the golden age of science fiction cinema, and rightly so: every month brings a couple of movies that bend the rules of science to explore possibilities that sometimes make us seriously wonder whether the things we see on the big screen could actually be true. Thanks to advances in graphics and ever-increasing video resolution, we're leaving theaters with ever more visually striking memories. That said, there are plenty of movies made back in the day that are far from being displaced by the reboot spree Hollywood is currently on. With readers suggesting this question every week, we think it's time we finally asked: what's your favorite science-fiction movie? Also, what are some other sci-fi movies that you really enjoyed but feel haven't received enough praise -- or even much acknowledgement?

Editor's note: the story has been moved up on the front page due to its popularity.
Hardware

Ask Slashdot: What Was Your First Home Computer? 857

We've recently seen stories about old computers and sys-ops resurrecting 1980s BBS's, but now an anonymous reader has a question for all Slashdot readers: Whenever I meet geeks, there's one question that always gets a reaction: Do you remember your first home computer? This usually provokes a flood of fond memories about primitive specs -- limited RAM, bad graphics, and early versions of long-since-abandoned operating systems. Now I'd like to pose the same question to Slashdot's readers.

Use the comments to share details about your own first home computer. Was it a back-to-school present from your parents? Did it come with a modem? Did you lovingly upgrade its hardware for years to come? Was it a Commodore 64 or a BeBox?

It seems like there should be some good stories, so leave your best answers in the comments. What was your first home computer?
Classic Games (Games)

Celebrating '21 Things We Miss About Old Computers' (denofgeek.com) 467

"Today, we look back at the classic era of home computing that existed alongside the dreariness of business computing and the heart-pounding noise and colour of the arcades," writes the site Den of Geek. An anonymous reader reports: The article remembers the days of dial-up modems, obscure computer magazines, and the forgotten phenomenon of computer clubs. ("There was a time when if you wanted to ask a question about something computer related, or see something in action, you'd have to venture outside and into another building to go and see it.") Gamers grappled with old school controllers, games distributed on cassette tapes, low-resolution graphics and the "playground piracy" of warez boards -- when they weren't playing the original side-scrolling platformers like Mario Bros and Donkey Kong at video arcades.

In a world where people published fanzines on 16-bit computers, shared demo programs, and even played text adventures, primitive hardware may have inspired future coders, since "Old computers typically presented you with a command prompt as soon as you switched them on, meaning that they were practically begging to be programmed on." Home computers "mesmerised us, educated us, and in many cases, bankrupted us," the article remembers -- until they were replaced by more powerful hardware. "You move on, but you never fully get over your first love," it concludes -- while also adding that "what came next was pretty amazing."

Does this bring back any memories for anybody -- or provoke any wistful nostalgia for a bygone era? Either way, I really liked the way that the article ended. "The most exciting chapter of all, my geeky friends? The future!"
Hardware

Nvidia Titan Xp Introduced as 'the World's Most Powerful Graphics Card' (pcgamer.com) 69

Nvidia has unveiled its new Titan, the Xp. It features 3,840 CUDA cores running at 1.6GHz, and 12GB of GDDR5X memory. The card runs on Nvidia's Pascal architecture and comes with a suitably titanic price tag of $1,200. From a report: "They made 1080 Ti so fast that they need a new top-tier Titan," says PC Gamer hardware expert Jarred Walton. "It's the full GP102 chip, so just like we had GTX 780, the Titan, the 780 Ti and the Titan Black, we're getting the 1080, Titan X (Pascal), 1080 Ti, and Titan Xp."
Businesses

Apple To Develop Its Own GPU, UK Chip Designer Imagination Reveals In 'Bombshell' PR (anandtech.com) 148

From a report on AnandTech: In a bombshell of a press release issued this morning, Imagination has announced that Apple has informed their long-time GPU partner that they will be winding down their use of Imagination's IP. Specifically, Apple expects that they will no longer be using Imagination's IP in 15 to 24 months. Furthermore, the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design." In other words, Apple is developing their own GPU, and when that is ready, they will be dropping Imagination's GPU designs entirely. This alone would be big news; however, the story doesn't stop there. As Apple's long-time GPU partner and the provider of the basis for all of Apple's SoCs going back to the very first iPhone, Imagination is also making a case to investors (and the public) that while Apple may be dropping Imagination's GPU designs for a custom design, Apple can't develop a new GPU in isolation -- that any GPU developed by the company would still infringe on some of Imagination's IP. As a result, the company is continuing to sit down with Apple and discuss alternative licensing arrangements, with the intent of defending its IP rights.
Emulation (Games)

Ask Slashdot: Can Linux Run a GPU-Computing Application Written For Windows? 117

dryriver writes: I have been told that Linux can run Windows software using Wine or perhaps a VM. What happens if that Windows software is a GPU-computing application -- accessing the GPU through HLSL/GLSL/CUDA/OpenCL or similar interfaces? Can Wine or other solutions run that software at a decent speed under Linux? Or is GPU-computing software written for the Windows platform unsuitable for use -- emulated or otherwise -- under Linux? This sounds like one of those cases where there's a theoretical answer and then your own real-world experiences. So leave your best answers in the comments. Can Linux run a GPU-computing application that's written for Windows?
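One practical point behind the question: Wine doesn't emulate the GPU. A compute call made by a Windows binary (via OpenCL, for example) is forwarded through a thin wrapper to the host's native Linux driver stack, so the prerequisite is that the native stack works at all. A minimal, hypothetical sketch for checking that the native OpenCL ICD loader is present -- the function name and graceful-failure behavior are my own, not from the submission:

```python
import ctypes

def opencl_platform_count():
    """Count native OpenCL platforms; return None if no ICD loader is installed.

    A Windows GPU-compute app running under Wine ultimately depends on this
    same native library being present and functional."""
    try:
        cl = ctypes.CDLL("libOpenCL.so.1")  # the Linux OpenCL ICD loader
    except OSError:
        return None  # no OpenCL runtime on this machine
    n = ctypes.c_uint(0)
    # clGetPlatformIDs(num_entries, platforms, num_platforms)
    if cl.clGetPlatformIDs(0, None, ctypes.byref(n)) != 0:
        return None
    return n.value

print(opencl_platform_count())  # a platform count on a machine with GPU drivers, None without
```

If this reports no platforms, no amount of Wine configuration will make an OpenCL application compute on the GPU.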
Data Storage

Next-Generation DDR5 RAM Will Double the Speed of DDR4 In 2018 (arstechnica.com) 77

An anonymous reader quotes a report from Ars Technica: You may have just upgraded your computer to use DDR4 recently or you may still be using DDR3, but in either case, nothing stays new forever. JEDEC, the organization in charge of defining new standards for computer memory, says that it will be demoing the next-generation DDR5 standard in June of this year and finalizing the standard sometime in 2018. DDR5 promises double the memory bandwidth and density of DDR4, and JEDEC says it will also be more power-efficient, though the organization didn't release any specific numbers or targets. Like DDR4 back when it was announced, it will still be several years before any of us have DDR5 RAM in our systems. That's partly because the memory controllers in processors and SoCs need to be updated to support DDR5, and these chips normally take two or three years to design from start to finish. DDR4 RAM was finalized in 2012, but it didn't begin to go mainstream until 2015 when consumer processors from Intel and others added support for it. DDR5 has no relation to GDDR5, a separate decade-old memory standard used for graphics cards and game consoles.
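The "double the bandwidth" claim is straightforward to put in numbers: per-channel peak bandwidth is the transfer rate (MT/s) times the bus width in bytes. The DDR5 rate below is purely illustrative -- as the article notes, JEDEC hadn't released specific targets at the time:

```python
def peak_bandwidth_gbs(mega_transfers_per_sec, bus_bits=64):
    # transfers/s * bytes per transfer, expressed in GB/s
    return mega_transfers_per_sec * 1e6 * (bus_bits // 8) / 1e9

print(peak_bandwidth_gbs(3200))  # DDR4-3200: 25.6 GB/s per 64-bit channel
print(peak_bandwidth_gbs(6400))  # a hypothetical doubled DDR5 rate: 51.2 GB/s
```

The same formula explains why GDDR5 on graphics cards reaches hundreds of GB/s despite the shared "DDR5" letters: much higher per-pin rates over much wider buses.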
AMD

AMD Ryzen Game Patch Optimizations Show Significant Gains On Zen Architecture (hothardware.com) 121

MojoKid writes: AMD got the attention of PC performance enthusiasts everywhere with the recent launch of its Ryzen 7 series processors. The trio of 8-core chips competitively takes on Intel's Core i7 series at the high end of its product stack. However, with the extra attention AMD garnered came significant scrutiny as well. With any entirely new platform architecture, there are bound to be a few performance anomalies -- as was the case with the now-infamous lower-performance "1080p gaming" situation with Ryzen. In a recent status update, AMD noted it was already working with developers to implement "simple changes" that improve a game engine's understanding of the AMD Zen core topology and would likely provide an additional performance uplift with Ryzen. Today, we have some early proof-positive of that, as Oxide Games, in concert with AMD, released a patch for its game title Ashes Of The Singularity. Ashes has been a "poster child" game engine of sorts for AMD Radeon graphics over the years (especially with respect to DX12), and it was one that ironically showed some of the worst variations in Ryzen CPU performance versus Intel. With this new patch that is now public for the game, however, AMD claims to have regained significant ground in benchmark results at all resolutions. In the 1080p benchmarks with powerful GPUs, a Ryzen 7 1800X shows an approximate 20% performance improvement with the latest version of Ashes, closing the gap significantly versus Intel. This appears to be at least an early sign that AMD can indeed work with game and other app developers to tune for the Ryzen architecture and wring out additional performance.
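The "core topology" issue referenced above stems from Zen's physical layout: a Ryzen 7 package is two four-core CCX (core complex) clusters, and threads that migrate or share data across the CCX boundary pay extra latency over the interconnect compared to cores sharing a CCX's L3 cache. A simplified, hypothetical model of that topology (SMT siblings are ignored, and the helper names are mine, not AMD's):

```python
CORES_PER_CCX = 4  # Ryzen 7: two CCXes of four physical cores each

def ccx_of(core: int) -> int:
    """Which core complex a physical core (0-7) belongs to."""
    return core // CORES_PER_CCX

def cross_ccx(a: int, b: int) -> bool:
    """True if two cores would communicate over the slower inter-CCX fabric."""
    return ccx_of(a) != ccx_of(b)

# A topology-aware engine keeps tightly coupled worker threads within one CCX:
print(cross_ccx(0, 3))  # False: same CCX, fast shared L3
print(cross_ccx(3, 4))  # True: crosses the CCX boundary
```

The "simple changes" AMD describes amount to making the engine's thread placement respect this grouping rather than treating all eight cores as interchangeable.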
IBM

A 21st-Century Version Of OS/2 Warp May Be Released Soon (arcanoae.com) 232

dryriver writes: A company named Arca Noae is working on a new release of the x86 OS/2 operating system, code-named "Blue Lion" and likely called ArcaOS 5 in its final release. Blue Lion aims to be a modern, 21st-century OS/2 Warp, with support for the latest hardware and networking standards, a modern accelerated graphics driver, support for new cryptographic security standards, full backward compatibility with legacy OS/2, DOS and Windows 3.1 applications, suitability for use in mission-critical applications, and also, it appears, the ability to run "ported Linux applications." Blue Lion, which appears to be in closed beta with March 31, 2017 cited as the target release date, will come with an up-to-date Firefox browser and Thunderbird mail client, Apache OpenOffice, other productivity tools, a new package manager, and a software update and support subscription to ensure system stability. It is unclear from the information provided whether Blue Lion will be able to run modern Windows applications.
Government

After Healthcare Defeat, Can The Trump Administration Fix America's H-1B Visa Program? (bloomberg.com) 566

Friday the Trump administration suffered a political setback when divisions in the president's party halted a move to repeal healthcare policies passed in 2010. But if Trump hopes to turn his attention to how America's H-1B visa program is affecting technology workers, "time is running out," writes Slashdot reader pteddy. Bloomberg reports: [T]he application deadline for the most controversial visa program is the first week of April, which means new rules have to be in place for that batch of applicants or another year's worth of visas will be handed out under the existing guidelines... There probably isn't enough time to pass legislation on such a contentious issue. But Trump could sign an executive order with some changes. The article points out that under the current system, one outsourcing firm was granted 6.5 times as many U.S. visas as Amazon. There's also an interesting map showing which countries' workers received the most H-1B visas in 2015 -- 69.4% went to workers in India, with another 10.5% going to China -- and a chart showing which positions are most in demand, indicating that two-thirds of the visa applications are for tech workers.
Patents

Apple Explores Using An iPhone, iPad To Power a Laptop (appleinsider.com) 76

According to the U.S. Patent and Trademark Office, Apple has filed a patent for an "Electronic accessory device." It describes a "thin" accessory that contains traditional laptop hardware like a large display, physical keyboard, GPU, ports and more -- all of which is powered by an iPhone or iPad. The device powering the hardware would fit into a slot built into the accessory. AppleInsider reports: While the accessory can take many forms, the document for the most part remains limited in scope to housings that mimic laptop form factors. In some embodiments, for example, the accessory includes a port shaped to accommodate a host iPhone or iPad. Located in the base portion, this slot might also incorporate a communications interface and a means of power transfer, perhaps Lightning or a Smart Connector. Alternatively, a host device might transfer data and commands to the accessory via Wi-Fi, Bluetooth or other wireless protocol. Onboard memory modules would further extend an iOS device's capabilities. Though the document fails to delve into details, accessory memory would presumably allow an iPhone or iPad to write and read app data. In other cases, a secondary operating system or firmware might be installed to imitate a laptop environment or store laptop-ready versions of iOS apps. In addition to crunching numbers, a host device might also double as a touch input. For example, an iPhone positioned below the accessory's keyboard can serve as the unit's multitouch touchpad, complete with Force Touch input and haptic feedback. Coincidentally, the surface area of a 5.5-inch iPhone 7 Plus is very similar to that of the enlarged trackpad on Apple's new MacBook Pro models. Some embodiments also allow for the accessory to carry an internal GPU, helping a host device power the larger display or facilitate graphics rendering not possible on iPhone or iPad alone. 
Since the accessory is technically powered by iOS, its built-in display is touch-capable, an oft-requested feature for Mac. Alternatively, certain embodiments have an iPad serving as the accessory's screen, with keyboard, memory, GPU and other operating guts located in the attached base portion. This latter design resembles a beefed up version of Apple's Smart Case for iPad.
Government

US Federal Budget Proposal Cuts Science Funding (washingtonpost.com) 649

hey! writes: The U.S. Office of Management and Budget has released a budget "blueprint" which outlines substantial cuts in both basic research and applied technology funding. The proposal includes a whopping 18% reduction in National Institutes of Health medical research. NIH does get a new $500 million fund to track emerging infectious agents like Zika in the U.S., but loses its funding to monitor those agents overseas. The Department of Energy's research programs also get an 18% cut in research, potentially affecting basic physics research, high energy physics, fusion research, and supercomputing. Advanced Research Projects Agency (ARPA-E) gets the ax, as does the Advanced Technology Vehicle Manufacturing Program, which enabled Tesla to manufacture its Model S sedan. EPA loses all climate research funding, and about half the research funding targeted at human health impacts of pollution. The Energy Star program is eliminated; Superfund funding is drastically reduced. The Chesapeake Bay and Great Lakes cleanup programs are also eliminated, as is all screening of pesticides for endocrine disruption. In the Department of Commerce, Sea Grant is eliminated, along with all coastal zone research funding. Existing weather satellites GOES and JPSS continue funding, but JPSS-3 and -4 appear to be getting the ax. Support for transfer of federally funded research and technology to small and mid-sized manufacturers is eliminated. NASA gets a slight trim, and a new focus on deep space exploration paid for by an elimination of Earth Science programs. You can read more about this "blueprint" in Nature, Science, and the Washington Post, which broke the story. The Environmental Protection Agency, the State Department and Agriculture Department took the hardest hits, while the Defense Department, Department of Homeland Security, and Department of Veterans Affairs have seen their budgets grow.
Operating Systems

NetBSD 7.1 Released (netbsd.org) 45

New submitter fisted writes: The NetBSD Project is pleased to announce NetBSD 7.1, the first feature update of the NetBSD 7 release branch. It represents a selected subset of fixes deemed important for security or stability reasons, as well as new features and enhancements. Some highlights of the 7.1 release are:

-Support for Raspberry Pi Zero.
-Initial DRM/KMS support for NVIDIA graphics cards via nouveau (Disabled by default. Uncomment nouveau and nouveaufb in your kernel config to test).
-The addition of vioscsi, a driver for the Google Compute Engine disk.
-Linux compatibility improvements, allowing, e.g., the use of Adobe Flash Player 24.
-wm(4): C2000 KX and 2.5G support; Wake On Lan support; 82575 and newer SERDES based systems now work.
-ODROID-C1 Ethernet now works.
-Numerous bug fixes and stability improvements.

NetBSD is free. All of the code is under non-restrictive licenses, and may be used without paying royalties to anyone. Free support services are available via our mailing lists and website. Commercial support is available from a variety of sources. More extensive information on NetBSD is available from http://www.NetBSD.org.
You can download NetBSD 7.1 from one of these mirror sites.
Firefox

Will WebAssembly Replace JavaScript? (medium.com) 235

On Tuesday Firefox 52 became the first browser to support WebAssembly, a new standard "to enable near-native performance for web applications" without a plug-in by pre-compiling code into low-level, machine-ready instructions. Mozilla engineer Lin Clark sees this as an inflection point where the speed of browser-based applications increases dramatically. An anonymous reader quotes David Bryant, the head of platform engineering at Mozilla: This new standard will enable amazing video games and high-performance web apps for things like computer-aided design, video and image editing, and scientific visualization... Over time, many existing productivity apps (e.g. email, social networks, word processing) and JavaScript frameworks will likely use WebAssembly to significantly reduce load times while simultaneously improving performance while running... developers can integrate WebAssembly libraries for CPU-intensive calculations (e.g. compression, face detection, physics) into existing web apps that use JavaScript for less intensive work... In some ways, WebAssembly changes what it means to be a web developer, as well as the fundamental abilities of the web.
Mozilla celebrated with a demo video of the high-resolution graphics of Zen Garden, and while right now WebAssembly supports compilation from C and C++ (plus some preliminary support for Rust), "We expect that, as WebAssembly continues to evolve, you'll also be able to use it with programming languages often used for mobile apps, like Java, Swift, and C#."
