SoylentNews is people

posted by martyb on Monday September 16, @09:31PM   Printer-friendly
from the How-much-is-that-in-kilograms-per-liter? dept.

West Virginia University researchers have helped discover the most massive neutron star to date, a breakthrough uncovered through the Green Bank Telescope in Pocahontas County.

The neutron star, called J0740+6620, is a rapidly spinning pulsar that packs 2.17 times the mass of the sun (which is 333,000 times the mass of the Earth) into a sphere only 20-30 kilometers, or about 15 miles, across. This measurement approaches the limits of how massive and compact a single object can become without crushing itself down into a black hole.

The star was detected approximately 4,600 light-years from Earth. One light-year is about six trillion miles.
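The quoted figures are easy to sanity-check. A minimal sketch, using standard reference values for the solar and Earth masses and taking a 25 km diameter as the midpoint of the quoted 20-30 km range:

```python
import math

# Standard reference masses in kg; the 2.17 solar-mass figure is from the article.
SOLAR_MASS_KG = 1.989e30
EARTH_MASS_KG = 5.972e24
mass_kg = 2.17 * SOLAR_MASS_KG

# Sun-to-Earth mass ratio quoted in the article (~333,000):
ratio = SOLAR_MASS_KG / EARTH_MASS_KG

# Mean density, assuming a 25 km diameter sphere (midpoint of 20-30 km):
radius_m = 25_000 / 2
volume_m3 = (4 / 3) * math.pi * radius_m ** 3
density = mass_kg / volume_m3  # about 5e17 kg/m^3

print(f"sun/earth mass ratio: {ratio:,.0f}")
print(f"mean density: {density:.2e} kg/m^3")
```

A mean density around 5 x 10^17 kg/m^3 is roughly 10^14 times that of water, which is what "approaching the limits before collapsing into a black hole" looks like numerically.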

Original Submission

posted by martyb on Monday September 16, @08:05PM   Printer-friendly
from the today's-borrowers-are-tomorrow's-buyers dept.

In July, Macmillan CEO John Sargent outlined the changes in response to "growing fears that library lending was cannibalizing sales." On September 11, the American Library Association (ALA) started circulating a petition in hopes of pressuring Macmillan to not go through with its plan, which is scheduled to go into effect in November. "To treat libraries as an inferior consumer to the general population, it's the wrong thing to do," said Alan Inouye, director of the Office for Information Technology Policy at the ALA. "Libraries are generally held as amongst the highest esteemed institutions in the community."

"Allowing a library like the Los Angeles Public Library (which serves 18 million people) the same number of initial e-book copies as a rural Vermont library serving 1,200 people smacks of punishment, not support," librarian Jessamyn West wrote on CNN. She also points out that Sargent's claim that apps let people check out books in states and countries where they don't live "betrays a fundamental misunderstanding of how public libraries work." There are a few that let you pay for a library card regardless of where you live, but not many. Digital Trends reached out to Macmillan for comment but did not receive a response.

Original Submission

posted by martyb on Monday September 16, @06:36PM   Printer-friendly
from the dept.

New measurements of the proton radius have shown that the 2010 measurements, which suggested an unknown interaction between protons and muons was causing a puzzling reduction in the proton's radius, were in error.

Almost ten years ago, German physicists reported a reduction in the radius of protons when electrons were replaced with muons in hydrogen atoms. This led to hopes for new physics (the best kind), but these hopes have now been dashed.

If the discrepancy was real, meaning protons really shrink in the presence of muons, this would imply unknown physical interactions between protons and muons—a fundamental discovery. Hundreds of papers speculating about the possibility have been written in the years since.

But hopes that the "proton radius puzzle" would upend particle physics and reveal new laws of nature have now been dashed by a new measurement reported on September 6 in Science.

The new measurements by Distinguished Research Professor Eric Hessels of York University in Toronto and his team "suggest that the proton does not change size depending on context; rather, the old measurements using electronic hydrogen were wrong."

According to Hessels, the result, which "points to the most mundane explanation" is bittersweet.

Original Submission

posted by martyb on Monday September 16, @05:00PM   Printer-friendly
from the things-expand-to-exceed-the-space-provided dept.

A couple years ago, I took a road trip from Wisconsin to Washington and mostly stayed in rural hotels on the way. I expected the internet in rural areas too sparse to have cable internet to be slow, but I was still surprised that a large fraction of the web was inaccessible. Some blogs with lightweight styling were readable, as were pages by academics who hadn't updated the styling on their website since 1995. But very few commercial websites were usable (other than Google). When I measured my connection, I found that the bandwidth was roughly comparable to what I got with a 56k modem in the 90s. The latency and packet loss were significantly worse than the average day on dialup: latency varied between 500ms and 1000ms and packet loss varied between 1% and 10%. Those numbers are comparable to what I'd see on dialup on a bad day.

Despite my connection being only a bit worse than it was in the 90s, the vast majority of the web wouldn't load. Why shouldn't the web work with dialup or a dialup-like connection? It would be one thing if I tried to watch youtube and read pinterest. It's hard to serve videos and images without bandwidth. But my online interests are quite boring from a media standpoint. Pretty much everything I consume online is plain text, even if it happens to be styled with images and fancy javascript. In fact, I recently tried using w3m (a terminal-based web browser that, by default, doesn't support css, javascript, or even images) for a week and it turns out there are only two websites I regularly visit that don't really work in w3m (twitter and zulip, both fundamentally text based sites, at least as I use them)[1].

More recently, I was reminded of how poorly the web works for people on slow connections when I tried to read a joelonsoftware post while using a flaky mobile connection. The HTML loaded but either one of the five CSS requests or one of the thirteen javascript requests timed out, leaving me with a broken page. Instead of seeing the article, I saw three entire pages of sidebar, menu, and ads before getting to the title because the page required some kind of layout modification to display reasonably. Pages are often designed so that they're hard or impossible to read if some dependency fails to load. On a slow connection, it's quite common for at least one dependency to fail. After refreshing the page twice, the page loaded as it was supposed to and I was able to read the blog post, a fairly compelling post on eliminating dependencies.
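The failure mode described here compounds quickly. A small illustrative calculation: the 5% per-request failure rate below is a made-up number in the range the author reports, and the model assumes independent failures, which understates how failures cluster on a bad link.

```python
# A page that needs all n subresource requests to succeed breaks if any
# one of them fails. With an independent per-request failure probability p:
def page_break_probability(n_requests: int, p_fail: float) -> float:
    return 1 - (1 - p_fail) ** n_requests

# 5 CSS + 13 JavaScript requests, each with a 5% chance of timing out:
print(f"{page_break_probability(18, 0.05):.0%}")  # about 60%
```

At 18 required requests, even modest per-request flakiness makes a broken page the common case, which matches the refresh-twice-and-hope experience described above.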

[1] excluding internal Microsoft stuff that's required for work. Many of the sites are IE only and don't even work in edge. I didn't try those sites in w3m but I doubt they'd work! In fact, I doubt that even half of the non-IE specific internal sites would work in w3m.

Original Submission

posted by martyb on Monday September 16, @03:25PM   Printer-friendly
from the Pixels-to-the-people! dept.

On Friday, Beamr Imaging claimed to have achieved the world's first real-time 8K HEVC encoding by using a single EPYC 7742, AMD's flagship server CPU based on its new Rome architecture.

A single 64-core EPYC 7742, which features the 7nm process and the Zen 2 microarchitecture (the same type of cores found in Ryzen 3000), encoded 8K footage in real time at 79 frames per second with 10-bit color required for HDR.

It's a significant achievement for both hardware and software; the Epyc 7742 is the world's first 64-core x86 CPU to come in a standard general-purpose socket, and the Beamr encoding software is designed to use all 64 of those cores. Parallelization is a significant concern for CPUs with increasingly larger core counts, from consumer to server applications, so it's nice to see the 7742 used to the fullest.
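For scale, here is the raw pixel throughput implied by those numbers, assuming 8K means the standard 7680 x 4320 UHD resolution:

```python
# 8K UHD at 79 fps with 10-bit color across three channels:
width, height, fps = 7680, 4320, 79
pixels_per_second = width * height * fps          # ~2.62 gigapixels/s
raw_bits_per_second = pixels_per_second * 3 * 10  # 10 bits x 3 channels

print(f"{pixels_per_second / 1e9:.2f} Gpx/s")
print(f"{raw_bits_per_second / 1e9:.1f} Gbit/s uncompressed")
```

Nearly 79 Gbit/s of uncompressed input is what the encoder has to chew through in real time before HEVC compression brings the output down to a deliverable bitrate.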

Original Submission

posted by martyb on Monday September 16, @01:48PM   Printer-friendly
from the COBOL-is-often-fractionally-better dept.

Face it: nobody likes fractions, not even computers.

When we talk about COBOL, the first question on everyone's mind is always: why are we still using it in so many critical places? Banks are still running COBOL; close to 7% of the GDP is dependent on COBOL in the form of payments from the Centers for Medicare & Medicaid Services; the IRS famously still uses COBOL; airlines still use COBOL (Adam Fletcher dropped my favorite fun fact on this topic in his Systems We Love talk: the reservation number on your ticket used to be just a pointer); and lots of critical infrastructure in both the private and public sector still runs on COBOL.

The traditional answer is deeply cynical. Organizations are lazy, incompetent, stupid. They are cheap: unwilling to invest the money needed upfront to rewrite the whole system in something modern. Overall we assume that the reason so much of civil society runs on COBOL is a combination of inertia and shortsightedness. And certainly there is a little truth there. Rewriting a mass of spaghetti code is no small task. It is expensive. It is difficult. And if the existing software seems to be working fine there might be little incentive to invest in the project.

But back when I was working with the IRS the old COBOL developers used to tell me: "We tried to rewrite the code in Java and Java couldn't do the calculations right."

[Ed note: The referenced article is extremely readable and clearly explains the differences between floating-point and fixed-point math, as well as providing an example and explanation that clearly shows the tradeoffs.]
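The floating-point vs. fixed-point tradeoff the editor mentions can be shown in a few lines. This sketch uses Python's `decimal` module as a stand-in for COBOL-style decimal arithmetic; Java offers `BigDecimal` for the same purpose, but its primitive `float`/`double` types behave like the first case below.

```python
from decimal import Decimal

# Binary floating point cannot represent most decimal fractions exactly,
# so naive money math drifts:
print(0.1 + 0.2)          # 0.30000000000000004
print(0.1 + 0.2 == 0.3)   # False

# Decimal arithmetic keeps the exact values, which is what COBOL's
# PIC-clause decimal fields give you by default:
print(Decimal("0.1") + Decimal("0.2"))                    # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

A rewrite that reaches for the language's default floating-point types will reproduce the first behavior, which is presumably what the IRS developers meant by "Java couldn't do the calculations right."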

Original Submission

posted by janrinok on Monday September 16, @12:13PM   Printer-friendly
from the face-facts,-face-hacks dept.

Apple is planning to release iOS 13 next week, but one security researcher has already discovered a lockscreen bypass. The exploit allows you to bypass the lockscreen and gain access to all contact information on an iPhone. Jose Rodriguez discovered the exploit and revealed to The Verge that he reported it to Apple on July 17th, but it's still working in the Gold Master (GM) version of iOS 13 that will be released on September 19th.

Rodriguez discovered a lockscreen exploit last year for iOS 12.1, and this latest iOS 13 bypass uses a similar technique. It involves activating a FaceTime call and then accessing the voiceover feature from Siri to enable access to the contact list. You can then obtain email addresses, phone numbers, address information, and more from the list of contacts.

[...] Rodriguez says the exploit appears to be fixed in beta copies of iOS 13.1, which Apple is planning to make available on September 30th.

Source: iOS 13 exploit bypasses the lockscreen for access to contacts

Original Submission

posted by janrinok on Monday September 16, @10:42AM   Printer-friendly
from the fold-and-go dept.

Submitted via IRC for SoyCow2718

Dual-screen and foldable Windows devices are on the way.

Microsoft has been working on a dual-screen Surface device that may well resemble the company's Courier concept. The software giant is expected to tease the device at its Surface hardware event next month, but new patents show that Microsoft's work goes far beyond just dual-screen hardware. A new patent, spotted by WindowsUnited, has surfaced that reveals Microsoft has been working on a special hinge that uses liquid to reduce the stress on flexible and foldable displays.

The liquid can be filled inside cavities around the flexible display to help it bend and move into different positions. Microsoft's example shows a device with two separate sides and a flexible OLED display that extends across the entire device. Microsoft has long been focused on complex and impressive hinge work with its Surface devices, and this particular hinge is described in a lot of detail in the patent filing.

[...] The most interesting part of this patent filing is that it's been filed by "Microsoft technology licensing," which could indicate the company will license this technology to other partners and PC makers. Microsoft is currently working closely with Intel and a number of OEMs to ready a new version of Windows, codenamed "Windows Lite," that will run on foldable and dual-screen devices. This new Windows version will be targeted toward dual-screen and foldable hardware. Microsoft has also previously licensed some of its Surface Pro hinge work to partners.

Original Submission

posted by janrinok on Monday September 16, @09:06AM   Printer-friendly
from the use-whatever-you-want dept.

Arthur T Knackerbracket has found the following story:

Python sits firmly in top place in the newest annual ranking of popular programming languages by IEEE Spectrum.   

The ranking and others like it are meant to help developers understand the popularity of languages in a world where no one really knows what programmers are using on their laptops. 

IEEE Spectrum has placed Python in first spot since 2017, and last year it was just ahead of C++. The top language is given a score of 100, and all languages with lower scores are scaled in relation to it. C++ last year scored 99.7, followed by Java at 97.5, and C with 96.7.

Today, in the IEEE Spectrum's sixth annual ranking, Python's 100 is a long way ahead of runner-up Java's 96.3 score, while C is in third place with 94.4. C++ has slipped to fourth with 87.5, while in fifth is specialist statistical computing language R with a score of 81.5. 
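The scaling rule described above, with the top language pinned at 100 and everything else expressed relative to it, is simple to state in code. The raw scores below are hypothetical; only the normalization step comes from the article.

```python
def scale_scores(raw: dict[str, float]) -> dict[str, float]:
    # Pin the highest raw score at 100 and scale the rest proportionally.
    top = max(raw.values())
    return {lang: round(100 * score / top, 1) for lang, score in raw.items()}

# Hypothetical raw popularity scores, for illustration only:
print(scale_scores({"Python": 61.0, "Java": 58.7, "C": 57.6}))
```

One consequence of this scheme is that year-over-year scores are not directly comparable: a language's score can fall simply because the leader pulled further ahead.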

The magazine for engineering members of IEEE, the world's biggest engineering and applied-science organization, attributes Python's popularity to the vast number of specialized libraries it has, especially for developers building artificial-intelligence applications. 

[...] They go on to note that Facebook, which was originally built with PHP, launched its alternative to PHP, Hack, in 2014 and since then JavaScript, TypeScript and Python have become the most popular languages for web development. 

Original Submission

posted by janrinok on Monday September 16, @07:41AM   Printer-friendly
from the putting-a-brave-face-on-it dept.

Time is running out for engineers from the Indian space program to establish contact with the troubled Chandrayaan-2 lunar lander.

[...] Communications were lost with the lander during this time and have not been re-established since.

Officials at the Indian Space Research Organization (ISRO) were able to locate the lander on the moon last week, and they have been trying to re-establish communications with the hope of saving either the Vikram lander or the Pragyan rover inside it. There is a time window they need to consider, however. The lander and rover were both designed to last for one lunar day, which is the equivalent of 14 days here on Earth. With a week having passed since the landing date, the ISRO engineers have only one more week to get in contact with the lander before it runs out of power.

[...] ISRO officials have not yet released the Chandrayaan-2 image of Vikram on the lunar surface or described the potential condition of the lander. But they have said that despite the lander's presumed failed moon landing, the craft has already demonstrated key technologies for future missions.

"The Vikram Lander followed the planned descent trajectory from its orbit of 35 km (22 miles) to just below 2 km above the surface," ISRO officials wrote in an update Saturday (Sept. 7). "All the systems and sensors of the Lander functioned excellently until this point and proved many new technologies such as variable thrust propulsion technology used in the Lander."

Original Submission

posted by janrinok on Monday September 16, @06:04AM   Printer-friendly
from the don't-they-have-those-songs-anymore? dept.

The National Music Publishers' Association (NMPA) has asked the courts to allow it to double its claims against exercise bike and treadmill startup Peloton, after discovering more unlicensed music — including songs by Taylor Swift and Kesha — being used in workout videos that play on the bikes' built-in screens. It's now seeking $300 million in damages, as reported by Forbes.

The original complaint, filed in March, accused Peloton of using over 1,000 songs without getting the proper license. At the time, NMPA requested $150 million in damages. After the lawsuit was filed, the offending songs conveniently disappeared, upsetting connected exercise equipment owners who'd gotten used to their beloved playlists.

But not all the songs disappeared, as reported by The Verge. Here's what the NMPA now has to say about that:

Indeed, it is only as a result of initial discovery in this lawsuit that the full scope and extent of Peloton's unlawful infringement has started to come into focus, revealing more than 1,000 additional musical works [...] those newly discovered works include some of the most famous and popular songs ever recorded, such as "Georgia On My Mind," "I Can See For Miles" and "I Saw Her Standing There."

Original Submission

posted by janrinok on Monday September 16, @04:41AM   Printer-friendly
from the leading-to-better-things dept.

Submitted via IRC for SoyCow3997

Research Shows Open Source Program Offices Improve Software Practices - The New Stack

Using open source software is commonplace, with only a minority of companies preferring a proprietary-first software policy. Proponents of free and open source software (FOSS) have moved to the next phases of open source adoption, widening FOSS usage within the enterprise as well as gaining the "digital transformation" benefits associated with open source and cloud native best practices.

Companies, as well as FOSS advocates, are determining the best ways to promote these business goals, while at the same time keeping alive the spirit and ethos of the non-commercial communities that have embodied the open source movement for years.

As we learned last year in our first survey on this topic, purpose-built program offices and less formal initiatives play an important role in helping companies create policies that will increase adoption of open source components into commercial software products, promote open source culture, and guide corporate citizenship within open source communities.

In partnership with The Linux Foundation's TODO Group and co-sponsored by VMware, The New Stack conducted its second annual "Open Source Programs in the Enterprise" survey to investigate how and if these open source programs are succeeding. We found an emerging consensus about how best to manage and promote open source initiatives.

Over 2,700 people participated in the survey. Company size was broadly represented, with 21% of respondents working at large companies with more than 10,000 employees, and 39% from small and mid-size companies with 250 or fewer employees. Developers and software engineers represented 43% of respondents, with at least another 36% holding an IT-related role.

This article starts with the study's key findings, proceeds to charts and analysis and concludes with a discussion of the study's methodology.

Original Submission

posted by janrinok on Monday September 16, @03:07AM   Printer-friendly
from the better-late-than-never dept.

Submitted via IRC for SoyCow3997

The Eclipse Foundation has announced the release of Jakarta EE 8, the first truly open-source, vendor-neutral version of Java Enterprise Edition.

Java being fully open-sourced has been a long, long time coming. While Sun open-sourced some of Java as long ago as November 2006, actually using Java in an open-source way was... troublesome. Just ask Google about Android and Java. But for Java in the enterprise things have changed.

On September 10, The Eclipse Foundation announced the full open-source release of the Jakarta EE 8 Full Platform and Web Profile specifications and related Technology Compatibility Kits (TCKs).

This comes after Oracle let go of most of Java Enterprise Edition's (JEE) intellectual property. Oracle retains Java's trademarks, though -- thus the Java EE naming convention has been changed to Jakarta EE. But for practical programming and production purposes, Jakarta EE 8 is the next generation of enterprise Java.

Original Submission

posted by janrinok on Monday September 16, @01:35AM   Printer-friendly
from the wired dept.

Underneath London's bustling streets lie several kilometers of 2.5-meter-wide concrete tunnels lined with power distribution cables that can reach blistering temperatures. To cool the tunnels, vertical shafts spaced out every kilometer or two supply fresh air and eject hot air out into the open.

Researchers at London South Bank University (LSBU) want to put that waste heat to use. A typical 1.8-km tunnel stretch between ventilation shafts produces 400 kilowatts of heat, enough to heat 100 homes or a small commercial office, they have found in a preliminary analysis done with the city's electricity network operator UK Power Networks.
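Those headline numbers are internally consistent, as a quick check shows. The inference that each home draws about 4 kW is an assumption read off the article's figures, not a number the researchers state directly:

```python
# 400 kW recovered from a 1.8 km stretch, claimed to heat 100 homes:
stretch_kw, stretch_km, homes = 400, 1.8, 100

kw_per_home = stretch_kw / homes      # 4.0 kW per home
kw_per_km = stretch_kw / stretch_km   # ~222 kW per km of tunnel

print(f"{kw_per_home} kW/home, {kw_per_km:.0f} kW/km")
```

Four kilowatts is a plausible average heat demand for a UK household, which suggests the "100 homes" figure is a straightforward division rather than a detailed demand model.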

This heat recovery scheme would have a third of the carbon emissions of a gas boiler delivering the same amount of heat. The researchers presented this work at the International Congress of Refrigeration in August.

"It's very difficult to identify exactly how many cable tunnels there are in the world," says Graeme Maidment, an LSBU professor of air conditioning and refrigeration engineering who led the work. But cable tunnels lie below most major cities. Several tunnels are now under construction in London, and the UK likely has tunnels spanning several hundred kilometers, he says.

[...] But capturing heat from underground cable tunnels provides an extra-strong business case, Maidment says. Not only could the power distribution company sell the waste heat, but "cooling the cables reduces their resistance so you can put more juice through them."

[...] "We now need to take this from a desk study to some practical demonstrations," Maidment says. "There are lots of high voltage cables so there is safety and risk that needs to be considered." But, he points out, "what we're proposing doesn't necessarily need a large cable tunnel. It could be any power cable running in the ground and you could put a pipe next to it to absorb heat."

Original Submission

posted by Fnord666 on Monday September 16, @12:09AM   Printer-friendly
from the nutty-idea dept.

For kids who've already developed a peanut allergy, though, a similar but more controversial treatment is up for approval by the Food and Drug Administration. The agency is holding a hearing today with the pharmaceutical company and its advocates, expected to inform a final ruling in coming months.

There is currently no "treatment" for a peanut allergy. As it is, patients are told to avoid peanuts. They are prescribed a syringe full of epinephrine (trade name: EpiPen) and taught to inject themselves if needed. Despite much advancement in medical science and technology over the decades, nothing has given families peace of mind that the allergy itself could be treated, or at least tempered. Until now. The new approach involves trying to reprogram the immune system by giving a person ... peanuts.

This might sound dangerous, because it is.

Original Submission