A few Soylentils wrote in to tell us about a fatal accident between a pedestrian and an autonomous Uber vehicle.
The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.
The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.
The link shows video of the seconds just before the accident.
The pedestrian did not step out in front of the vehicle; she was essentially out in the middle of the road, and her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first and then move up her body (meaning she was already in the lane before the headlights could reach her); she is already in the middle of the road in front of the car when she comes into view.
If I were driving that car, I think I'd have had time to hit the brakes (though not to stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.
This, in my opinion, is pretty damning.
The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. "I suspect preliminarily it appears that the Uber would likely not be at fault in this accident," said chief Sylvia Moir.
Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic."
After viewing video captured by the Uber vehicle, Moir concluded that "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available."
The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation's roadways. Don't count on it.
Efforts to streamline regulations to accommodate the emerging technology have been under way since the Obama administration with strong bipartisan support. And the Trump administration's aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.
"Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation," said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who's now director of cars and product policy for Consumers Union.
[Partial] or full autonomy raises the question of who is to blame in the case of an accident involving a self-driving car? In conventional (human-driven) cars, the answer is simple: the driver is responsible because they are in control. When it comes to autonomous vehicles, it isn't so clear cut. We propose a blockchain-based framework that uses sensor data to ascertain liability in accidents involving self-driving cars.
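The proposal above hinges on an append-only, tamper-evident log of sensor data. As a minimal sketch of that core idea (the record fields and class names here are hypothetical illustrations, not taken from the paper), a hash-chained log captures the essential property: altering any past record invalidates every later hash.

```python
import hashlib
import json

def _hash_block(prev_hash: str, record: dict) -> str:
    # Each entry commits to the previous entry's hash, forming a chain.
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class SensorLog:
    """A blockchain in the loosest sense: a hash-chained append-only log."""

    def __init__(self):
        self.chain = []          # list of (record, hash) pairs
        self._tip = "0" * 64     # genesis hash

    def append(self, record: dict) -> None:
        self._tip = _hash_block(self._tip, record)
        self.chain.append((record, self._tip))

    def verify(self) -> bool:
        # Recompute every hash from the genesis value; any mismatch
        # means some record was altered after the fact.
        prev = "0" * 64
        for record, h in self.chain:
            if _hash_block(prev, record) != h:
                return False
            prev = h
        return True

log = SensorLog()
log.append({"t": 0.0, "speed_mph": 38, "brake": False})
log.append({"t": 0.5, "speed_mph": 38, "brake": False, "pedestrian_detected": True})
assert log.verify()

# Retroactively "fixing" an early record breaks verification.
log.chain[0] = ({"t": 0.0, "speed_mph": 25, "brake": True}, log.chain[0][1])
assert not log.verify()
```

A real deployment would also need signatures and distributed replication so no single party (manufacturer, owner, insurer) can rewrite the chain unilaterally, which is where the "blockchain" framing earns its name.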
Seagate has announced a 14 terabyte helium-filled hard drive that uses perpendicular magnetic recording (PMR) rather than shingled magnetic recording (SMR). Toshiba announced a similar drive in December:
Seagate this week formally introduced its first hard drive with 14 TB capacity aimed at cloud datacenters that does not use shingled magnetic recording. The new Exos X14 HDDs are filled with helium and are based on the latest-generation PMR (perpendicular magnetic recording) platters, running at 7200 RPM.
[...] The Exos X14 is Seagate's response to Toshiba's MG07ACA HDD with 14 TB capacity announced last year, although until we receive further information, we cannot do a direct comparison. The major benefit of both drives is their increased capacity that enables datacenter operators to store 3360 TB of data per rack (compared to 2440 TB with 10 TB HDDs), which is a key advantage for companies that need to maximize their storage capacity per square meter and per watt, while meeting other TCO objectives. Another indisputable win of 14 TB hard disks from Seagate and Toshiba (vs. HGST's Ultrastar Hs14) is their conventional magnetic recording technology, which ensures predictable writing performance and permits drop-in compatibility of the HDDs with existing storage applications.
The author guesses it will have nine ~1.55 TB platters, like Toshiba's version. 9th-generation and beyond PMR platters that can store 1.8 TB or more may be seen before the technology is phased out:
[November 2017's] top-of-the-range enterprise-class 3.5" HDDs from Seagate and Western Digital can store up to 12 TB of data. They are based on eight 8th generation PMR platters featuring ~1.5 TB capacities. Toshiba is a little bit behind its rivals with their 10 TB units featuring seven 8th gen platters with 1.43 TB capacity. With the arrival of the 9th gen PMR platters in 2018, hard drive makers will be able to increase the capacities of their eight-platter models to 14 TB, while designs with seven platters can go up to 12 TB.
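The capacity figures above are straightforward products of platter count, per-platter capacity, and drives per rack. A quick sanity check (note the 240-drives-per-rack figure is inferred from the quoted 3360 TB, not stated outright in the article):

```python
# Per-drive capacity as platter_count x per_platter_capacity (TB).
drives = {
    "Seagate/Toshiba 14 TB": 9 * 1.55,   # ~13.95 TB, marketed as 14 TB
    "Seagate/WD 12 TB":      8 * 1.5,    # 12 TB, 8th-gen PMR platters
    "Toshiba 10 TB":         7 * 1.43,   # ~10.0 TB, seven platters
}
for name, tb in drives.items():
    print(f"{name}: ~{tb:.2f} TB")

# Rack capacity: the quoted 3360 TB implies 3360 / 14 = 240 drives per rack.
drives_per_rack = 3360 // 14
print(drives_per_rack * 14)  # 3360 TB with 14 TB drives
```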
Related: Western Digital Announces 12-14 TB Hard Drives and an 8 TB SSD
Western Digital to Use Microwave Assisted Magnetic Recording to Produce 40 TB HDDs by 2025
Western Digital Shipping 14 TB Helium-Filled Shingled Magnetic Recording Hard Drives
Seagate to Stay the Course With HAMR HDDs, Plans 20 TB by 2020, ~50 TB Before 2025
The hop plant Humulus lupulus L. produces a flower with remarkable biochemical properties. When boiled, various compounds are isomerized and produce bitter-tasting compounds that serve to cut what would otherwise be a very sweet drink, making it palatable; hops also have good antimicrobial characteristics that keep the beer from spoiling. Depending upon the beer style, they can also make a significant flavor contribution, such as the pine notes from Northern Brewer and Chinook, the citrus and tropical notes from Citra, or even the chocolate notes from Southern Cross. A thousand different chemical compounds have been identified in hops, but two major ones that drive hop flavor are linalool and geraniol. It is the subtle relative differences in these and other compounds between hop varieties that let one produce beers with a very broad range of flavor profiles.
A team of researchers wanted to insert the genetic material that produces linalool and geraniol into brewing yeast and have those compounds generated during the fermentation process. They inserted the gene sequences known to produce these compounds into a widely used commercial strain of beer yeast (White Labs WLP001). They made beer using these engineered strains as well as an unmodified strain and presented the results to a tasting panel. They found that the engineered strains produced a product with a "hoppier" flavor than the unmodified strain.
Dinosaurs like the Triceratops may have had horns and frills to attract a mate, a new study suggests.
Ceratopsians, or horned dinosaurs, were previously thought to have developed this ornamentation to distinguish between different species.
This has now been ruled out in a study published in a Royal Society journal.
Instead, the aggressive-looking armour may actually have evolved to signal an animal's suitability as a partner, known as socio-sexual selection.
"Individuals are advertising their quality or genetic make-up," explained Andrew Knapp, lead author of the research reported in Proceedings of the Royal Society B.
"We see that in peacocks too, with their tail feathers."
Also at Science Magazine.
Patterns of divergence in the morphology of ceratopsian dinosaurs: sympatry is not a driver of ornament evolution (open, DOI: 10.1098/rspb.2018.0312) (DX)
The Guardian mentions that the Gold Coast council in Australia, where next month's Commonwealth Games are hosted, will use a new city WiFi service to harvest Facebook data from visitors.
The data mining, which the council says is legal and will be used to help the city market itself to tourists, relies on visitors using their Facebook accounts to log into a new high-speed WiFi service. Users who object to sharing their Facebook data can still access the free WiFi, but the speed will be much slower and downloads restricted.
The city switched on the WiFi service in the tourist hubs of Surfers Paradise, Southport and Broadbeach on Tuesday, 15 days ahead of the Games opening ceremony. It spent $5m to build its own infrastructure for the Games due to concerns about the speed and rollout of the national broadband network.
A city spokeswoman insisted the council would only make "limited use" of the data it mined from tourists. She insisted data would not be shared with "other agencies" although reports about tourist activity based on the information could be made available to the tourism sector "and other sectors as appropriate".
"The most important information is about country of origin, to better understand the use by overseas tourists, who are one of the primary target groups for the service," the spokeswoman said.
"The city will be able to understand patterns of demand and use, including how many people are accessing the service, times of day and the amounts of data used. It will also be very useful for understanding numbers during events and seasonal effects."
Doctors say a stem cell transplant could be a "game changer" for many patients with multiple sclerosis. Results from an international trial show that it was able to stop the disease and improve symptoms. It involves wiping out a patient's immune system using cancer drugs and then rebooting it with a stem cell transplant.
Louise Willetts, 36, from Rotherham, is now symptom-free and told me: "It feels like a miracle." A total of 100,000 people in the UK have MS, which attacks nerves in the brain and spinal cord.
There are just a few problems, however: The experimental procedure is under scrutiny from regulators, the experiment's web site may have overstated the effectiveness of the not-yet-proven treatment, and patients have to foot the bill. Oh, and no one has seen the study yet.
[...] The results reported in the BBC piece are just the preliminary findings. And that leaves a number of questions still unanswered — are these results permanent? What are the risks? Who isn't suited to have their immune system wiped out through aggressive chemo?
The U.S. Food and Drug Administration (FDA) has also flagged some serious issues in the study's protocol. If that sounds boring and bureaucratic, think of it this way: for a few months, the lead investigator somehow forgot to report a number of nasty side effects of the treatment, including chest infection and the worsening of conditions as diverse as vertigo, narcolepsy, stuttering, and hyperglycemia, among others.
One thing we know for sure? It's real expensive. The BBC noted it cost patients £30,000 ($42,000) to receive the experimental treatment, but biomedical scientist and science writer Paul Knoepfler, who has been following the trial since last year, says it ran some patients between $100,000 and $200,000.
Last week Chinese eCommerce giant Alibaba announced its Q3 earnings. Cloud revenue was $553 million, an impressive 104 percent year-over-year increase. That comes out to a run rate in the range of $2.2 billion, well behind Google, which announced it is pulling in a billion dollars a quarter, and still buried behind the market leaders, all of whom reported around $4 billion or more a quarter.
While the growth was impressive, keep in mind that when you have a small market share, it's much easier to grow a big number than when you have a larger one. In other words, it gets harder to grow the larger you get.
It is worth noting, however, that the growth spurt allowed Alibaba to show up in the top five of Synergy Research's most recent Cloud Infrastructure Market Share report for the first time. While its market share was only around 3 or 4 percent, that is still significant because Alibaba is no longer lumped together with the "next 10" or "rest of market."
Synergy reports that the cloud market grew 46 percent in the fourth quarter, and the biggest cloud companies grew at the expense of the smaller ones. "In large part the expansion was driven by aggressive growth of Amazon (AWS), Microsoft, Google and Alibaba, who all increased their share of the worldwide market at the expense of smaller cloud providers," Synergy wrote in their report.
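The run-rate and year-over-year figures quoted above follow from two simple formulas, sketched here as a sanity check:

```python
# Annualized "run rate" is just quarterly revenue x 4; year-over-year
# growth compares against the same quarter one year earlier.
quarterly_revenue = 553e6              # Alibaba cloud revenue for the quarter
run_rate = quarterly_revenue * 4
print(f"run rate: ${run_rate / 1e9:.2f}B")   # ~$2.21B, the article's ~$2.2B

# A 104% YoY increase implies the same quarter a year earlier was:
yoy_growth = 1.04
prior_year_quarter = quarterly_revenue / (1 + yoy_growth)
print(f"same quarter last year: ${prior_year_quarter / 1e6:.0f}M")
```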
TRAPPIST-1's exoplanets appear to have migrated closer to TRAPPIST-1 over time until they reached their current orbits. This migration appears to have allowed them to retain too much water to support life:
What [the ASU-Vanderbilt team] found through their analyses was that the relatively "dry" inner planets ("b" and "c") were consistent with having less than 15 percent water by mass (for comparison, Earth is 0.02 percent water by mass). The outer planets ("f" and "g") were consistent with having more than 50 percent water by mass. This equates to the water of hundreds of Earth-oceans. The masses of the TRAPPIST-1 planets continue to be refined, so these proportions must be considered estimates for now, but the general trends seem clear.
"What we are seeing for the first time are Earth-sized planets that have a lot of water or ice on them," said Steven Desch, ASU astrophysicist and contributing author.
But the researchers also found that the ice-rich TRAPPIST-1 planets are much closer to their host star than the ice line. The "ice line" in any solar system, including TRAPPIST-1's, is the distance from the star beyond which water exists as ice and can be accreted into a planet; inside the ice line water exists as vapor and will not be accreted. Through their analyses, the team determined that the TRAPPIST-1 planets must have formed much farther from their star, beyond the ice line, and migrated in to their current orbits close to the host star.
[...] "We typically think having liquid water on a planet as a way to start life, since life, as we know it on Earth, is composed mostly of water and requires it to live," Hinkel explained. "However, a planet that is a water world, or one that doesn't have any surface above the water, does not have the important geochemical or elemental cycles that are absolutely necessary for life."
Also at Phys.org.
Inward migration of the TRAPPIST-1 planets as inferred from their water-rich compositions (DOI: 10.1038/s41550-018-0411-6) (DX) (arXiv)
Related: Powerful Solar Flares Found at TRAPPIST-1 Could Dim Chances for Life
TRAPPIST-1 Older than Our Solar System
Hubble Observations Suggest TRAPPIST-1 Exoplanets Could Have Water
Induction Heating Could Cause TRAPPIST-1 Exoplanets to Melt
Another TRAPPIST-1 Habitability Study
Google is announcing new efforts today to support the media industry by fighting misinformation and bolstering journalism, which will live under a newly announced umbrella called the Google News Initiative. Google already offers something similar in Europe through the Digital News Initiative, but the Google News Initiative is intended to be a wider worldwide expansion of those kinds of efforts.
There are three specific goals of the Google News Initiative: highlight accurate journalism while fighting misinformation, particularly during breaking news events; help news sites continue to grow from a business perspective; and create new tools to help journalists do their jobs. Google is serious about supporting these goals, too, pledging to invest $300 million over the next three years.
Fighting fake news is obviously one of the most crucial parts of Google's forthcoming efforts. The company has had several brushes with disinformation propagating through search following events like the Las Vegas shooting last fall, making this an area where Google has room to improve. According to Google, the company is working to train its system to be better at recognizing contentious breaking news and adjust toward displaying more accurate results, using the recently added "Breaking News" section on YouTube as an example. But even with those improvements, Google still has problems with search results on YouTube, including issues where conspiracy videos topped the trending results last month following the Parkland shooting. It's good to see that Google is working to improve this, but it'll have a long way to go to regain users' trust.
Also at Bloomberg.
The study, published in the Lancet Public Health journal and believed to be the first to research the effects of low levels of lead exposure on the general public, also concludes there is no safe level of the toxic metal: people with the lowest detectable amounts were still affected.
Researchers at four North American universities, led by Bruce Lanphear, of Simon Fraser University in Vancouver, studied the fate of 14,289 people whose blood had been tested in an official US survey between 1988 and 1994. Four fifths of them had harboured levels of the toxic metal below what has, hitherto, been thought safe.
The study found that deaths, especially from cardiovascular disease, increased markedly with exposure, even at the lowest levels. It concluded that lead kills 412,000 people a year – accounting for 18% of all US mortality, not much less than the 483,000 who perish as a result of smoking.
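A back-of-the-envelope check on the quoted figures shows they are internally consistent:

```python
# If 412,000 lead-attributed deaths/year is 18% of all US mortality, the
# implied total is about 2.3 million deaths/year, which is the right
# ballpark for annual US mortality.
lead_deaths = 412_000
share_of_mortality = 0.18
total_deaths = lead_deaths / share_of_mortality
print(f"implied total US deaths/year: {total_deaths / 1e6:.2f} million")

# Smoking's 483,000 deaths would then be about 21% of total mortality,
# consistent with "not much less than" lead's 18%.
smoking_deaths = 483_000
print(f"smoking share: {smoking_deaths / total_deaths:.0%}")
```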
On March 20th, 2018, Daniel Stenberg marked twenty years of curl, his flexible, multi-protocol, text-based utility. It is a very common client-side file transfer tool. The associated development library, libcurl, is a couple of years younger.
curl itself and components from libcurl are found nearly everywhere these days. Due to such widespread use, it is hard to be precise with usage numbers, but conservative estimates suggest billions of people are using it every day, though mostly under the hood, several layers down inside devices they own. It is the Internet transfer utility of choice for thousands of software applications. It is found in cars, television sets, routers, printers, audio equipment, mobile phones, tablets, set-top boxes, and media players, for starters.
A detailed, free-of-charge ebook, Everything curl, covers basically everything there is to know about curl, libcurl, and the associated project.
Multiple sources tell us that Google is acquiring Lytro, the imaging startup that began as a ground-breaking camera company for consumers before pivoting to use its depth-data, light-field technology in VR.
One source described the deal as an "asset sale" with Lytro going for no more than $40 million. Another source said the price was even lower: $25 million. A third source tells us that not all employees are coming over with the company's technology: some have already received severance and parted ways with the company, and others have simply left. Assets would presumably also include Lytro's 59 patents related to light-field and other digital imaging technology.
The sale would be far from a big win for Lytro and its backers. The startup has raised just over $200 million in funding and was valued at around $360 million after its last round in 2017, according to data from PitchBook.
Despite a lot of hype, Lytro had little success with its expensive, ergonomically challenged, and low resolution light field cameras for consumers.
Related: LinkedIn's Top 10 Silicon Valley Startups for 'Talent Brand' - Note: Both Lytro and Theranos are on the list.
A Pocket Camera with Many Eyes - Inside the Development of Light
The settlement was reached in October 2016, and originally PS3 owners were told to expect up to $55. That's increased to $65 now, possibly because fewer claims than expected were submitted in the 18 months since.
This resolves, legally anyway, the removal of the so-called "OtherOS" feature from the PS3's operating system eight years ago. That feature allowed users to partition their PS3's hard drive and install Linux on it. You may remember that, before then, the console was pitched and even used as a computer, including by the Air Force (which created a supercomputer cluster out of more than 1,700 of the consoles) and in distributed computing applications such as Folding@home and SETI@home.
But in April 2010, Sony stripped out the OtherOS feature, citing security concerns, which angered a small but very vocal contingent of PS3 users. That led to the lawsuit, which alleged false advertising, breach of warranty, and so on. Sony admits no wrongdoing, which is customary in civil settlements.
"Phat" refers to the original console, which weighed approximately 5 kg and measured 325 mm (W) × 98 mm (H) × 274 mm (D).
Also at BetaNews.
Anonymous coders can be identified using stylometry and machine learning techniques applied to executable binaries:
Source code stylometry – analyzing the syntax of source code for clues about the author – is an established technique used in digital forensics. As the US Army Research Laboratory (ARL) puts it, "Stylometry research has proven that anonymous code contributors can be de-anonymized to reveal the original author, provided the author has published code before."
The technique can help identify virus makers as well as unmask the creators of anti-censorship tools and other outlawed programs. It has the potential to pierce the privacy that many programmers assume they have.
Source code is designed to be human-readable, but binaries – typically produced by compiling or assembling source code – have fewer characteristics that may suggest authorship. Toolchains can be instructed to strip out variable names, function names and other symbols and metadata – which may say something about the author – and alter the structure of code through optimization.
Nonetheless, the researchers – Aylin Caliskan, Fabian Yamaguchi, Edwin Dauber, Richard Harang, Konrad Rieck, Rachel Greenstadt and Arvind Narayanan – building on work described in a 2011 paper, demonstrate that binary files can be analyzed using machine-learning and stylometric techniques.
If you want to remain an anonymous coder, you'd better not contribute anything under your own name publicly:
When Coding Style Survives Compilation: De-anonymizing Programmers from Executable Binaries (arXiv:1512.08546 [cs.CR])
We evaluate our approach on data from the Google Code Jam, obtaining attribution accuracy of up to 96% with 100 and 83% with 600 candidate programmers. We present an executable binary authorship attribution approach, for the first time, that is robust to basic obfuscations, a range of compiler optimization settings, and binaries that have been stripped of their symbol tables. We perform programmer de-anonymization using both obfuscated binaries, and real-world code found "in the wild" in single-author GitHub repositories and the recently leaked Nulled.IO hacker forum. We show that programmers who would like to remain anonymous need to take extreme countermeasures to protect their privacy.
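As a toy illustration of the stylometric pipeline, the sketch below represents each code sample by its character-trigram frequencies and attributes an unknown sample to the author whose aggregate profile is closest by cosine similarity. This is a deliberately simplified stand-in: the paper's actual approach uses much richer features (disassembly and decompiled syntax trees) and a random-forest classifier, and the author samples here are hypothetical.

```python
from collections import Counter
import math

def trigrams(text):
    # Character trigram frequency profile of a code sample.
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine(a, b):
    # Cosine similarity between two sparse frequency vectors.
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def centroid(samples):
    # Aggregate profile for one author across several samples.
    total = Counter()
    for s in samples:
        total.update(trigrams(s))
    return total

def attribute(unknown, profiles):
    # Attribute the unknown sample to the most similar author profile.
    scores = {author: cosine(trigrams(unknown), c) for author, c in profiles.items()}
    return max(scores, key=scores.get)

# Two authors with distinctive habits (hypothetical training samples).
profiles = {
    "alice": centroid(["for (int i = 0; i < n; ++i) { sum += a[i]; }",
                       "for (int j = 0; j < m; ++j) { out[j] = f(j); }"]),
    "bob":   centroid(["i=0\nwhile i<n:\n    total+=a[i]\n    i+=1",
                       "k=0\nwhile k<m:\n    out[k]=f(k)\n    k+=1"]),
}
print(attribute("for (int k = 0; k < len; ++k) { acc += v[k]; }", profiles))  # alice
```

Real attribution works against far larger candidate pools and against binaries rather than source, but the shape is the same: extract style-bearing features, build per-author profiles, then classify.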
The ashes of Professor Stephen Hawking will be interred next to the grave of Sir Isaac Newton at Westminster Abbey, it has been revealed. The renowned theoretical physicist's final resting place will also be near that of Charles Darwin, who was buried there in 1882.
A few months before physicist Stephen Hawking died, he published a paper that several media outlets touted as a way to finally prove (or disprove) the existence of parallel worlds. But that claim may be a bit of cosmic inflation, said several physicists who were not involved in Hawking's research.
"The paper makes no statements about observational tests. It's not entirely uninteresting, but it's one of literally several thousand ideas for what might possibly have happened in the early universe" many of which include parallel worlds, said Sabine Hossenfelder, a physicist at the Frankfurt Institute for Advanced Studies in Germany, who blogs at backreaction.blogspot.com.
Sabine Hossenfelder's blog post: Hawking's "Final Theory" is not groundbreaking
A Smooth Exit from Eternal Inflation? (arXiv:1707.07702 [hep-th])
Hawking spent much of his later years trying to figure out how a black hole could regurgitate information—although he also worked on theories of what triggered the big bang. Three years ago he began his last work on black holes with Malcolm Perry, a theoretical physicist and Hawking's colleague at the University of Cambridge in the United Kingdom, and Andrew Strominger, a theorist at Harvard University. "It was only 2 weeks ago that I saw him," Perry says. "He certainly wasn't in the best shape, but his mind was clearly focused on the problem."
[...] [Strictly] speaking, Strominger says, the theorem states only that two similar black holes can be "transformed" into each other by a handful of mathematical relations called diffeomorphisms, which relabel the coordinates of space-time. An infinite family of other diffeomorphisms has been neglected for decades, he says. They imply that a black hole's event horizon might be bedecked with an infinity of charges, a bit like electric charges. The charges could distinguish one black hole from another and encode infalling information, Strominger says. "We're cautiously optimistic about this idea," he says. "Stephen was very optimistic."
However, the charges may not encode enough information or may not do so in a unique way, Giddings cautions. One theorist who requested anonymity out of respect for Hawking says his various solutions for the black hole information problem pale next to his best work. Hawking's latest work also misses a bigger issue, the theorist says. If a black hole preserves information, he argues, then an unavoidable conclusion of Einstein's theory of gravity—that there's no way to tell if you're falling into a huge black hole—must be wrong.
Previously: Stephen Hawking Dead at 76