Data Storage

Backblaze Hard Drive Stats Q3 2020 (backblaze.com) 37

Backblaze's Q3 2020 hard drive stats: As of September 30, 2020, Backblaze had 153,727 spinning hard drives in our cloud storage ecosystem spread across four data centers. Of that number, there were 2,780 boot drives and 150,947 data drives. This review looks at the Q3 2020 and lifetime hard drive failure rates of the data drive models currently in operation in our data centers and provides a handful of insights and observations along the way. [...] There are several models with zero drive failures in the quarter. That's great, but when we dig in a little we get different stories for each of the drives.

The 18TB Seagate model (ST18000NM000J) has 300 drive days and they've been in service for about 12 days. There were no out of the box failures which is a good start, but that's all you can say.
The 16TB Seagate model (ST16000NM001G) has 5,428 drive days which is low, but they've been around for nearly 10 months on average. Still, I wouldn't try to draw any conclusions yet, but a quarter or two more like this and we might have something to say.
The 4TB Toshiba model (MD04ABA400V) has only 9,108 drive days, but they have been putting up zeros for seven quarters straight. That has to count for something.
The 14TB Seagate model (ST14000NM001G) has 21,120 drive days with 2,400 drives, but they have only been operational for less than one month. Next quarter will give us a better picture.
The 4TB HGST (model: HMS5C4040ALE640) has 274,923 drive days with no failures this quarter.
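To put those drive-day counts in perspective, Backblaze computes an annualized failure rate (AFR) as failures divided by drive days, scaled to a year. A quick sketch (the formula is Backblaze's published one; the drive-day figures come from the summary above) shows why 300 drive days and 274,923 drive days make "zero failures" mean very different things:

```python
def annualized_failure_rate(failures: int, drive_days: int) -> float:
    """Backblaze's published formula: (failures / drive days) * 365, as a percent."""
    if drive_days == 0:
        return 0.0
    return failures / drive_days * 365 * 100

# Zero-failure models from the quarter, with their accumulated drive days.
models = {
    "ST18000NM000J (18TB Seagate)": 300,
    "ST16000NM001G (16TB Seagate)": 5_428,
    "MD04ABA400V (4TB Toshiba)":    9_108,
    "ST14000NM001G (14TB Seagate)": 21_120,
    "HMS5C4040ALE640 (4TB HGST)":   274_923,
}

for model, days in models.items():
    # What the AFR would have been with a single hypothetical failure:
    print(f"{model}: {annualized_failure_rate(1, days):.2f}% AFR per hypothetical failure")
```

With one hypothetical failure, the 18TB model's AFR would exceed 120%, while the 4TB HGST's would stay near 0.13% -- which is why the drive days behind a zero matter more than the zero itself.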

Intel

Intel Agrees To Sell Storage Unit To SK Hynix for $9 Billion (bloomberg.com) 49

Intel has agreed to sell its Nand memory unit to South Korea's SK Hynix for about $9 billion, a deal that allows the U.S. chipmaker to concentrate on its main business while shoring up the Asian company's position in a booming market. From a report: SK Hynix will pay 10.3 trillion won for the Intel unit, which makes flash memory components for computers and other devices. The acquisition, which will take place in stages through 2025, includes Intel's solid-state drive, Nand flash and wafer businesses, as well as a production facility in the northeastern Chinese city of Dalian. The deal should shore up Hynix's position in a business that's boomed after Covid-19 drove demand for the chips used in everything from Apple's iPhones to data centers. It whittles down another player in an industry the Korean company dominates alongside Samsung Electronics and Micron Technology, potentially buoying Nand flash prices.

Power

A Group of Materials Called Perovskites Could Be a Game-Changer For Solar Power (independent.co.uk) 62

Researchers from Australia have discovered that the widely acclaimed mineral perovskite can be used to transform the solar industry through cheaper and more efficient photovoltaics. The Independent reports: Perovskite, which is forged deep within the Earth's mantle, has been hailed for its unprecedented potential to convert sunlight into electricity. Researchers have already improved its sunlight-to-energy efficiency from around 3 per cent to over 20 per cent in the space of just a few years. "It's unbelievable, a miracle material," Z. Valy Vardeny, a materials science professor from the University of Utah, said about perovskite in 2017. At the time it was thought that it would be at least 10 years before it reached a point that the material could be used in commercial solar cells, however the latest breakthrough could see the wide uptake of the technology much sooner. "It was one of those unusual discoveries that you sometimes hear about in science," said Dr Hall from the University of Melbourne.

With the help of researchers at the University of Sydney, the scientists were able to use computational modeling to solve the problem of instability within the material when exposed to sunlight. The unlikely solution was to undo the disruption caused by light at lower intensities by focussing the light into a high-intensity beam. Dr Hall added: "What we've shown is that you can actually use the material in the state that you want to use it, for a solar cell - all you need to do is focus more light onto it." The research could also have significant implications for data storage, with perovskites offering a way to dramatically increase a device's potential capacity.
The study has been published in the journal Nature Materials.

Power

Tesla Powerwall Rival Seeks To Bring Hydrogen Into Your Home (bloomberg.com) 133

An anonymous reader quotes a report from Bloomberg: It's about the size of Tesla Inc.'s Powerwall, but can store up to three times as much energy over a longer period. That's the promise of a new hydrogen-based energy-storage system for homes and businesses being developed by Australian startup Lavo Hydrogen Technology Ltd. The technology, developed with scientists at the University of New South Wales, uses power from rooftop solar panels to produce hydrogen from water by electrolysis. The gas is stored in a metal hydride container and converted back into electricity when needed using a fuel cell.

Australia's world-beating rooftop-solar take-up rates make it an ideal early market, said Lavo Chief Executive Officer Alan Yu. The unit will go on sale from November, with installations starting in June 2021, subject to final approvals. The company plans to sell 10,000 units a year by 2022. At about triple the price of a Powerwall, the Lavo unit's main selling point will be its ability to store more energy for longer. Each system will initially cost A$34,750 ($24,620) and will be able to hold 40 kilowatt-hours of power -- enough to supply an average household for more than two days, according to the company. Tesla's Powerwall holds about 13.5 kilowatt-hours. Lavo's Yu acknowledged that the higher cost of the system might initially limit interest to energy-technology enthusiasts, but he also sees it as a solution for small off-grid rural villages to replace diesel generators or a compact solution for communities and homes cut off from the main grid by natural disasters such as bushfires.

Science

Physicists Successfully Carry Out Controlled Transport of Stored Light (phys.org) 39

schwit1 shares a report from Phys.Org: A team of physicists led by Professor Patrick Windpassinger at Johannes Gutenberg University Mainz (JGU) has successfully transported light stored in a quantum memory over a distance of 1.2 millimeters. They have demonstrated that the controlled transport process and its dynamics have only a minor impact on the properties of the stored light. The researchers used ultra-cold rubidium-87 atoms as a storage medium for the light in order to achieve a high level of storage efficiency and a long lifetime. The controlled manipulation and storage of quantum information, as well as the ability to retrieve it, are essential prerequisites for achieving advances in quantum communication and for performing corresponding computer operations in the quantum world. Optical quantum memories, which allow for the storage and on-demand retrieval of quantum information carried by light, are essential for scalable quantum communication networks.

In their recent publication, Professor Patrick Windpassinger and his colleagues have described the actively controlled transport of such stored light over distances larger than the size of the storage medium. Some time ago, they developed a technique that allows ensembles of cold atoms to be transported on an 'optical conveyor belt' produced by two laser beams. The advantage of this method is that a relatively large number of atoms can be transported and positioned with a high degree of accuracy, without significant loss of atoms and without the atoms being unintentionally heated. The physicists have now succeeded in using this method to transport atomic clouds that serve as a light memory; the stored information can then be retrieved elsewhere. By refining this concept, it may become possible to develop novel quantum devices, such as a racetrack memory for light with separate reading and writing sections.
The findings have been published in the journal Physical Review Letters.

Google

Google Is Killing Unlimited Drive Storage For Non-Enterprise Users (petapixel.com) 50

If you're one of the Google Drive users taking advantage of unlimited storage for $12 per month on G Suite, beware. Workspace is replacing G Suite and offers more features, but you might not want to switch: unlimited storage on Workspace will cost you at least $20 a month. Jaron Schneider reports via PetaPixel: Currently G Suite business subscribers (which do not need to be actual businesses, but can be any individuals looking for greater storage capacity) can access unlimited storage on Drive for just $12 a month. For photographers with considerable backlogs of photos, this was a relatively inexpensive cloud storage backup solution. Google states in its plans that groups using this particular plan with four or fewer members are supposed to be eligible for only 1 TB of storage each, but testing by Android Police and others has shown that Google has never enforced that limit.

Unfortunately, this appears to be changing with the transition to Workspace. According to the company's list of plans, which you can view here, there is a limit of 2 TB for individual Business Standard users and 5 TB per person on its new Business Plus plan. To get more, you will have to go to the Enterprise level, which Google says requires you to work directly with a Google sales representative (this appears to actually be the case), but Google does promise it can offer as much storage "as you need" in this category. That doesn't explicitly say unlimited, but should realistically operate as such for now. Pricing at that Enterprise level will cost you $20 per month ($30 per month on Enterprise Plus), nearly double the previous price for the same amount of storage. For now, G Suite customers will be able to stick with their current plans if they do not switch to Workspace, but Google intends to transition all users over to the new system eventually.

Businesses

Finnish Startup Unveils Machine That Takes Office-Air CO2 and Converts It Into Fuel (arstechnica.com) 114

Over a video call, Finnish start-up Soletair Power showed Ars Technica their machine that converts office-air carbon dioxide into fuel. Scott K. Johnson reports: The value proposition for the first part of the device is pretty straightforward. Carbon dioxide accumulates in buildings full of people, and higher CO2 concentrations may impact your ability to think clearly. The usual way to manage that is to introduce more outside air (which may need to be heated/cooled). Another could be to selectively filter out CO2. This device could do the latter for you. That CO2 could simply be vented outside or used to produce an unwieldy amount of seltzer. Instead, what makes Soletair's idea more interesting is that the rest of its device turns the CO2 into fuel. The configuration the company demonstrated makes methane but could be swapped for a liquid fuel process. Depending on the source of the energy running the machines, these fuels could be carbon-neutral since the carbon comes from the air. Whether it's economically viable is another question.

The CO2 capture technique they're using is a scaled-down version of those designed for combustion power plants. Air goes through a chamber full of small granules that contain amines -- compounds that bind with CO2 molecules. Periodically, the granules are cycled through a heating step. The temperature only needs to rise to just shy of 120°C, Soletair's Petri Laakso and Cyril Bajamundi told Ars, so steam from the local heat system and/or an electric heating element is sufficient. This makes the amine granules release the CO2 they're holding, which accumulates in a storage tank. The granules are then ready to absorb more CO2. The other two-thirds of the machine, which measures about 2 meters tall, 5 meters long, and 1 meter wide, deal with turning that CO2 into a usable fuel. First, there's an electrolyzer that splits water to make hydrogen gas. Then hydrogen is combined with CO2 in a methanation reactor to produce pure methane gas.
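The methanation step is presumably the Sabatier reaction (CO2 + 4 H2 → CH4 + 2 H2O); the article doesn't name it, so treat this as an illustrative mass balance using standard molar masses, not Soletair's actual process figures:

```python
# Sabatier methanation: CO2 + 4 H2 -> CH4 + 2 H2O (standard molar masses, g/mol)
M_CO2, M_H2, M_CH4, M_H2O = 44.01, 2.016, 16.04, 18.02

def methane_from_co2(co2_kg):
    """Ideal mass balance at 100% conversion for a given mass of captured CO2."""
    mol = co2_kg * 1000 / M_CO2  # moles of CO2
    return {
        "h2_needed_kg": 4 * mol * M_H2 / 1000,
        "ch4_out_kg":   mol * M_CH4 / 1000,
        "h2o_out_kg":   2 * mol * M_H2O / 1000,
    }

# Each kilogram of office-air CO2 yields roughly 0.36 kg of methane and
# consumes about 0.18 kg of electrolyzer hydrogen (ignoring conversion losses).
print(methane_from_co2(1.0))
```

Real reactors convert less than 100% per pass, so these numbers are an upper bound on yield.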

Cloud

Amazon's Latest Gimmicks Are Pushing the Limits of Privacy (wired.com) 49

At the end of September, Amazon debuted two especially futuristic products within five days of each other: a small autonomous surveillance drone, called Ring Always Home Cam, and a palm recognition scanner, called Amazon One. "Both products aim to make security and authentication more convenient -- but for privacy-conscious consumers, they also raise red flags," reports Wired. From the report: Amazon's latest data-hungry innovations are not launching in a vacuum. The company also owns Ring, whose smart doorbells have had myriad security issues and have been widely criticized for bringing unprecedented surveillance to traditionally semi-private spaces. Meanwhile, the biometric data that Amazon One will collect is particularly sensitive, because unlike a password you can't simply change it if a hacker steals it or it gets unintentionally exposed. Amazon has a strong record for maintaining the security of its massive cloud infrastructure, but there have been lapses across the sprawling business. The stakes are already phenomenally high; the more data the company holds the more risk it takes on. "Amazon has a major genomics cloud platform, so maybe they hold your DNA and now they're going to have your palm as well? Plus all of these devices inside your house. And your purchase history on Prime. That's a lot of information. That's a lot of personal information," says Nina Alli, executive director of Defcon's Biohacking Village and a health care security researcher. "When you give away this data you're giving a company the ability to access and manage you, not the other way around."
[...]
Additionally, while companies like Apple and Samsung have brought biometric fingerprint and face scanners to the masses by making sure the data never leaves the device, Amazon One takes the opposite approach. Amazon vice president Dilip Kumar writes that "palm images are never stored" on Amazon One itself. Instead they are encrypted and sent to a special high security area of Amazon's cloud to be converted into "palm signatures" based on the unique and distinctive features of a user's hand. Then the service compares that signature to the one on file in each user's account and returns a match or no match answer back down to the device. It makes sense that Amazon doesn't want to store databases of people's palm data locally on publicly accessible machines that could be manipulated. But the system could perhaps have been set up to generate a palm signature locally, delete the image of a person's hand, and send only the encrypted signature on for analysis. The fact that all of those palm images will be going for cloud processing creates a single point of failure.
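As a toy illustration of the local-signature alternative the paragraph describes -- hypothetical, and nothing like Amazon's actual pipeline, not least because real biometric matching must tolerate sensor noise that exact hashing does not -- the signature could be derived on-device so only a non-reversible digest ever reaches the cloud:

```python
import hmac
import hashlib

# Hypothetical on-device scheme (NOT Amazon's pipeline): quantize the extracted
# palm features, then HMAC them with a device-held key so only a non-reversible
# signature ever leaves the scanner. This toy assumes two scans of the same palm
# quantize identically; real systems need fuzzy template matching.
DEVICE_KEY = b"per-device-secret"  # would live in a secure element in practice

def palm_signature(features):
    quantized = bytes(int(f * 16) % 256 for f in features)
    return hmac.new(DEVICE_KEY, quantized, hashlib.sha256).hexdigest()

enrolled = palm_signature([0.91, 0.12, 0.55, 0.33])
attempt = palm_signature([0.91, 0.12, 0.55, 0.33])
print("match" if hmac.compare_digest(enrolled, attempt) else "no match")  # match
```

The point of the sketch is the data flow, not the crypto: the raw palm image never leaves the device, so a server-side breach exposes only keyed digests.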
"I'm worried that people could read your palm vein pattern in other ways and construct an analog. It's only a matter of time," says Joseph Lorenzo Hall, a longtime security and privacy researcher and a senior vice president at the nonprofit Internet Society. "Both the home drone and the palm payment are going to rely heavily on the cloud and on the security provided by that cloud storage. That's worrying because it means all the risks -- rogue employees, government data requests, data breach, secondary uses -- associated with data collection on the server-side could be possible. I'm much more comfortable having a biometric template stored locally rather than on a server where it might be exfiltrated."

An Amazon spokesperson told WIRED, "We are confident that the cloud is highly secure. In addition, Amazon One palm data is stored separately from other personal identifiers, and is uniquely encrypted with its own keys in a secure zone in the cloud."

Security

Apple's T2 Security Chip Has an Unfixable Flaw (wired.com) 81

A recently released tool is letting anyone exploit an unusual Mac vulnerability to bypass Apple's trusted T2 security chip and gain deep system access. The flaw is one researchers have also been using for more than a year to jailbreak older models of iPhones. But the fact that the T2 chip is vulnerable in the same way creates a host of new potential threats. Worst of all, while Apple may be able to slow down potential hackers, the flaw is ultimately unfixable in every Mac that has a T2 inside. From a report: In general, the jailbreak community hasn't paid as much attention to macOS and OS X as it has to iOS, because those platforms don't have the same restrictions and walled gardens that are built into Apple's mobile ecosystem. But the T2 chip, launched in 2017, created some limitations and mysteries. Apple added the chip as a trusted mechanism for securing high-value features like encrypted data storage, Touch ID, and Activation Lock, which works with Apple's "Find My" services. But the T2 also contains a vulnerability, known as Checkm8, that jailbreakers have already been exploiting in Apple's A5 through A11 (2011 to 2017) mobile chipsets. Now Checkra1n, the same group that developed the tool for iOS, has released support for T2 bypass.

On Macs, the jailbreak allows researchers to probe the T2 chip and explore its security features. It can even be used to run Linux on the T2 or play Doom on a MacBook Pro's Touch Bar. The jailbreak could also be weaponized by malicious hackers, though, to disable macOS security features like System Integrity Protection and Secure Boot and install malware. Combined with another T2 vulnerability that was publicly disclosed in July by the Chinese security research and jailbreaking group Pangu Team, the jailbreak could also potentially be used to obtain FileVault encryption keys and to decrypt user data. The vulnerability is unpatchable, because the flaw is in low-level, unchangeable code for hardware. "The T2 is meant to be this little secure black box in Macs -- a computer inside your computer, handling things like Lost Mode enforcement, integrity checking, and other privileged duties," says Will Strafach, a longtime iOS researcher and creator of the Guardian Firewall app for iOS. "So the significance is that this chip was supposed to be harder to compromise -- but now it's been done."

Linux

Linux 5.9 Boosts CPU Performance With FSGSBASE Support (phoronix.com) 75

FSGSBASE support in Linux "has the possibility of helping Intel/AMD CPU performance especially in areas like context switching that had been hurt badly by Spectre/Meltdown and other CPU vulnerability mitigations largely on the Intel side," Phoronix wrote back in August. As it started its journey into the kernel, they provided a preview on August 10: The FSGSBASE support that was finally mainlined a few days ago for Linux 5.9 is already providing a nice performance boost for both Intel and AMD systems... FSGSBASE support for the Linux kernel has been a half-decade in the making and was finally carried over the finish line by one of Microsoft's Linux kernel engineers...

FSGSBASE particularly helps context-switching-heavy workloads like I/O, and it allows user-space software to write the x86_64 GSBASE without kernel interaction. That in turn has been of interest to Java and others... On Linux 5.9, where FSGSBASE is finally mainlined, it's enabled by default on supported CPUs. FSGSBASE can be disabled at kernel boot time via the "nofsgsbase" kernel option.
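Whether a given machine benefits depends on both the CPU and the kernel. A small sketch of checking the CPU side by scanning a `/proc/cpuinfo` dump for the `fsgsbase` feature flag (`has_fsgsbase` is a hypothetical helper, not a kernel interface):

```python
def has_fsgsbase(cpuinfo_text):
    """Scan an x86 /proc/cpuinfo dump for the 'fsgsbase' feature flag."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return "fsgsbase" in line.split(":", 1)[1].split()
    return False

# On a live Linux system you would read the real file:
#   has_fsgsbase(open("/proc/cpuinfo").read())
sample = "processor\t: 0\nflags\t\t: fpu vme de pse fsgsbase smep\n"
print(has_fsgsbase(sample))  # True
```

Even with the flag present, the kernel must be 5.9+ and booted without `nofsgsbase` for the fast path to be active.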

Today on the Linux kernel mailing list, Linus Torvalds announced the release of Linux 5.9: Ok, so I'll be honest - I had hoped for quite a bit fewer changes this last week, but at the same time there doesn't really seem to be anything particularly scary in here. It's just more commits and more lines changed than I would have wished for.
And Phoronix reported: Linux 5.9 has a number of exciting improvements including initial support for upcoming Radeon RX 6000 "RDNA 2" graphics cards, initial Intel Rocket Lake graphics, NVMe zoned namespaces (ZNS) support, various storage improvements, IBM's initial work on POWER10 CPU bring-up, the FSGSBASE instruction is now used, 32-bit x86 Clang build support, and more. See our Linux 5.9 feature overview for the whole scoop on the many changes to see with this kernel.

Earth

The World's First Carbon Dioxide Removal Law Database 15

Today, researchers at Columbia University launched the world's first database of carbon dioxide removal laws, providing an annotated bibliography of legal materials related to carbon dioxide removal and carbon sequestration and use. It is publicly available at cdrlaw.org. Phys.Org reports: The site has 530 resources on legal issues related to carbon dioxide removal, including such techniques as: direct air capture; enhanced weathering; afforestation/reforestation; bioenergy with carbon capture and storage; biochar; ocean and coastal carbon dioxide removal; ocean iron fertilization; and soil carbon sequestration. The database also includes 239 legal resources on carbon capture and storage, utilization, and transportation. New resources are constantly being added.

This site was created by the Sabin Center for Climate Change Law at Columbia Law School, in cooperation with the Carbon Management Research Initiative at the Center on Global Energy Policy at Columbia's School of International and Public Affairs. Generous financial support was provided by the ClimateWorks Foundation and the Earth Institute at Columbia University. The Sabin Center is also undertaking a series of white papers with in-depth examinations of the legal issues in particular carbon dioxide removal technologies. The first of these, "The Law of Enhanced Weathering for Carbon Dioxide Removal," by Romany M. Webb, has just been released.

Chrome

Chrome Changes How Its Cache System Works To Improve Privacy (zdnet.com) 21

Google has changed how a core component of the Chrome browser works in order to add additional privacy protections for its users. From a report: Known as the HTTP Cache or the Shared Cache, this Chrome component works by saving copies of resources loaded on a web page, such as images, CSS files, and JavaScript files. The idea is that when a user revisits the same site or visits another website where the same files are used, Chrome will load them from its internal cache, rather than waste time re-downloading each file all over again.

[...] With Chrome 86, released earlier this week, Google has rolled out important changes to this mechanism. Known as "cache partitioning," this feature works by changing how resources are saved in the HTTP cache based on two additional factors. From now on, a resource's storage key will contain three items, instead of one: The top-level site domain (http://a.example), the resource's current frame (http://c.example), and the resource's URL (https://x.example/doge.png). By adding additional keys to the cache pre-load checking process, Chrome has effectively blocked all the past attacks against its cache mechanism, as most website components will only have access to their own resources and won't be able to check resources they have not created themselves.
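A minimal sketch of the idea (illustrative Python, not Chrome's implementation): widening the cache key from a bare URL to the (top-level site, frame origin, resource URL) triple makes a cross-site probe a guaranteed miss:

```python
# Illustrative sketch, not Chrome's code: the storage key widens from a bare
# URL to a (top-level site, frame origin, resource URL) triple.
class PartitionedCache:
    def __init__(self):
        self._store = {}

    def put(self, top_site, frame, url, body):
        self._store[(top_site, frame, url)] = body

    def get(self, top_site, frame, url):
        return self._store.get((top_site, frame, url))

cache = PartitionedCache()
cache.put("https://a.example", "https://c.example", "https://x.example/doge.png", b"png-bytes")

# Same resource, same frame, but a different top-level site: a guaranteed miss,
# so site B can no longer infer that the user previously visited site A.
print(cache.get("https://b.example", "https://c.example", "https://x.example/doge.png"))  # None
```

The privacy win is exactly that miss: cache-timing probes relied on a shared cache answering "hit" for resources another site had loaded.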

Power

Tesla Co-Founder Aims To Build World's Top Battery Recycler (reuters.com) 18

An anonymous reader quotes a report from Reuters: Tesla co-founder J.B. Straubel wants to build his startup Redwood Materials into the world's top battery recycling company and one of the largest battery materials companies, he said at a technology conference Wednesday. Straubel aims to leverage two partnerships, one with Panasonic, the Japanese battery manufacturer that is teamed with Tesla at the Nevada gigafactory, and one announced weeks ago with e-commerce giant Amazon. With production of electric vehicles and batteries about to explode, Straubel says his ultimate goal is to "make a material impact on sustainability, at an industrial scale."

Established in early 2017, Redwood this year will recycle more than 1 gigawatt-hour's worth of battery scrap materials from the gigafactory -- enough to power more than 10,000 Tesla cars. That is a fraction of the half-million vehicles Tesla expects to build this year. At the company's Battery Day in late September, Chief Executive Elon Musk said he was looking at recycling batteries to supplement the supply of raw materials from mining as Tesla escalates vehicle production. [...] Straubel's broader plan is to dramatically reduce mining of raw materials such as nickel, copper and cobalt over several decades by building out a circular or "closed loop" supply chain that recycles and recirculates materials retrieved from end-of-life vehicle and grid storage batteries and from cells scrapped during manufacturing.

PlayStation (Games)

PS5 Teardown Video Confirms Faster Wi-Fi and USB Ports Than Xbox Series X (gamesradar.com) 56

Sony's recently-released PS5 teardown video gives us a closer look at the PS5, and confirms that the console's Wi-Fi antenna and USB ports are faster than those available in the Xbox Series X. GamesRadar+ reports: As spotted by VG247, the teardown confirms a few new hardware details about PS5. For starters, we know the console's Wi-Fi antenna supports the new Wi-Fi 6 standard, which allows for a new maximum speed of 9.6 Gbps -- more than twice the 3.5 Gbps ceiling for Wi-Fi 5. This doesn't mean your PS5 will be able to use all of that to send your download speeds through the roof. The practical benefit is that Wi-Fi 6 routers can better distribute all that speed to a bunch of devices at once and maintain their performance over time. So if you have a Wi-Fi 6 router and a home full of connected devices, there's a good chance you will notice the improvement. For reference, the Xbox Series X Wi-Fi antenna supports Wi-Fi 5.

As for the USB ports, we already knew that PS5 has a USB-C port and a USB-A port on the front. The teardown video confirms the type-C port will support 10 Gbps transfer speeds, and it confirms that the two USB-A ports on the back will as well. The type-A port on the front isn't as quick, so if you plan to plug in an external PS5 SSD make sure you use one of the ports on the back. Xbox Series X doesn't include any type-C ports, and all of its type-A ports run at the standard 5 Gbps speed. If you know that fast connection speeds will make a big difference to your play experience, you may want to lean toward PS5 -- but as always, the biggest deciding factor should be what games you want to play and how well each console plays them.
The Verge also notes the PS5 includes removable sides, dust catchers, and storage expansion.
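For a rough sense of what those port speeds mean in practice, here is a back-of-the-envelope calculation (line rate only; real USB transfers lose some throughput to protocol overhead, and the 100 GB file size is just an illustrative figure):

```python
def transfer_seconds(size_gb, link_gbps):
    """Ideal time to move size_gb gigabytes over a link_gbps gigabit/s link."""
    return size_gb * 8 / link_gbps  # 8 bits per byte

# A hypothetical 100 GB copy to an external SSD, at line rate:
for label, gbps in [("10 Gbps (PS5 rear USB-A / front USB-C)", 10),
                    ("5 Gbps (standard USB-A)", 5)]:
    print(f"{label}: {transfer_seconds(100, gbps):.0f} seconds")
```

At line rate the 10 Gbps ports move that copy in 80 seconds versus 160 seconds at 5 Gbps, which is why the rear ports matter for external SSDs.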

Power

Study Shows Renewables Are Kicking Natural Gas To the Curb (cleantechnica.com) 267

An anonymous reader quotes a report from CleanTechnica: After analyzing the most recent data from two of America's largest electricity markets -- ERCOT in Texas and PJM in the Northeast -- the Rocky Mountain Institute has come to a startling conclusion. Renewables are muscling in on natural gas as the preferred choice for new electricity generation. In fact, according to RMI, what happened to coal is now happening to gas. What is needed, the organization argues, is a move away from the monopoly markets that have been the norm in the utility industry for more than 100 years and toward more open competition. Because when renewables compete head to head with thermal generation, they win hands down 95% of the time.

The data doesn't lie. RMI looked at the interconnection queues for both ERCOT and PJM and found that over the past two years there has been a dramatic shift away from building new gas-fired generating plants and toward more renewable energy projects. Interconnection queues track new generation projects proposed to be added to a regional grid. That information provides a leading indicator of market trends for new power plants. Not all projects in these queues are ultimately built, but the mix of resources in the queue represents the investments the market is prioritizing, according to RMI. [...] RMI finds that since 2018, the queue for clean energy projects has more than doubled while the queue for gas projects has been cut in half. In all, more than $30 billion worth of gas projects have been canceled or abandoned. Currently, the capacity of wind, solar, and storage projects slated for construction in ERCOT and PJM is ten times greater than for new gas projects.

"Though COVID-19 may be contributing to some recent decline in planned gas additions, it's not the only driver," says RMI. "The trend has been building for years and investors more broadly are now waking up to the implications. For example, just five years ago in ERCOT, the interconnection queue contained an even split between proposed gas and renewables generation capacity. However, gas capacity in the queue started falling steadily in 2015, well before the COVID-19 pandemic and associated economic downturn. Meanwhile, renewable energy and storage projects in the queue have continued to grow even during the pandemic."

"Therefore, it is likely that a more fundamental driver is at play -- raw economics, driven by the continually falling costs of clean energy and the associated risks of investment in new gas-fired capacity."

Music

Is Streaming Music Worse For the Environment? (newyorker.com) 63

"The environmental cost of music is now greater than at any time during recorded music's previous eras," argues Kyle Devine, in his recent book, "Decomposed: The Political Ecology of Music."

The New Yorker's music critic writes: He supports that claim with a chart of his own devising, using data culled from various sources, which suggests that, in 2016, streaming and downloading music generated around a hundred and ninety-four million kilograms of greenhouse-gas emissions — some forty million more than the emissions associated with all music formats in 2000... Exploitative regimes of labor enable the production of smartphone and computer components. Conditions at Foxconn factories in China have long been notorious; recent reports suggest that the brutally abused Uighur minority has been pressed into the production of Apple devices. Child laborers are involved in the mining of cobalt, which is used in iPhone batteries. Spotify, the dominant streaming service, needs huge quantities of energy to power its servers. No less problematic are the streaming services' own exploitative practices, including their notoriously stingy royalty payments to working musicians...

When the compact disk entered circulation, in the nineteen-eighties, audio snobs attacked it as a degradation of listening culture — a descent from soulful analog sound to soulless digital. In environmental terms, however, the CD turned out to be somewhat less deleterious [than vinyl records]. Devine observes that polycarbonate, the medium's principal ingredient, is not as toxic as polyvinyl chloride. Early on, the widespread use of polystyrene for CD packaging wiped out that advantage, but a turn toward recyclable materials in recent years has made the lowly CD perhaps the least environmentally harmful format on the market.

In a chapter on the digital and streaming era, Devine drives home the point that there is no such thing as a nonmaterial way of listening to music: "The so-called cloud is a definitely material and mainly hardwired network of fiber-optic cables, servers, routers, and the like." This concealment of industrial reality, behind a phantasmagoria of virtuality, is a sleight of hand typical of Big Tech, with its genius for persuading consumers never to wonder how transactions have become so shimmeringly effortless. In much the same way, it has convinced us not to think too hard about the regime of mass surveillance on which the economics of the industry rests.... At the end of "Decomposed," Devine incorporates his ecology of music into a more comprehensive vision of anthropogenic crisis. "Musically, we may need to question our expectations of infinite access and infinite storage," he writes. Our demand that all of musical history should be available at the touch of a finger has become gluttonous. It may seem a harmless form of consumer desire, but it leaves real scars on the face of the Earth.

Linux

Will New Object Storage Protocol Mean the End For POSIX? (enterprisestorageforum.com) 76

"POSIX has been the standard file system interface for Unix-based systems (which includes Linux) since its launch more than 30 years ago," writes Enterprise Storage Forum, noting the POSIX-compliant Lustre file system "powers most supercomputers."

Now Slashdot reader storagedude writes: POSIX has scalability and performance limitations that will become increasingly important in data-intensive applications like deep learning, but until now it has retained one key advantage over infinitely scalable object storage: the ability to process data in memory. That advantage is now gone with the new mmap_obj() function, which paves the way for object storage to become the preferred approach to Big Data applications.
POSIX features like statefulness, prescriptive metadata, and strong consistency "become a performance bottleneck as I/O requests multiply and data scale..." claims the article.

"The mmap_obj() developers note that one piece of work still needs to be done: there needs to be a munmap_obj() function to release data from the user space, similar to the POSIX function."
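The in-memory access that POSIX is credited with above comes from the standard mmap() call, which maps a file directly into a process's address space. The article gives no signature for mmap_obj(), so the sketch below illustrates only the POSIX side of the comparison, using Python's standard mmap module (the file path is a temporary placeholder):

```python
import mmap
import os
import tempfile

# Write a small file, then map it into memory: the POSIX-style,
# byte-addressable access that object storage has historically lacked.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(b"hello object storage")

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 0)   # map the whole file into user space
    first = bytes(mm[:5])           # read bytes without an explicit read() call
    mm[:5] = b"HELLO"               # modify the mapped region in place
    mm.flush()                      # push the change back to the file
    mm.close()

with open(path, "rb") as f:
    result = f.read()
```

An object-storage equivalent would presumably map an object into user space the same way, with the still-missing munmap_obj() playing the role of mm.close() here.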
Japan

How One Piece of Hardware Took Down a $6 Trillion Stock Market (bloomberg.com) 26

An anonymous reader quotes a report from Bloomberg on how a data storage and distribution device brought down Tokyo's $6 trillion stock market: At 7:04 a.m. on an autumn Thursday in Tokyo, the stewards of the world's third-largest equity market realized they had a problem. A data device critical to the Tokyo Stock Exchange's trading system had malfunctioned, and the automatic backup had failed to kick in. It was less than an hour before the system, called Arrowhead, was due to start processing orders in the $6 trillion equity market. Exchange officials could see no solution. The full-day shutdown that ensued was the longest since the exchange switched to a fully electronic trading system in 1999. It drew criticism from market participants and authorities and shone a spotlight on a lesser-discussed vulnerability in the world's financial plumbing -- not software or security risks but the danger when one of hundreds of pieces of hardware that make up a trading system decides to give up the ghost.

The TSE's Arrowhead system launched to much fanfare in 2010, billed as a modern-day solution after a series of outages on an older system embarrassed the exchange in the 2000s. The "arrow" symbolizes speed of order processing, while the "head" suggests robustness and reliability, according to the exchange. The system of roughly 350 servers that process buy and sell orders had had a few hiccups but no major outages in its first decade. That all changed on Thursday, when a piece of hardware called the No. 1 shared disk device, one of two square-shaped data-storage boxes, detected a memory error. These devices store management data used across the servers, and distribute information such as commands and ID and password combinations for terminals that monitor trades. When the error happened, the system should have carried out what's called a failover -- an automatic switching to the No. 2 device. But for reasons the exchange's executives couldn't explain, that process also failed. That had a knock-on effect on servers called information distribution gateways that are meant to send market information to traders.
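The failover described above is conceptually simple: a health check fails the primary device and traffic moves to the backup. The sketch below is purely illustrative (it is not Arrowhead's actual logic, and the device names are invented); the TSE incident is the case where this switch itself failed to fire:

```python
# Illustrative failover selection -- not the TSE/Fujitsu implementation.
# A health check drives switching from a primary shared-disk device to
# its backup; returning None models the state the exchange ended up in.

def select_device(devices, is_healthy):
    """Return the first healthy device in priority order, or None if all failed."""
    for device in devices:
        if is_healthy(device):
            return device
    return None  # no healthy device: manual intervention required

# Example: the primary reports a memory error, so the check should
# fail over to the No. 2 device.
health = {"shared-disk-1": False, "shared-disk-2": True}
active = select_device(["shared-disk-1", "shared-disk-2"], health.get)
```

The hard part in practice is not this selection logic but detecting failure reliably and ensuring the switchover mechanism is itself tested, which is precisely where the exchange's automatic failover broke down.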

At 8 a.m., traders preparing at their desks for the market open an hour later should have been seeing indicative prices on their terminals as orders were processed. But many saw nothing, while others reported seeing data appearing and disappearing. They had no idea if the information was accurate. At 8:36 a.m., the bourse finally informed securities firms that trading would be halted. Three minutes later, it issued a press release on its public website -- although only in Japanese. A confusingly translated English release wouldn't follow for more than 90 minutes. It was the first time in almost fifteen years that the exchange had suffered a complete trading outage. The Tokyo bourse has a policy of not shutting even during natural disasters, so for many on trading floors in the capital, this experience was a first.
After trading was called off for the day, four TSE executives held a press conference, "discussing areas such as systems architecture in highly technical terms," reports Bloomberg. "They also squarely accepted responsibility for the incident, rather than trying to deflect blame onto the system vendor Fujitsu Ltd."

One of the biggest questions that remained unanswered is whether the same kind of hardware-driven failure could happen in other stock markets. "There's nothing uniquely Japanese about this," said Nicholas Smith of CLSA Ltd. in Tokyo. "I think we've just got to put that in the box of 'stuff happens.' These things happen. They shouldn't, but they do."
Data Storage

Microsoft Testing Windows 10 Feature That'll Detect If Your SSD Is Failing (reuters.com) 39

Microsoft is testing a new feature for Windows 10 that will alert you if your SSD is failing. Microsoft is also testing an update to Your Phone that will allow it to work with multiple devices. PCWorld reports: Both features arrived as part of Windows 10 Insider Build 20226 for the Dev Channel, Microsoft's laboratory for future features. The Dev Channel is truly experimental, meaning that these two new features may or may not become official features of the operating system. Fortunately, both are straightforward. An aftermarket SSD may ship with utility software that monitors an NVMe SSD's health, but Windows itself does not monitor the drive. In this test feature, Windows 10 will add NVMe SSDs to its monitoring processes and let you know if a drive is about to fail. If you then go into the Windows 10 Settings menu for Storage, you'll see that the SSD in question is listed as unreliable. In that case you're advised to back up everything. "Attempting to recover data after drive failure is both frustrating and expensive," Microsoft said in a blog post. "This feature is designed to detect hardware abnormalities for NVMe SSDs and notify users with enough time to act. It is strongly recommended that users immediately back up their data after receiving a notification."
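This kind of monitoring typically reads the drive's SMART/Health Information log, which the NVMe specification defines with fields such as Available Spare, Available Spare Threshold, and Percentage Used. The warning policy below is a hedged sketch of that idea, not Microsoft's actual heuristic:

```python
# Illustrative NVMe health check. Field names follow the NVMe
# SMART/Health Information log page; the thresholds and policy here
# are assumptions for illustration, not Windows 10's real logic.

def nvme_health_warning(available_spare, spare_threshold, percentage_used,
                        critical_warning):
    """Return True if the drive should be flagged as unreliable."""
    if critical_warning != 0:               # controller-reported critical condition
        return True
    if available_spare <= spare_threshold:  # spare blocks nearly exhausted
        return True
    if percentage_used >= 100:              # past vendor-rated endurance
        return True
    return False

# A drive whose spare capacity has dropped below its threshold is
# flagged even though it is still readable -- hence the backup advice.
warn = nvme_health_warning(available_spare=8, spare_threshold=10,
                           percentage_used=85, critical_warning=0)
```

On Linux the same fields can be inspected with tools like `smartctl -a /dev/nvme0`; the Windows feature surfaces a comparable judgment through the Settings Storage page.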
Microsoft

Microsoft Unveils Surface Laptops To Fulfill 'PC For Every Single Person' Vision (venturebeat.com) 65

Microsoft today unveiled the Surface Laptop Go with a 12.4-inch touchscreen for $549, its cheapest and lightest (2.45lbs) laptop yet. The company also updated the Surface Pro X with SQ2 -- Microsoft's second-generation custom ARM chip co-engineered with Qualcomm -- for $1,500. Both are available for preorder today and ship on October 13. From a report: Those are the highlights. But a single sentence in Microsoft's announcement stood out to us. "What started as a vision for a PC in every single home has now evolved to the need for a PC for every single person," Panos Panay, head of engineering for all of Microsoft's devices, said in a press briefing. For decades, Bill Gates' vision was "A computer on every desk, and in every home, running Microsoft software." That's why even in 2020, Windows 10 is running on 1 billion devices.

[...] Surface Laptop Go is powered by Intel's 10th-generation quad-core Core i5 processor, with up to 16GB RAM and 256GB storage, and up to 13 hours of battery life. Microsoft is also touting a full-size keyboard with 1.3mm key travel and a fingerprint power button for one-touch sign-in. Then there's a 720p HD camera, Studio Mics, Omnisonic Speakers, Dolby Audio, USB-A, USB-C, an audio jack, and the Surface connector.

Slashdot Top Deals