
AT&T Looks to Cut Network Storage Costs 

Big network operators like telecommunications and cable carriers are looking for ways to manage storage costs as data volumes grow and applications like streaming video begin to dominate datacenter workloads and network bandwidth.

With monthly data storage costs running as high as $15,000 per terabyte, AT&T Labs Research said it expects storage requirements to double over the next two years. Hence, researchers there are looking to software-defined storage as a way to rein in costs as data volumes grow exponentially.

Separately, the lab is investigating network virtualization and new routing approaches as it seeks to improve efficiency as measured by metrics such as network utilization.

Chris Rice, vice president of advanced technologies and architecture at AT&T Labs, noted in a blog post this week that the carrier has embraced software-defined storage as part of its "network transformation." AT&T's "proof-of-concept" technology creates a software layer on top of commercial disk drives to provide customized cloud storage for multitenant enterprise customers.

One trade-off for AT&T involves delivering storage reliability and security while still meeting service-level agreements. Another is offering the ability to shift storage resources in real time without degrading network performance.

The result, Rice said, was an automated means of customizing storage. The software-defined storage approach is touted as allowing enterprise customers to "tweak the dials" on cost, performance and reliability. A visualization tool allows customers to see the impact of a change in one parameter on another.
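As a rough illustration of how such dials interact, consider a toy model in Python. AT&T has not published its cost or reliability formulas, so the replication-based model, parameter names and failure figures below are assumptions for illustration only; the $15,000-per-terabyte figure is the one cited above.

def storage_profile(replicas, cost_per_tb, drive_afr):
    """Toy model: how one dial (replica count) moves cost and reliability.

    replicas    -- number of full copies kept (the "reliability" dial)
    cost_per_tb -- monthly raw storage cost per terabyte
    drive_afr   -- annualized failure rate of one drive (0.02 = 2 percent)
    """
    # Cost scales linearly with the number of copies kept.
    monthly_cost = replicas * cost_per_tb
    # Data is lost only if every copy fails (independent-failure assumption).
    annual_loss_probability = drive_afr ** replicas
    return monthly_cost, annual_loss_probability

# Turning the reliability dial up pulls the cost dial up with it.
for r in (1, 2, 3):
    print(r, storage_profile(replicas=r, cost_per_tb=15_000, drive_afr=0.02))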

The approach targets multitenant cloud and storage configuration applications. "In a multitenant cloud, it's very important that you guarantee your quality of service," Robin Chen, a software researcher at AT&T Labs, noted in a company video. The lab's approach involves mapping storage security requirements into the configurations required by individual customers. "You make sure that [cost, performance and reliability parameters] don't interfere with each other," Chen said.

By separating software from hardware, the software-defined storage approach aims to let each tenant securely share common cloud storage, Chen added.
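A minimal sketch of that mapping idea in Python follows. The tenant names, requirement fields and configuration values here are hypothetical, since AT&T has not disclosed its configuration schema.

# Illustrative only: tenant tiers, fields and values are hypothetical,
# not AT&T's actual requirement-to-configuration mapping.
TENANT_REQUIREMENTS = {
    "tenant-a": {"priority": "performance", "encryption": True},
    "tenant-b": {"priority": "cost", "encryption": False},
}

def to_storage_config(requirements):
    """Map a tenant's stated requirements onto an isolated storage config."""
    if requirements["priority"] == "performance":
        config = {"media": "ssd", "iops_limit": 20_000}
    else:
        config = {"media": "hdd", "iops_limit": 2_000}
    # Per-tenant encryption keeps one tenant's security posture from
    # interfering with another's, even on shared hardware.
    config["encrypt_at_rest"] = requirements["encryption"]
    return config

for tenant, reqs in TENANT_REQUIREMENTS.items():
    print(tenant, to_storage_config(reqs))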

AT&T Labs has also jumped on the erasure coding bandwagon as a way of cutting raw storage requirements. Rice touted AT&T's algorithm as trimming raw storage needs while protecting data integrity better than triple-redundant replication.
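The storage savings follow from simple arithmetic. The article does not disclose AT&T's algorithm or parameters, but a generic Reed-Solomon-style layout with k data fragments and m parity fragments shows the shape of the trade-off:

def raw_overhead(k, m):
    """Raw bytes stored per usable byte for a (k data, m parity) erasure code."""
    return (k + m) / k

# Triple replication stores 3 raw bytes per usable byte and survives
# the loss of any 2 of 3 copies.
print("3x replication overhead:", 3.0)

# A hypothetical (10, 4) code survives the loss of any 4 of 14 fragments,
# yet stores only 1.4 raw bytes per usable byte.
print("(10, 4) erasure coding overhead:", raw_overhead(10, 4))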

AT&T is currently beta testing its software-defined storage prototype internally as it seeks to scale the system and "harden" the erasure coding technology before deploying it in its datacenters.

Along with software-defined networking, the telecom giant is also pressing ahead with network function virtualization efforts. Rice said AT&T's goal is to virtualize 75 percent of its network by 2020.

The company launched an SDN design challenge in May with the immediate goal of improving network routing capabilities. As part of AT&T's network upgrades, "we are evaluating how to virtualize routing, using software on standard compute equipment," Rice said.

One promising approach called "flow-based" routing has been used in datacenters. Carrier network operators are investigating whether flow-based routing approaches could improve both routing and network utilization, Rice added.
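Flow-based routing reduces forwarding decisions to ordered match/action rules, in the spirit of OpenFlow-style datacenter switches. The sketch below is illustrative only; the match fields and actions are assumptions, not a carrier-grade design.

import ipaddress

# Ordered match/action rules; the first matching entry wins.
FLOW_TABLE = [
    ({"dst_prefix": "10.1.0.0/16", "dscp": 46}, "forward:port2"),  # latency-sensitive traffic
    ({"dst_prefix": "10.1.0.0/16"}, "forward:port1"),              # other traffic to 10.1/16
    ({}, "drop"),                                                  # default rule
]

def route(packet):
    """Return the action of the first flow entry matching this packet."""
    for match, action in FLOW_TABLE:
        prefix = match.get("dst_prefix")
        if prefix and ipaddress.ip_address(packet["dst"]) not in ipaddress.ip_network(prefix):
            continue
        if "dscp" in match and packet.get("dscp") != match["dscp"]:
            continue
        return action

print(route({"dst": "10.1.2.3", "dscp": 46}))  # forward:port2
print(route({"dst": "10.1.2.3"}))              # forward:port1
print(route({"dst": "192.0.2.1"}))             # drop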

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
