IBM Brings Cloud-Based Spectrum Storage to Microsoft Azure


IBM is extending the reach of its cloud-based Spectrum storage product to Microsoft Azure in an effort to reduce the complexity enterprises can encounter when they adopt a hybrid cloud strategy.

The vendor already supports its own IBM Cloud as well as Amazon Web Services (AWS) with Spectrum Virtualize for Public Cloud, a cloud-based alternative to the software found in IBM’s FlashSystem and SAN Volume Controller. 

Now the software-defined storage (SDS) offering will support Azure, the world’s second-largest public cloud provider after AWS. Spectrum Virtualize for Public Cloud offers the same storage features and functionalities in the cloud that are found in on-premises data centers.

Commonality is Key

Having common storage capabilities in the cloud and the data center makes it easier for enterprises to implement hybrid cloud storage capabilities like disaster recovery, cloud DevOps and data migration, Chris Saul, program director of product marketing at IBM, wrote in a blog post.

“Since it provides this same function across clouds, it also makes it easy to use multiple clouds or to move from cloud to cloud,” Saul wrote.

The latest offering is one of a number of enhancements IBM is making across its Spectrum storage portfolio to address the challenges that come when moving applications and data into the cloud. Saul noted a recent report from the IBM Institute for Business Value, which found that 97 percent of businesses are piloting or implementing cloud initiatives or integrating cloud into their operations.

“However, as hybrid cloud environments become the norm, businesses must contend with additional IT complexity, public cloud costs, and threats from cyberattack and other data destructive events,” he wrote, adding that the new capabilities and integrations from IBM are “designed to help organizations reduce IT complexity, deploy cost-effective solutions and improve data and cyber resilience for hybrid cloud environments.”


Reducing Hybrid Cloud Complexity

The idea is that by ensuring consistent functionality, APIs, management and user interfaces in public cloud and on-premises environments, IBM can reduce hybrid cloud issues. On Azure, Spectrum Virtualize for Public Cloud supports IBM Safeguarded Copy, which automatically creates isolated, immutable snapshot copies that can’t be accessed by software, including malware, and that can be used to quickly recover data on premises or in the cloud if something goes awry.
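To make the idea concrete, here is a minimal, hypothetical sketch of what a retention-locked, point-in-time copy looks like in principle: a copy that cannot be altered or deleted until its retention period expires, and that can only be read out to a recovery target. This is not IBM’s Safeguarded Copy API; every name below is invented purely for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration of the general "safeguarded copy" idea:
# point-in-time copies that carry a retention lock and refuse any
# modification or deletion until that lock expires. This is NOT IBM's
# API; names and behavior here are invented for explanation only.

@dataclass(frozen=True)
class SafeguardedSnapshot:
    volume_id: str
    taken_at: datetime
    retain_until: datetime
    data_ref: str  # opaque pointer to the immutable point-in-time image

    def is_locked(self, now: datetime) -> bool:
        # While locked, the copy cannot be altered or deleted, even by an
        # administrator account (or malware running under one).
        return now < self.retain_until


def take_snapshot(volume_id: str, retention: timedelta) -> SafeguardedSnapshot:
    now = datetime.utcnow()
    return SafeguardedSnapshot(
        volume_id=volume_id,
        taken_at=now,
        retain_until=now + retention,
        data_ref=f"snap://{volume_id}/{now.isoformat()}",
    )


def restore(snapshot: SafeguardedSnapshot, target_volume: str) -> str:
    # Recovery copies data out of the immutable image to a writable volume,
    # on premises or in the cloud; the snapshot itself is never touched.
    return f"restored {snapshot.data_ref} -> {target_volume}"


if __name__ == "__main__":
    snap = take_snapshot("vol-finance-01", retention=timedelta(days=30))
    print(snap.is_locked(datetime.utcnow()))          # True until the lock expires
    print(restore(snap, "vol-finance-01-recovered"))  # read-only recovery
```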

IBM’s move to bring its Spectrum storage capabilities to the public cloud is important to organizations that want to leverage the cloud, according to Charles King, principal analyst with Pund-IT.

“IBM’s customers are among the largest users of hybrid multi-cloud services and infrastructures,” King told Enterprise Storage Forum. “IBM recognized early on that it made little sense to emulate the [low cost-per-volume of] public cloud players. Instead, it focused largely on developing solutions and services aimed at its large enterprise clients while recognizing that those same companies were also engaging AWS, Azure, and other public cloud companies. By supporting heterogeneous cloud platforms and environments, it has increased its own value to IBM customers.”

Consistent Capabilities

Jane Clabby, an analyst with Clabby Analytics, wrote in a report that “IBM’s Spectrum Virtualize supports over 500 IBM and non-IBM Storage systems, providing a consistent management interface and storage software capabilities across a broad and varied range of storage,” adding that its data reduction capabilities can translate into as much as 65 percent savings in cloud storage use.

“By virtualizing leading public clouds, AWS and Microsoft Azure, customers have consistent data operation, data services, data protection and security across the entire IT estate — legacy, private cloud, hybrid cloud, IBM Cloud, AWS, and Microsoft Azure,” Clabby wrote.

IBM also made enhancements to a range of other Spectrum products to ramp up the cybersecurity capabilities in the platform and protect enterprises from such threats as ransomware.

In Spectrum Protect, the offering now supports replicating backup data to additional data protection servers and using object storage for long-term data retention, which reduces the cost of backup. In addition, Spectrum Protect Plus includes improvements aimed at Red Hat’s OpenShift Kubernetes platform to protect data in containers, including Red Hat certification, support for OpenShift workloads deployed on Azure and direct backup to S3 object storage.
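For readers unfamiliar with the pattern, a direct backup to S3 object storage boils down to packaging application data and writing it to a bucket over the S3 API. The sketch below shows that generic pattern with boto3; the directory, bucket and key names are placeholders, it assumes valid AWS credentials in the environment, and it is not Spectrum Protect Plus’ own implementation.

```python
import tarfile
import boto3  # pip install boto3

# Generic sketch of backing an application's data directory up directly to
# S3-compatible object storage. Paths, bucket, and key are placeholders;
# credentials are assumed to be configured in the environment.

def backup_to_s3(source_dir: str, bucket: str, key: str) -> None:
    archive = "/tmp/backup.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=".")     # pack the data directory
    s3 = boto3.client("s3")
    s3.upload_file(archive, bucket, key)     # push straight to object storage

# Example usage (hypothetical paths and bucket):
backup_to_s3("/var/lib/myapp", "backup-bucket", "myapp/2021-10-27.tar.gz")
```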

Spectrum Scale data fabric now has a high-performance S3 object interface, enabling cloud-native S3 applications to deliver faster results without the delays typical of object storage. A new Nvidia GPUDirect Storage interface means that Nvidia applications can run up to 100 percent faster with Spectrum Scale. IBM’s Elastic Storage System 3200 now includes a 38 terabyte FlashCore Module, which is twice as large as its predecessors and doubles the capacity of the ESS 3200 to 912TB in two rack units.
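The practical appeal of an S3 object interface is that existing cloud-native applications can target it with no code changes beyond endpoint configuration. A hedged example, assuming a generic S3-compatible endpoint (the URL, credentials, and bucket below are placeholders, not real Spectrum Scale values):

```python
import boto3

# A cloud-native S3 application simply points its client at the storage
# system's S3-compatible endpoint. Endpoint, credentials, and bucket are
# placeholders for illustration only.
s3 = boto3.client(
    "s3",
    endpoint_url="https://scale-s3.example.internal",  # hypothetical on-prem endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.put_object(Bucket="analytics", Key="results/run-42.parquet", Body=b"...")
obj = s3.get_object(Bucket="analytics", Key="results/run-42.parquet")
print(obj["Body"].read())
```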


The Promise of Turbonomic

IBM also is building on its acquisition this summer of Turbonomic, an application and network performance management software maker that Big Blue bought as part of a larger effort to expand the use of artificial intelligence (AI)-based automation in hybrid clouds. Turbonomic will collect data from FlashSystem storage, such as storage capacity, IOPS and latency for each array, and its analysis engine will combine that storage data with information from the virtualization and application layers to ensure software gets the storage performance it needs.

The goal is to free organizations from having to over-provision their storage and to improve capacity utilization by 30 percent without harming performance.
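As a rough illustration of the kind of analysis described above, the toy sketch below flags arrays that look over-provisioned: lightly used and comfortably within a latency target. The thresholds and sample data are invented for illustration; Turbonomic’s actual engine is considerably more sophisticated.

```python
from dataclasses import dataclass
from typing import List

# Toy right-sizing pass over per-array capacity, IOPS, and latency figures.
# Thresholds and sample data are invented; this is not Turbonomic's logic.

@dataclass
class ArrayMetrics:
    name: str
    capacity_used_pct: float   # percent of provisioned capacity in use
    iops: float                # recent average IOPS
    latency_ms: float          # recent average latency

def right_sizing_candidates(arrays: List[ArrayMetrics],
                            max_used_pct: float = 40.0,
                            max_latency_ms: float = 1.0) -> List[str]:
    # An array that is lightly used *and* well within its latency target is a
    # candidate for reclaiming capacity without hurting performance.
    return [a.name for a in arrays
            if a.capacity_used_pct < max_used_pct and a.latency_ms < max_latency_ms]

sample = [
    ArrayMetrics("flash-prod-01", capacity_used_pct=35.0, iops=12000, latency_ms=0.4),
    ArrayMetrics("flash-prod-02", capacity_used_pct=82.0, iops=45000, latency_ms=0.9),
]
print(right_sizing_candidates(sample))   # ['flash-prod-01']
```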

“Turbonomic solutions are designed to enhance application performance by utilizing tools, including AI, to effectively manage compute, memory, storage and database resources on-prem and in the cloud,” Pund-IT’s King said. “As such, it is a great fit for a vendor as focused on maximizing system and service performance as IBM.”

Storage will continue to be a key part of the hybrid cloud story, he said.

“Data storage is in an odd place — not particularly popular as a subject yet continuing to drive remarkable performance and capacity innovations that are vital to key workloads, including AI, machine learning, and advanced analytics on premises and in the cloud,” King said. “Storage features, like data portability, play crucial roles in services supporting business critical workloads. The continuing focus on AI and related processes has underscored the value and importance of data management processes. GIGO [garbage-in, garbage-out] in the cloud has the same deleterious impact as GIGO on premises.”

Clabby noted that IBM continues to support emerging technologies like containers and hybrid clouds by introducing new solutions for technologies like OpenShift and driving support for AWS, Azure, and other public cloud providers.

“This latest set of storage announcements reinforces that strategy by improving common management, consistent data access and data sharing, as well as data resiliency across multi-vendor public, private and on-premise environments — eliminating data silos and gaps in data security, and enabling IT specialists to focus on higher-level issues,” Clabby wrote.


Jeff Burt
Jeffrey Burt has been a journalist for more than three decades, the last 20-plus years covering technology. During more than 16 years with eWEEK, he covered everything from data center infrastructure and collaboration technology to AI, cloud, quantum computing and cybersecurity. A freelance journalist since 2017, his articles have appeared on such sites as eWEEK, The Next Platform, ITPro Today, Channel Futures, Channelnomics, SecurityNow, Data Breach Today, InternetNews and eSecurity Planet.
