Interview

Mark Ward

Cobalt Iron & Data Protection Modernized / Mark Ward, COO of Cobalt Iron

We all have to protect data. Data is getting bigger, operations are getting more distributed, complexity is rampant, and costs are high, especially if we don’t protect our data and then have to try to recover and keep some business continuity going. There are lots of problems, and yesterday’s solutions just don’t close the gaps anymore. Mark Ward, the COO of Cobalt Iron, talks about their solution for data protection with a unique angle, and it seems to tick all the boxes for large enterprises.


Who is Cobalt Iron?

So, before we get into adaptive data protection, which is what Cobalt Iron wraps its solutions around, where did Cobalt Iron come from? What’s the background? – Yeah, the background of Cobalt Iron is that a number of 20-plus-year storage and data protection experts got together about six years ago to remove the complexity and cost, and improve the consumability, of data protection in this new landscape, where cloud, as-a-service delivery, and a massive reorganization of cost structures from capex to opex exist. So the lineage is a bunch of EMC, IBM, Hitachi Data Systems, and Commvault folks who have been working to solve those enterprise problems for many years.

What is it tangibly that Cobalt Iron offers and brings to market?

Fundamentally it starts with what we have seen customers requiring of the enterprise IT world. And that is: get us out of having to manage the day-to-day complexity of provisioning assets, managing, maintaining, and driving, let’s call it, service levels to the business. For us, we are 100% focused on the data protection elements of that. So what we have done is use some of the technology innovation of the past 10 or 15 years to really change the way data protection is delivered.

The Role of Virtualization

I think our colleagues at VMware did a great job solving server virtualization. What if we could take the same constructs of virtualization to the data protection landscape, allowing assets, whether they are physical, hybrid, or cloud assets, to be deployed in managing data protection? The other key element is salesforce.com’s innovation around as-a-service, or software-as-a-service, delivery, so you don’t require your own IT teams to deliver it, and you use smart technology, in our case analytics-driven automation, to simplify that delivery and the service-level management of data protection. That eliminates cost, and we do it by leveraging commodity off-the-shelf hardware, as well as cloud assets, to bring a much lower cost and a higher level of service to data protection. – In short, you are going to come in to someone who has this complex, heterogeneous, probably incomplete data protection environment, where they have just been patching together their legacy stuff over the years, and you can say to them, “We are going to come in and put in something cohesive.”

Manage Data Protection For You

“We are going to remotely manage it for you. We have got automation, we have got intelligence. We are going to apply all our best practices. And we have got this commodity-hardware, virtualized approach that really reduces your costs below where you would have thought they could go.” – So, number one, we are not a rip-and-replace vendor. There has been a lot of “innovation” in the industry that amounts to, “Everything you’ve done before I showed up is bad.” We don’t believe in that, particularly when it comes to the data protection landscape. So we’ll leverage the existing application, we’ll leverage some of the existing assets. We will modernize assets that require it, those that are into their maintenance years. We will upgrade storage, server, and network assets that require upgrade. But we’ll leverage existing infrastructure as we journey them to this as-a-service, cloud, modern approach. And that’s where, to your point, we leverage the intelligence of our software-as-a-service delivery to manage, maintain, and drive down the costs of existing and new assets in the IT infrastructure, particularly servers, storage, and network devices. – So it’s not a throw-everything-away lift and shift. You’re really going to come in and say, “Look, we can deliver you data protection as a service, and we’ll use what you have. But we are going to bring in lots of other things around it and put it all together into something we know is the best-practice, expert way to deliver this.” When folks do this, do they have to go through a big internal justification process? It sounds like it’s more of, “I can save money and I can get done what I’m trying to do anyway.”

When should one review their Data Protection Investments?

Data protection tends to be very much a cyclical buy, both in the infrastructure assets and in how folks address it. So we tend to grab customers when they’re doing one of two things. First, upgrading the existing environment for new workloads, such as new virtual workloads or new cloud workloads. Or we catch them on a cycle where their existing infrastructure, quite frankly, is moving into its maintenance period and is very expensive to maintain, so they’re looking for a refresh. They’re looking to take advantage of the lower cost at the infrastructure compute level, and they’re looking at lower-cost service delivery capabilities. I think the neat thing we had to figure out is what really changes the industry approach to backup. There have been 800-pound gorillas in the market that throw a lot of smart people at the problem and say, “Instead of you managing the environment, let my people manage it for you.” Over the past seven years we have realized that that’s really a flawed approach. What we have done is take very smart people-based processes, turn them into software, and build that software in the cloud across 22 different data centers around the world. It’s that cloud that automates, through analytics, the delivery of data protection services. We are currently deployed in 44 different countries, managing tens of petabytes of data protection per night.

The Role of Automation

And we do that with a support staff of just a couple of folks who handle questions about using the product. Our people don’t actually touch clients’ data, and they don’t manage server connections or network connections. Our software is the intelligence that does that, and it’s all in the cloud. – So we touched on cost. Obviously you can help someone manage the complexities of things: multi-product environments, distributed environments. But it sounds like when you start talking about your scale and reach, you’re really saying, “If you use Cobalt Iron for your data protection services, you can achieve a kind of data center modernization, or at least accelerate it.” And particularly folks who are looking and saying, “I’ve got applications now that are hybrid. I’ve got applications migrating to the cloud. I’ve got things in flight. I’ve got rampant virtualization, but I’ve still got some physical servers over there in the corner, so I’ve got this heterogeneity.”

Do you find folks really resonate with the modernization aspect of what you’re doing as well? – Yes, I think 90-plus percent of the clients and prospects I meet with on a day-to-day basis are looking to get out of that day-to-day grind of managing the delivery of data protection. They’re looking for a smarter system, and they’re obviously looking for lower-cost systems.

Data Protection & ServiceNow

They’re looking to leverage technologies they already have. For example, ServiceNow is used as an incident and event management system, so why doesn’t data protection fall underneath the ServiceNow umbrella? Well, the answer in Cobalt Iron’s world is that it does. Another example is simplification of delivery. A lot of our customers use VMware’s vRealize as a provisioning and decommissioning technology for virtualization. Well, at the time you decide to provision a new virtual machine, why wouldn’t you apply data protection to that new machine at the exact same time? So we have found the integration into vRealize to be a very valuable and scalable deployment architecture for our cloud-based and on-prem physical-system customers. So, that combination keeps the modernization decisions you’re taking as an IT organization consistently aligned with what you’re trying to do across all your stovepipes, whether it is data protection, server provisioning, or IT incident management. At Cobalt Iron, with a RESTful API stack and the Fortune 1000 customers we have deployed over the past seven years, we have really figured out how to bring those technologies together, so one plus one in our case actually equals three. – I know, I’ve seen some of the cost arguments: that you can recover unused licenses and reapply and realign them, and similar things, where you can actually justify a lower cost for what you do than what’s probably already being paid for the incomplete solution today. And then there’s this modernization angle to it as well, which is all very interesting. I don’t know! Coming from the data protection world, it seems like those things, we have known about them for 30, 40 years in IT; they should all be built in.
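To make the integration pattern described above a little more concrete, here is a minimal, hypothetical sketch of what a provisioning hook might look like: when a new VM is created, it is enrolled in a backup policy through a REST call, and if enrollment fails, an incident is opened through ServiceNow’s standard Table API. The data-protection endpoint (`DP_API`), its payload, the policy name, and the credentials are illustrative assumptions, not Cobalt Iron’s actual API.

```python
# Hypothetical sketch: enroll a newly provisioned VM in a backup policy and,
# on failure, raise a ServiceNow incident so backup issues land under the
# same incident-management umbrella. DP_API and its payload are assumptions
# for illustration only; the ServiceNow call uses the standard Table API.
import requests

DP_API = "https://dp.example.com/api/v1"                          # assumed data-protection service
SNOW_API = "https://example.service-now.com/api/now/table/incident"

def protect_new_vm(vm_name: str, policy: str = "gold-daily") -> None:
    """Intended to be called from a provisioning workflow (e.g. vRealize) right after a VM is created."""
    resp = requests.post(
        f"{DP_API}/protected-assets",
        json={"asset": vm_name, "type": "vm", "policy": policy},
        timeout=30,
    )
    if resp.ok:
        print(f"{vm_name} enrolled in backup policy '{policy}'")
        return
    # Surface the failure as a ServiceNow incident instead of a silent gap.
    requests.post(
        SNOW_API,
        auth=("snow_user", "snow_password"),                      # placeholder credentials
        json={
            "short_description": f"Backup enrollment failed for {vm_name}",
            "description": resp.text,
            "urgency": "2",
        },
        timeout=30,
    )

if __name__ == "__main__":
    protect_new_vm("app-server-42")
```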
