Friday, October 1, 2010

Cloud on the Top of the Hype Curve

I know there's a lot of talk out there about the Cloud and how hyped it is.  It's true.  If you look at Gartner's latest Hype Cycle for Emerging Technologies, you can see where they place it: literally at the top of the curve.

But that's OK - a lot of hype means a lot of awareness. As a vendor, this puts wind in our sails; as a customer, it makes it easy to find the content that will help you make informed decisions.  In fact, last I checked, the term 'Cloud Computing' was yielding over 34 million hits on Google.  Sounds like plenty of information for folks who want to learn more.

Another popular topic is how long it will take for Cloud to become a significant movement.  Well, if you believe the leading, trusted analysts in the space like Frank Gens from IDC (I know I do), he already puts the Cloud market -- and let's be specific, this is the market for external Cloud services, not 'private cloud' spending -- at $15.6 billion as of 2009.  That's a big Total Addressable Market, far bigger than many segments of IT as we know it today.  And the growth rate is expected to be 27% annually between now and 2014, with some areas such as Cloud Storage racing ahead at a 37% compound annual growth rate.

Let's look at just Cloud Storage.  According to those same IDC numbers, it's already 9% of the market, and at the current growth rate that would make it about $1.8 billion this year.  That's almost as large as the entire SAN storage networking market, and growing far faster.  Projecting forward, Cloud Storage spending will exceed $7 billion by 2014, and by that time external Cloud spending will account for 10% of the IT budget.
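For anyone who wants to check that math, here's a quick back-of-the-envelope in Python. The IDC figures are as quoted above; depending on which year you compound from, the storage number lands in the $6-7 billion range by 2014, the same ballpark as the projection.

```python
# Back-of-the-envelope projection of the IDC figures quoted above.
# Assumptions: $15.6B external cloud market in 2009, 27% overall CAGR,
# cloud storage at ~9% of the market growing at a 37% CAGR.

def project(base, cagr, years):
    """Compound a base value forward by `years` at annual rate `cagr`."""
    return base * (1 + cagr) ** years

total_2009 = 15.6          # $B, external cloud services, 2009
storage_share = 0.09       # cloud storage as a share of the market

total_2010 = project(total_2009, 0.27, 1)        # ~$19.8B
storage_2010 = total_2010 * storage_share        # ~$1.8B
storage_2014 = project(storage_2010, 0.37, 4)    # ~$6-7B

print(f"External cloud, 2010: ${total_2010:.1f}B")
print(f"Cloud storage, 2010:  ${storage_2010:.1f}B")
print(f"Cloud storage, 2014:  ${storage_2014:.1f}B")
```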

So some are saying Cloud is over-hyped.  Some, that it's under-hyped.  I'd say it's about right for where it is in its maturity.  The perception for some is that Cloud is just for deep file archiving, Test/Dev and limited web-based development.  In reality, early adopters are already running 60-70% of their business apps in the cloud.  We're seeing the Government laying the groundwork for serious investments in Cloud-based services through programs such as FedRAMP.  And enterprises are swarming shows like VMworld to understand the art of the possible, and continuing to build upon their internal 'cloud' efforts so they can extend them towards carrier and service provider clouds.

So long as Cloud vendors continue to deliver capex and opex savings, improved efficiencies, decent quality and security, and fast time-to-solution, then the growth will continue, with or without the hype.

Tuesday, September 21, 2010

Newest Cloud Storage Enabler

I was fielding an inquiry from one of my engineers and thought I'd share it, as it's probably a common question, especially as we continue to see Cloud awareness spread while the network barriers to external Cloud service adoption remain. Enjoy.

-----Original Message-----

Sent: Monday, September 20, 2010 5:30 PM
Subject: Cirtas

Hi Mike,
I'm interested in Cirtas, because a user asked me. What do you think of this kind of storage?

-----Original Message-----

From: Mike Harding
Sent: Tuesday, September 21, 2010 9:36 AM
Subject: RE: Cirtas


This is a very valuable solution. Much of 'Cloud storage' is not useful for companies for at least a few key reasons:

1) Security -- It's the #1 inhibitor to using external cloud services. Customers are worried that their data will be compromised either in transit outside of the corporate firewall, or after it's been stored, especially within a multi-tenant hosting environment.

2) Performance -- The second biggest problem with hosting your data in the cloud is that it's far away from the applications and users. The distance between you and your data creates latency as well as other common WAN issues such as jitter, packet loss, etc. This is why we've seen much of cloud storage used for deep archiving or for uses where you don't care how long it takes to store or retrieve your data, such as email archiving for regulatory reasons.

3) The need to change your application -- Many providers, even those using brand-name enterprise-class storage hardware such as EMC Atmos, only allow access via a RESTful API. This means the customer needs to write an application that uses this API in order to store and access the data, which for all intents and purposes limits the use of that cloud storage to web application media and data.

Cirtas, which just launched publicly, is a great example of what I call a Cloud Storage Enabler in that they allow customers to overcome these barriers to adopting the external cloud. Their product, Bluejet, encrypts your data so it's secure both in transit and at rest. It accelerates the transit with data compression and deduplication. And it emulates local storage, so it looks like any NFS/CIFS target to your applications and users. Cirtas is one of a number of companies with similar Cloud Enabler solutions - you should also consider Nasuni, TwinStrata, Panzura and StorSimple.

Thanks, and good luck,

Mike Harding
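P.S. For readers wondering what "access via a RESTful API" actually implies for point 3 above, here's a minimal sketch of the kind of code an application has to carry just to store and fetch objects over HTTP. The endpoint, bucket and auth header are hypothetical placeholders, and each provider's real API (Atmos, S3, etc.) differs in its details; the point is that this logic has to live in your application, whereas a gateway like Bluejet just looks like another NFS/CIFS share.

```python
import urllib.request

# Hypothetical endpoint and credentials, just to show the shape of the calls;
# real services each have their own URL scheme and authentication headers.
ENDPOINT = "https://storage.example-cloud.com"
BUCKET = "backups"

def put_object(key: str, data: bytes) -> int:
    """Store `data` under `key` via a simple HTTP PUT -- the 'RESTful' model."""
    req = urllib.request.Request(
        url=f"{ENDPOINT}/{BUCKET}/{key}",
        data=data,
        method="PUT",
        headers={"Content-Type": "application/octet-stream",
                 "Authorization": "Bearer <token>"},  # placeholder auth
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def get_object(key: str) -> bytes:
    """Retrieve an object with an HTTP GET."""
    with urllib.request.urlopen(f"{ENDPOINT}/{BUCKET}/{key}") as resp:
        return resp.read()
```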

Thursday, July 8, 2010

Hybrid Cloud: The Preferred Approach

I've been a big believer for a while now that 'Hybrid cloud' is where the industry needs to be going.  It's the big idea that truly allows companies and their IT organizations to seamlessly knit together their internal resources with those of their technology vendors.  And I'm a little biased these days, working for a networking company, where it has been painfully obvious for some time that the Network is the key to unlocking a lot of latent demand in external cloud computing... with concerns over data Security, Availability and Performance as the biggest barriers to adopting externally-based Cloud services.

But aren't these barriers insurmountable, leaving companies able to do only 'Private Clouds'?  Well, what is a Private cloud?  Some would say it doesn't exist - it's just a new label for what a lot of companies have been trying to accomplish within their data centers for some time now, with key capabilities being Server Virtualization, Automation, Self-service provisioning, and Chargeback for usage-based cost allocation.  Gartner called this Real Time Infrastructure.  Others termed it Data Center Automation.  Still others, Utility Computing.

The reality is that Hybrid cloud computing is already the preferred approach for organizations, as seen in this chart from The Info Pro.  Almost 60% of companies expect to be both using external cloud services and developing internal cloud capabilities.


Hybrid computing is getting top vendor support from the likes of Intel, Microsoft and others.  So we can expect to see the ability to connect across data centers -- safely, efficiently and in a manageable way -- becoming an embedded capability within the component resources that we purchase in the future.

The move to Cloud computing is happening today and the only question is at what rate. A recent Brocade study showed that 60 percent of enterprises expect to have started the planning and migration to a cloud computing model within the next two years, with key business drivers being to reduce cost (30 percent), improve business efficiency (21 percent) and enhance business agility (16 percent).  Other interesting Cloud findings included:
  • More than a quarter of large organizations are planning to migrate to a cloud model within the next two years; 11 percent within one year
  • A quarter of organizations stated that the ability to consolidate the number of data centers was also a critical driver
  • The availability of bandwidth was also a deciding factor amongst 14 percent of respondents
Conclusion: Embrace the concept of the Hybrid Cloud, and team with technology vendors that are delivering a roadmap allowing you to execute on that vision as soon as possible.

Tuesday, June 22, 2010

Cloud Network Optimization validated

With the explosive growth of the Cloud Services market, exceeding $68 billion this year, it was obvious to expect lots of innovation in and around this space.  I had been mostly blogging about efforts in the Cloud Infrastructure and Storage space, and was seeing a microcosm of this opportunity with new classes of optimized gear to speed and secure the links from the customer to the cloud.

We recently have witnessed the emergence of Cloud gateways that appear as local NAS but act as intelligent controllers that cache, optimize, encrypt and convert data from the LAN out to the Storage cloud. Along with TwinStrata, Cirtas, StorSimple, and Nasuni, we can now add Panzura to this vendor list.

But I still expected to see a pure-play Cloud Network product in this area: a device that performs the caching, dedupe, and encryption of a WAN optimization appliance but is specifically aimed at datacenter-to-cloud traffic, where the protocols and acceleration are tweaked for storage data and larger pipes. It would be a network device, not a storage device, thus complementing new products such as EMC VPLEX to speed storage virtualization between data centers and enabling use cases such as VMotion over distance.  I had assumed an established network player would be first to meet this need, but a new player, Infineta, grabbed the brass ring.

Infineta has been very distinct in their positioning, focusing on datacenter-to-datacenter rather than branch-office traffic, which is the established realm of Riverbed and traditional WAN optimization.  And they released a cool new video on YouTube.

It's ultimately up to the analysts as to how the markets get defined, but with this latest product entry, I'm considering Cloud Storage Networking to be a validated market.  For IT organizations, now's the time to start thinking how you can use these new products to safely and cost-effectively transition archival and nearstore data out to the Cloud.

Tuesday, May 4, 2010

Cloud Storage Optimization market

A new product category is shaping up in direct response to a new customer need.  As seen in recent stories like "What's keeping Data Storage Out of the Cloud?", companies want to use new Cloud Storage services from providers such as IBM, but they are concerned about the security, availability and cost of the required network connection.  Enter Cloud Storage Optimization.

To bridge this network gap customers face a number of sub-optimal alternatives:
  1. Don't worry about it -- probably the most popular approach, which only works if you don't care about your job, or if the value of the data going across the wire is so low that it's not a big deal if the transit takes forever, or the data gets hacked, or both.
  2. Lease a private connection -- this is an option for 'too big to fail banks' or other major organizations where cost isn't an issue.  But for most companies, the incremental cost of the circuit eliminates the economic savings of the Cloud service.
  3. Use a generic WAN optimization box -- Not a great solution, as these are software-based appliances designed for lower-bandwidth branch-office connections and a broad mix of transactional data.  The Cloud Storage connection is really a SAN-like 'channel' which will be very data intensive and will benefit from hardware-based compression and offload processing.  And similar to the private circuit, a WAN optimization appliance that supports the higher throughput needed for Cloud storage will cost you more than your annual Cloud storage bill.
What we're seeing in response is an initial first step towards closing this market gap: a new category of Cloud Storage Optimization solutions, or Cloud Storage Gateways.  Representative companies include Cirtas, Twinstrata, Nasuni and StorSimple.  These are all start-ups who seem to be quickly gaining awareness and traction with companies and Cloud Service Providers.

A Cloud Storage Gateway is made up of software that either ships within an x86 server (i.e. an appliance) or deploys entirely as software within a VM.  They are typically asymmetric (i.e. single-device) solutions, often positioned as a NAS filer.  Typical capabilities include NAS-to-Cloud API emulation, WAN Optimization, Caching, In-transit Encryption and Data management features such as snapshots.  As software-based solutions they are flexible, and meant to be affordable and targeted to a more mid-market customer.  Similarly, they are intended for less demanding throughput needs, as there is no purpose-built processor offload.
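To make that anatomy a bit more concrete, here's a toy sketch of the write path such a gateway implements. This is my own illustration under simplifying assumptions, not any vendor's actual design: cache locally, deduplicate at the chunk level, compress, encrypt (stubbed here), and push the result to the provider's API.

```python
import hashlib
import zlib

class CloudStorageGateway:
    """Toy model of a cloud storage gateway write path: local cache,
    chunk-level dedupe, compression, (stub) encryption, then upload."""

    def __init__(self, cloud_client, chunk_size=4 * 1024 * 1024):
        self.cloud = cloud_client      # any object with a put(key, blob) method
        self.chunk_size = chunk_size
        self.cache = {}                # local cache: path -> data
        self.seen_chunks = set()       # fingerprints of chunks already uploaded

    def write(self, path, data: bytes):
        self.cache[path] = data        # 1) keep a local copy to serve reads fast
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest in self.seen_chunks:
                continue               # 2) dedupe: skip chunks already sent
            payload = zlib.compress(chunk)       # 3) compress before the WAN
            payload = self._encrypt(payload)     # 4) encrypt in transit/at rest
            self.cloud.put(digest, payload)      # 5) push to the provider's API
            self.seen_chunks.add(digest)

    def read(self, path) -> bytes:
        return self.cache[path]        # cache hit; a real gateway would fall
                                       # back to fetching chunks from the cloud

    def _encrypt(self, blob: bytes) -> bytes:
        return blob  # stub -- a real product would use AES with customer-held keys
```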

For mid-market companies looking to add a Cloud tier of archival or similar offline data storage, these are products to consider.  For enterprises or companies who want to leverage Cloud storage as a nearstore alternative, you will want to wait for next-gen 'Cloud Networking' products built for high throughput, with hardware-assisted optimization, symmetric caching and network de-dupe capabilities, and integration with your existing network management framework.

Tuesday, April 13, 2010

Thank goodness for Twitter

Not sure about you, but life is moving pretty fast these days.  I have a deep-seated bias against blogs as a medium for serious, thoughtful communication, versus, let's say, a written letter or a conversation.  Yet most of the time I can't even muster the time required for a halfway decent blog entry.

Enter Twitter.  140 characters provide an adequate if staccato means to share an update, a heads up on an event, an interesting Cloud article or just a passing thought.

So if you aren't finding many postings here, then be sure to look for me on Twitter. I'm @mhardi01, and am often posting #cloud related tweets.

Tuesday, March 30, 2010

One Year in the Cloud

The other day I had the epiphany that I've been working on my company's Cloud effort for a year now.  It was around this time a year ago when a few of us, holed up in a room, were doing some next-gen data center planning, with one of the topics being 'clouds'.  I hadn't given the subject much thought with regard to our product set up to that point.  But a few promising concepts came out of that meeting and the larger effort took off.

It's all shaping up to be a big inflection point in the IT business, in some ways a lot like server virtualization, ASPs before that, and the web before that.  Hang on for another great ride.    :)

Share any interesting anecdotes of your experience to date with the Cloud. 

Tuesday, February 2, 2010

How to Calculate an Accurate Cost of Cloud vs. Cost of Inertia

There was a very helpful article written recently on the 'hidden' costs of using external Cloud storage services.  This is important information when you are sitting down to determine the business case for moving data outside your data center.  The article correctly points out that the basic cost for storage from someone like Amazon Web Services can be as little as $0.15 per GB per month, and that volume discounts can bring this down further.  However, additional features to support WORM or information lifecycle management will increase the price tag towards $1.00 per GB.

Some services charge to upload data, some to download, some for both.  And if you have too much to send over the wire, you are welcome to send a tape.  But that will cost you, too.

And then there are connectivity costs.  If you are already maxed out on your internet connection, then moving data to the cloud will require incremental bandwidth.  Some enterprise-class services actually require bundled bandwidth or even direct circuits in order to provide an SLA.

So if all this sounds daunting, let's look at your current cost of storage.  The rule of thumb is that for any IT capability, the direct cost (i.e. what you buy and deploy) is only 20% of the total cost: the rest is the cost to maintain it.  This indirect cost percentage appears to be increasing over time -- partly a function of the improved value in IT products, and partly of increases in wages, facilities, energy, etc.  And this is also true for storage.

TCO components of your storage base case need to include the cost of the hardware -- either the entire cost if looking at a 3-year period, or an annual depreciation.  I'd assume you're looking mostly at arrays, but you may also have servers involved for some supporting application.  Make sure you're comparing apples to apples with the complete outsourced offering.  If this is a decision to buy another NAS filer vs. a contract with someone like Rackspace, then you need to factor in the complete deployed cost.  Include any installation, training, and related on-site switching expenses such as local data migration.

Then add system software and/or array-based software licenses.  Add annual maintenance and support fees.  Add the allocated Server Administrator cost for the devices: the annual burdened wages (i.e. salary + 15% or more for taxes, benefits).  This should be a big line item: staff costs are consistently 40% or more of total costs in the data center.  Each admin can manage just so many TBs of storage.  You need to figure this out for your current environment; even if an analyst or vendor study says that one admin FTE can handle 10TB in an ideal world, what really matters is what you are running in your data center today.

We're not done: you can't forget the allocated cost of the data center space.  This includes the rent of the space, the power, the heating and cooling, and whether you need to pay for a set of hands within the facility when changes are made.  I wish I could offer a rule of thumb here, but it depends on whether you have your own data center and what 'tier' of DC it is, or whether you are using a co-lo.  There are variables such as wattage density per rack, the efficiency of the cooling system, and probably others.  For many companies, the problem is that their space is either obsolete, or they are out of power, or space, or both.  Anyway, this may take a little work to get to, but you need to add a cost to reflect the data center operations.
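If it helps, here's the same checklist as a tiny spreadsheet-in-code. Every number is a placeholder to be replaced with your own quotes, wages and facility rates; the structure of the comparison is what matters.

```python
# Toy TCO comparison over a 3-year horizon. All figures are placeholders.
YEARS = 3
usable_tb = 20

# Internal storage base case
hardware = 60_000                 # array + supporting servers, deployed cost
install_and_migration = 8_000     # installation, training, local data migration
software_licenses = 10_000
maintenance_per_year = 9_000      # annual maintenance and support
admin_fte_fraction = 0.25         # share of one admin managing these TBs
burdened_wage = 90_000 * 1.15     # salary plus ~15% for taxes and benefits
datacenter_per_year = 6_000       # space, power, cooling, remote hands

internal_tco = (hardware + install_and_migration + software_licenses
                + YEARS * (maintenance_per_year
                           + admin_fte_fraction * burdened_wage
                           + datacenter_per_year))

# Cloud alternative
price_per_gb_month = 0.15         # base rate; WORM/ILM features push this up
transfer_and_media = 2_000        # upload/download fees, tape seeding, etc.
extra_bandwidth_per_year = 4_800  # incremental WAN capacity

cloud_tco = (usable_tb * 1_000 * price_per_gb_month * 12 * YEARS
             + transfer_and_media + YEARS * extra_bandwidth_per_year)

print(f"Internal 3-yr TCO: ${internal_tco:,.0f}")
print(f"Cloud 3-yr TCO:    ${cloud_tco:,.0f}")
```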

Now look at the numbers.  Assuming you are one of the lucky ones who still have the ability to add more storage internally, the business case for using the Cloud should be more compelling, especially where you are talking about a smaller amount of storage, have a smaller (i.e. less scale-efficient) operation, and especially if your needs are more temporary or at least not expected to be consistent over the life of the hardware (e.g. you don't need to use all the storage for all that period of time).

IT organizations are definitely moving their resources to the cloud, and it's all about the economics.  Sharpen your pencil and make sure you're painting an accurate picture of your internal vs. your expected Cloud storage costs.  Good luck!

Wednesday, January 27, 2010

Early Cloud Investment makes for a Buyers Market

The $250mil HP-Microsoft deal announced a couple weeks ago speaks to how quickly and seriously the big players have converged on the Cloud opportunity.  Literally before the market has even come to understand what 'Cloud' means, we've seen Cloud deals involving many of the top global IT players including Cisco, EMC, VMware, HP and Microsoft.  Only Oracle-Sun is dialing back their cloud investment.

This is different from the uptake around ASPs many years ago.  Small pure-play start-ups like Corio and Jamcracker were prominent in the market for years with little competition from top IT firms, who instead chose to sit out the market wave and acquire ASP capabilities after the bubble burst. This early involvement of larger players means more vendors with capabilities and expected staying power; in effect, two Savvises for every one GoGrid. So customers don't have to choose between start-ups who "get it" and dinosaurs who don't.

With more investment, sooner, we can expect the life cycle of the category to be similarly accelerated.  It bodes well for customers looking for validation and future-proofing from leading vendors.  It will probably also shake out the smaller players sooner, as they find themselves competing with established tech brands touting seemingly less-risky solutions.  Perceived security and 'trust' will be key aspects of success, especially in the hosted cloud space.

So with more M&A and investment expected relating to the Cloud, take your time before making big investments, plan with an eye towards federating your external Cloud services with your internal data center environment (rather than creating a hodge-podge of cloud silos), and enjoy a developing buyers market in all things Cloud.

Friday, January 22, 2010

Best Practices for Choosing an External Cloud Vendor

The idea of Cloud is catching fire.  One vendor recently shared that half of their customers are considering external Cloud storage providers. (As this was from a backup software vendor, I'm not sure this is such good news for them). 

The good news for you is that every IT vendor out there is delivering products or services to meet this growing Cloud interest. And the bad news is... every IT vendor out there is delivering products or services to meet this growing Cloud interest. Many are the same products that were last year's "Green IT" solutions. Some are enterprise-class utility computing products aimed at your data center. Some are public web-based services for SMBs.

I've been working now for some time on new External Cloud product efforts, and am seeing the potential confusion just in this one area of Cloud computing. So I thought I'd share a few tips I've picked up that may help you as you are considering new projects in "the Cloud".
  1. Like any outsourcing decision, start with projects and applications that are not mission-critical or performance-sensitive.
  2. Spend time to develop an architecture to address the linear scalability, parallel processing and distributed data aspects of cloud computing.
  3. Understand how your management, monitoring and policies will accommodate external storage and compute resources especially in terms of security, availability, utilization and compliance.
  4. Compare Cloud subscription costs with your burdened total costs of ownership -- Cloud should deliver savings for short-time horizon projects, where specialized admin staffing would otherwise be needed, and where large fixed investments may not be fully utilized.
  5. Stick to annual contracts, especially as Infrastructure-as-a-Service will become commoditized over time.
  6. Expect that only a few leading ecosystems will emerge once the hype passes – choose your vendors carefully looking for a track record, customer references, a documented SLA and top quality infrastructure.

Good luck with your projects, and feel free to share what you're learning in the Cloud.

Tuesday, January 12, 2010

Storage in 2010 - Seeking Efficiency in the Cloud

I came across a useful story that compiles a number of the 2010 outlooks for the storage industry.   Forrester, ESG and Symantec seem to be sharing an optimistic view, with spending to increase while customers seek ways to be more efficient.  This points to increased consideration and adoption of hosted storage or Storage-as-a-Service offerings this year.

Storage-as-a-Service has been an established, viable offering for well over a decade.  Companies have been using hosted storage services or ‘Cloud Storage’ going back to at least 1993, from service providers like LiveVault (now part of IronMountain).  Like any outsourcing activity, IT organizations leverage out-of-house hosting providers to lower their capital expense as well as their administrative staffing requirements. With the increased adoption of larger bandwidth connections across corporate WANs, and the continued growth in data, we can expect to see more offsite storage usage in 2010 and beyond.

Storage-as-a-Service offerings will grow in popularity from off-line archiving to online production use over the next 5 years.  Currently, Cloud Storage is mostly limited to offline uses where network latency isn’t an issue, such as video and email archiving, and we will predominantly see this type of use for Cloud storage in 2010.  As network connections between enterprises and their service providers improve along at least a few key dimensions, we will see these use cases expand to include more production-oriented data storage.  Over time Cloud storage will become an alternative to nearline storage, and some service providers have been quick to launch services to begin this evolution.

However, at least three major advances in networking will be required to enable the broader growth of the Storage-as-a-Service market.   Moving data over a WAN link is a major inhibitor to production storage service use.  Transit acceleration will need to be a core part of the solution, bringing performance closer to native LAN latencies.  Similarly, the availability of the connection needs to be better maintained, and more advanced techniques will need to be employed to assure highly available connections between the datacenter and the service provider, which will also improve throughput. Finally, security for both data-in-transit and data-at-rest will need to be in place to assure that business and customer data cannot be breached, especially within multi-tenant hosted environments.
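A quick back-of-the-envelope shows why the transit piece matters so much: single-stream TCP throughput is roughly bounded by window size divided by round-trip time, so the same pipe that hums along on the LAN crawls across a WAN. The numbers below are illustrative, not measurements.

```python
# Rough single-stream TCP throughput bound: window_size / RTT.
# 64 KB is the classic default window without scaling; real stacks vary.

def max_throughput_mbps(window_bytes, rtt_ms):
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

WINDOW = 64 * 1024  # bytes

for label, rtt in [("LAN (0.5 ms)", 0.5), ("metro WAN (10 ms)", 10),
                   ("cross-country WAN (70 ms)", 70)]:
    print(f"{label:28s} ~{max_throughput_mbps(WINDOW, rtt):8.1f} Mbps per stream")
```

This is exactly the gap that window scaling, parallel streams, caching and dedupe are meant to close.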

Wednesday, January 6, 2010

Cloud Stepping into Commodity Pricing Trap?

Back before the holidays there was a story about Amazon bringing out demand-based pricing for their EC2 service. This 'spot instance' program, still in beta, has already been successful in building some awareness and blogosphere traffic for them.

When I initially read the story I wrote this off as a bad trip down "Revenue Optimization" lane -- the self-destructive road that the travel industry took in the 90s. Having been part of that business, I found it exciting to develop systems to determine optimal pricing that mapped to demand segmentation. It became a real craze across airlines, then hotels. And it's still a big feature of that business today. But as the economy soured, the use of technology to maximize revenue became an accelerated exercise in optimally dumping excess inventory. What we later learned was that when travelers compared notes and realized that one paid a hundred dollars less than the other for the same seat or bed, it had an incredible commoditizing effect on how they viewed your product. And fueled by the power of the Internet, it's a trap they never escaped. We perpetuate this legacy every time we begin our travel planning with a price comparison site like Kayak, Orbitz or Travelocity. The brand value of the individual airline, hotel or rental car has all but vanished. The same dynamic recently hit Broadway – the phenomenon was captured nicely in a NYT article.

But then I went back and read the details of how the Spot Instance program works. Turns out you bid a rate, and if the price falls to that level, then your Amazon instance will turn on. And just as quickly, it will turn off as soon as the spot price exceeds your maximum price! It would be like kicking the discount vacationer out of their bed because a full-rate business traveler showed up at the front desk. And who has a workload like that, anyway???
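Here's a toy simulation of that bid-versus-spot-price behavior, with invented prices (the real EC2 program has more billing nuances); it shows why only interruption-tolerant workloads fit the model.

```python
# Toy simulation of spot-instance behavior: the instance runs only while the
# spot price stays at or below your bid, and is cut off when the price rises.
# Prices are invented for illustration.

my_bid = 0.12  # $/hour the customer is willing to pay
hourly_spot_prices = [0.08, 0.10, 0.11, 0.15, 0.18, 0.11, 0.09, 0.20, 0.10]

running = False
hours_run = 0
for hour, price in enumerate(hourly_spot_prices):
    if price <= my_bid:
        if not running:
            print(f"hour {hour}: spot ${price:.2f} <= bid -> instance starts")
        running = True
        hours_run += 1
    else:
        if running:
            print(f"hour {hour}: spot ${price:.2f} > bid  -> instance terminated")
        running = False

print(f"Workload got {hours_run} of {len(hourly_spot_prices)} hours")
```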

Time will tell how this experiment plays out, but I’d suggest keeping an eye on this service, as we are either seeing the next chapter being written in the Pricing and Revenue Optimization text book, or the emergent Cloud industry starting down the commodity road to ruin.