
Another article addresses this same balancing act, but along the local/remote dimension. Written more from a user perspective, it argues that the decision to go to the cloud can be a logical one, based on dividing your data into types along the lines of access, application, performance and security. In other words, high-performance, high-priority, high-availability apps and data living in Oracle databases and on the SAN should stay there, while archival file data and less business-critical apps are the candidates for moving to the cloud to realize savings.
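To make that sorting exercise concrete, here is a minimal sketch of what such a triage might look like in code. The tier names, attributes and thresholds below are my own illustration of the access/application/performance/security criteria described above, not anything prescribed by the article.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_critical: bool   # application criticality
    needs_low_latency: bool   # performance requirement
    sensitive_data: bool      # security / compliance concern
    access_frequency: str     # "hot", "warm", or "cold"

def placement(w: Workload) -> str:
    """Rough triage: keep demanding or sensitive workloads local,
    push archival and low-priority workloads to the cloud."""
    if w.business_critical or w.needs_low_latency or w.sensitive_data:
        return "local (Oracle DB / SAN)"
    if w.access_frequency == "cold":
        return "cloud (archival storage)"
    return "cloud (commodity compute/storage)"

if __name__ == "__main__":
    inventory = [
        Workload("order-processing DB", True, True, True, "hot"),
        Workload("7-year email archive", False, False, False, "cold"),
        Workload("internal wiki", False, False, False, "warm"),
    ]
    for w in inventory:
        print(f"{w.name:25s} -> {placement(w)}")
```

The point of the sketch is simply that the "stay or go" call can be made mechanically once you've tagged each workload along those four dimensions.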
Speaking of security, a new study from MIT and UC San Diego has reportedly uncovered a new class of vulnerabilities in public compute clouds. This may prove to be a bump in the road for the vCloud and compute-on-demand market. Or perhaps a basis for differentiation.
The cloud space is shaping up to look a lot like the ASP market of a decade ago. Customers demanded that providers address every tradeoff and shortcoming that arose, and vendors found themselves squeezed between the cost of meeting those requirements and the slow payback of monthly usage fees. As time went on there were a handful of survivors, a few that found exits with large IT players, and many more who simply faded away. Keep this in mind as you make your cloud vendor selections.
Having said all this, the outlook for the cloud looks good when carriers like AT&T are seeing 35% growth in network traffic in the midst of a global downturn. We're all still doing more online, and customers and users are expecting more web-based systems; deploying more of your data, applications and compute into 'the cloud' seems logical and inevitable.