Next week in Tel Aviv I'm going to participate in a panel about "the future of clouds", moderated by the legendary Yossi Vardi. In preparation, I wrote down a few of the concepts I've been thinking about for the past several years, and I thought I would share them with my readers to get some feedback. Keep in mind these are long-term predictions and trends (in no particular order).
I'd love to hear some feedback on these trends. Do you agree? Disagree? Have I left something out? Please let me know in the comments.
Yesterday a couple of Computerworld pieces in which I'm quoted came out. They are both authored by Beth Schultz.
The first is about Cloudonomics, or how enterprises can figure out the potential cost savings and other financial effects of cloud computing. My basic take was that it's difficult to measure the exact financial impact of cloud computing because one of its major benefits is business agility. See the full story.
The second is about the plethora of cloud services available to enterprises and how to choose among them. My take on this one was that it's not a one-size-fits-all game and organizations will need different tools for different tasks. Read the full story.
In Shopping the Cloud: Performance Benchmarks I listed a number of services and reports that compare cloud provider performance results, but the truth is that in computing (cloud included) you can throw money at almost any performance and scale problem. It doesn't make sense, therefore, to talk about performance alone; you want to compare price/performance.
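To make that concrete, here's a minimal sketch (in Python, with entirely made-up throughput and pricing numbers) of how normalizing a benchmark result by price can flip the ranking: the faster instance isn't necessarily the better value.

```python
# Hypothetical price/performance comparison. All numbers below are
# invented for illustration and do not reflect any real provider's
# benchmarks or rates.

providers = {
    "Provider A": {"requests_per_sec": 950, "price_per_hour": 0.12},
    "Provider B": {"requests_per_sec": 1400, "price_per_hour": 0.24},
}

for name, p in providers.items():
    # Requests served per dollar of compute time: higher is better.
    value = p["requests_per_sec"] * 3600 / p["price_per_hour"]
    print(f"{name}: {value:,.0f} requests per dollar")
```

In this made-up example Provider B wins on raw throughput, but Provider A serves more requests per dollar -- which is exactly why performance numbers alone can mislead.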
But here's the rub: it is becoming increasingly difficult to compare the pricing of the various cloud providers.
About a year and a half ago I wrote What Are Amazon EC2 Compute Units? in which I raised the issue of how difficult it is to know what it is you are actually getting for what you are paying in the cloud. Other vendors use their own terminology, such as Heroku's Dynos. I'm not just picking on these two, everyone has their own system.
In addition, the pricing schemes by the various vendors include different components. Take storage as the simplest example, which clearly illustrates the point. Here's a screenshot from the Rackspace Cloud Files pricing page:
It is fairly straightforward, but also contains many elements that are extremely difficult to project (especially for a new application), such as the Bandwidth and Request Pricing. That's OK - you have to make some assumptions.
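Those assumptions can at least be sketched as a back-of-the-envelope model. Here's one in Python; the rate constants are hypothetical placeholders, not any vendor's actual prices, and the usage figures are the kind of guesses a new application has to make.

```python
# Rough monthly cost model for a cloud storage service.
# All rates are hypothetical placeholders, not real vendor pricing.

STORAGE_PER_GB = 0.15          # $ per GB-month stored
BANDWIDTH_OUT_PER_GB = 0.18    # $ per GB transferred out
PRICE_PER_10K_REQUESTS = 0.01  # $ per 10,000 API requests

def monthly_cost(stored_gb, egress_gb, requests):
    """Estimate one month's bill from projected usage."""
    return (stored_gb * STORAGE_PER_GB
            + egress_gb * BANDWIDTH_OUT_PER_GB
            + requests / 10_000 * PRICE_PER_10K_REQUESTS)

# Guessing at 500 GB stored, 2 TB of egress and 5M requests per month:
print(f"${monthly_cost(500, 2000, 5_000_000):,.2f}")
```

Plug in a different vendor's rate card (and its different components) and the totals quickly become hard to compare apples-to-apples, which is the point of the next section.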
But here's my main point -- now compare it to Amazon S3 pricing:
To make things worse, not all cloud storage services were made equal. They have different features, different SLAs, varying levels of API richness, ease-of-use, compliance and on and on.
Another big problem with pricing is that the market is very dynamic and prices change rapidly. Fortunately, most of the movement right now is downward, thanks to increased competition (especially in the IaaS space) and to vendors benefiting from economies of scale and efficiency gains driven by innovation.
Andrew Shafer from CloudScaling wrote a blog post a couple of weeks ago in which he shows how Amazon pricing is constantly shrinking. Check out this graphic he created:
So what do you do in such a complex landscape? There seems to be no escape from creating a test application and running it on multiple services to see where the cost comes out. Then again, that may turn out to be a very time-consuming and expensive effort that isn't worth it -- at least not initially. So you should be prepared to move your app across cloud providers if and when the costs become prohibitive (something I am seeing happen to more and more companies).
Hopefully, the cloud benchmark services will also start paying attention to pricing and provide a comparison of price/performance and not just performance.
Here's the data point I found most interesting:
I have followed many cloud surveys and reports that measure cloud traction of the different providers (see for example Guy Rosen's State of the Cloud). It has consistently been the case that Amazon is ranked #1 and Rackspace #2 (which is what prompted my Rackspace: The Avis of Cloud Computing post). The Zenoss survey suggests a different story with Google App Engine and Microsoft Azure coming in at #2 and #3 respectively, pushing Rackspace to #4.
Also, GoGrid's penetration, as well as RightScale's (which is a very different animal than the other players on the list) is very impressive.
Note that the wording of the question in the survey was a bit ambiguous: "What are your cloud computing plans for 2010?" I say ambiguous because the survey was conducted in Q2 2010, probably close to the middle of the year. In any case, it has a forward-looking element to it, which gives a little indication of the trends as they are happening.
Anyway, lots of interesting info on both cloud and virtualization. Check out the full survey results (requires registration).
As cloud computing matures -- meaning it is being used by increasingly larger companies for mission-critical applications -- companies are shopping around for cloud providers with requirements that are more sophisticated than merely price and ease of use. One such criterion is performance.
Performance has consistently been one of the main concerns enterprise buyers have had about cloud computing, as indicated by the chart of responses to a survey conducted by IDC in Q3 2009.
To address this concern, and help potential cloud users in selecting their cloud provider, a number of research, measurement and academic groups have initiated efforts to actually measure and compare the performance of various cloud providers under a variety of circumstances (application use case, geographical location and more).
Here are some of the more interesting performance benchmarks out there today:
Compuware Gomez: CloudSleuth
Still in beta, Gomez CloudSleuth is likely to become one of the more important reference points for customers and the media in cloud performance testing. Gomez has developed a benchmark Java e-commerce application and measures the end-to-end response time of various cloud providers and locations. The tests are run from 125 end-user U.S. locations in all 50 states and from 75 international locations in 30 countries, and are conducted 200 times per hour. Gomez is planning on adding the ability to benchmark a user's own application.
CloudHarmony: Cloud SpeedTest
Although CloudHarmony is just a startup, if it succeeds it is likely to be an important resource for customers and the media in evaluating cloud provider performance. CloudHarmony has a service called Cloud SpeedTest, which allows users to benchmark the performance of a web application across multiple cloud providers and services (servers, storage, CDN, PaaS). This service is currently in beta and quite simplistic, but CloudHarmony is working on a more sophisticated version with additional features.
In addition, the CloudHarmony staff conducts a variety of performance benchmarks for specific scenarios such as CPU performance, storage I/O, memory I/O or video encoding and publishes them on their blog.
UC Berkeley: Cloudstone Project
Cloudstone is an academic open source project from UC Berkeley. It provides a framework for testing realistic performance. The project does not yet publish comparative results across clouds, but it gives users a framework that allows them to do so. The reference application is a social Web 2.0-style application.
Duke University and Microsoft Research: Cloud CMP Project
The objective of the Cloud CMP project is to enable "comparison shopping" across cloud providers -- both IaaS and PaaS -- and do so for a number of application use cases and workloads. To that end, the project combines straight performance benchmarks with a cost-performance analysis. The project has already measured computational, storage, intra-cloud and WAN performance for three cloud providers (two IaaS and one PaaS) and intends to expand.
BitCurrent: The Performance of Clouds
BitCurrent conducted a comprehensive performance benchmarking study entitled The Performance of Clouds, commissioned by Webmetrics and using its testing service. The study covered three IaaS providers (Amazon, Rackspace and Terremark) and two PaaS providers (Salesforce.com and Google App Engine). It measured four categories of performance: raw response time and caching, network throughput and congestion, computational performance (CPU-intensive tasks) and I/O performance.
The Bit Source: Rackspace Cloud Servers Vs. Amazon EC2
According to its website, The Bit Source is an “online publication and testing lab”, which appears to be a one-man show and may not play a significant role going forward. It conducted a one-time benchmark comparing Rackspace Cloud Servers and Amazon EC2 performance.
What are your thoughts about cloud performance and these benchmarks? Please share in the comments below.
[P.S. I added a new category to the blog called "Shopping the Cloud", which will include posts that discuss various aspects of comparison shopping for cloud providers.]
Thinking Out Cloud is a blog about cloud computing and the SaaS business model written by Geva Perry.