This entry was originally published on the CloudSource blog in March 2012.

As this series slowly draws to a close, I’d like to share with you the key points to look at when thinking about moving a workload to the public cloud. Over the last year I have been collecting them in a little cheat sheet, and it seems a fitting way to finish the series. Here are the 11 questions in my little cloud cheat sheet to ask as you think about moving to the public cloud.

  1. Where is the service delivered from? In other words, in what geography are the servers that deliver this service to you? Does that really matter in the cloud? The further away the servers, the more latency you will experience, so if you are using a very interactive service, you may want to make sure the distance is not too great. Servers based in the US typically add between 120 and 200 ms of latency when accessed from another continent. That may sound like nothing, but you pay this latency every time you update your screen, and in interactive applications it adds up very quickly (see the latency sketch after this list).
  2. Who is involved in delivering the service? Why do I ask that question? Obviously, the service is delivered by the company advertising it. You are right, but often, particularly with SaaS services, other players are involved. A SaaS provider may run on the platform of another service provider, and may use yet somebody else for back-up, payment, disaster recovery and so on. Typically you get one name but have no clue about the full “supply chain” that delivers the service. That can be an issue, as the service is only as good as its weakest link, and you cannot assess that without transparency on who is involved. Many people were surprised by how many companies stopped operating last April when Amazon had part of its delivery capability go down. I cannot stress enough the need for transparency.
  3. Where is the data located? Here we are shifting gears. The location of the data matters because of the compliance requirements attached to certain types of data. In some countries financial information needs to stay within the country, EU privacy data needs to stay in the European Union, healthcare data may even have to stay in specific states, and I could go on like this. One tricky thing to remember is that this applies not just to the operational data (the data actually used by the service) but also to disaster recovery data, back-up data, etc. Some services even replicate the data across sites to deliver the service more reliably to the customer. All copies of the data are subject to the same regulation. Don’t forget that.
  4. How can I get the data back in case of decommissioning of the service? Let’s assume for a moment that you choose a SaaS application and use it for a number of years. One day your requirements have evolved to the point where another application suits your needs better. Regardless of whether that one is in the cloud or not, you are going to need to transfer the data to the new application, as you do not want to lose everything you have gathered over the years. So far, such a transition is no different from the traditional environment. There are two differences though. First, you have no control over the format in which your data will be returned to you; that is typically defined by the service provider. And secondly, how will you physically get the information back? Do you have to download it over the network? If you are speaking about terabytes this may actually be quite expensive, and if the other application also happens to be a SaaS application, you may end up with a huge networking cost to upload it again. Salesforce, for example, returns your data as zipped CSV files that are kept on their site for 48 hours; it is up to you to download them in time. If the data is huge, you had better be quick (see the download-time sketch after this list).
  5. What happens to the operational data, snapshots, disaster recovery and back-up copies in case of decommissioning of the service? Let’s stay with the previous example, where you decide to stop using a specific service. You want the data associated with that service to be destroyed. Well, as you know, it’s extremely difficult to delete data on the Internet. As Eric Slack pointed out in an article titled “How do I know that ‘Delete’ means Delete in Cloud Storage?”, there is no real mechanism to delete data from the cloud, and that is obviously an issue. The EU’s newly proposed “Right to be Forgotten” regulation would force service providers to address the issue, at least for privacy data. But once appropriate mechanisms have been worked out for that purpose, we can expect them to become available to clean out other data as well. The one thing to remember is that, to date, it’s easy to get your data into the cloud, but darned difficult to remove it.
  6. Who owns the data while it is used by the service? This may sound like a simple question, but the answer isn’t simple at all. As long as everything goes well, there are no issues. Trouble starts when, for example, the Justice department requests some of your data. Who decides to release the information: the service provider on whose systems the data is located, or you, the “owner” of the data? The service provider wants as little trouble with the courts as possible, so they will tend to provide the information diligently. If that’s the case, will they at least warn you? That’s not always certain, so you had better ask the question and make sure you get a satisfactory answer. I would strongly recommend you discuss this with your legal department to ensure the best interests of your company are preserved.
  7. What security processes and procedures are in place? Public clouds are more secure than private environments, you hear that all the time. But when you then ask large public cloud providers how that security is implemented and what procedures are used, the answer is invariably “trust us, we have no time to explain it to you” or a variation of that. There is no transparency. That’s the first issue. The second is that large public cloud services are ideal targets for hackers and other subversive groups that intend to harm the business or make themselves visible, and this unfortunately is only going to increase. That is why it’s important you feel at ease with the security processes and procedures used by the service provider you choose. You may want to audit their environment, review their procedures, etc. If they refuse to allow you to do that, assess the potential risk.

  8. What responsibility is the service provider taking? The T&Cs of most public cloud providers (you know, that lengthy document written in unfriendly language, which you obviously read carefully prior to clicking the box) limit their responsibility as much as possible. AWS operates and manages the components from the host operating system and virtualization layer down to the physical facilities; the customer’s responsibilities include the guest operating system, other associated application software and configuration of the AWS-supplied security group firewall (see the security-group sketch after this list). “AWS basically are telling you compliance is all up to you regardless of the regulation,” said Joe Granneman, an information security professional with experience in the heavily regulated industries of health care and financial services. “This makes a lot of sense because there is no good way for Amazon to guarantee compliance when it only provides the infrastructure. The customer connects the infrastructure together and builds on top of it, which Amazon cannot guarantee.” Fundamentally, Amazon is responsible for keeping the infrastructure running, and that’s it. And by the way, the other providers take very similar approaches, not to mention that many services are in “beta,” which means they do not take any responsibility at all. So, if something goes wrong, don’t count on your service provider too much. The chances they will repay the damage are pretty slim.

  9. How are you kept informed in case of issues? You suddenly don’t get a response any more. Your service is dead. How do you know what is happening? The typical answer: blogs and Twitter. Is this approach OK for you? Are you happy to be informed that way? In many past cases, it has taken quite a while for the service provider to even acknowledge there is a problem in the first place. In the Amazon outage of April 2011, it was not just the outage itself but the communication around it that was the issue. If your own datacenter has a problem, you know whom to call and can ask what is happening, how long it will take and what alternatives you have. With public cloud services none of that is available. Can your users live with such a lack of information?

  10. What privacy policies does the service provider subscribe to, and how do they manage user information? With the changes made to the privacy policies of Google and Facebook, just to name a few, and the associated controversy, it makes sense to review in detail how privacy is addressed by the public cloud service provider. Indeed, your employees will be using the system and sharing some of their personal information. How is that information protected? What are the responsibilities of the service provider, and what can they do, or not do, with your data? All good points to look at.

  11. What happens if your service provider is acquired or goes bankrupt? Obviously, you do not want that to happen, but it’s important to ask the question. In the old days, when a service provider went bankrupt, we had an escrow agreement allowing us to continue operations at a reasonable cost. Such things do not exist in the cloud world. Could the receivers end up selling you your own data back at a hefty premium in case of bankruptcy? What happens to the agreement when the service provider is taken over? You have agreed to T&Cs, but you do not really have a negotiated contract. So, what are the limits, and what happens then?
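
A quick note on question 1, since the latency numbers are easy to underestimate: they compound with every sequential round trip a screen needs. Here is a minimal back-of-the-envelope sketch in Python; the request count and the 120 to 200 ms figures are illustrative assumptions taken from the point above, not measurements of any particular service.

```python
# Rough estimate of how per-request latency adds up in an interactive application.
# The numbers below are illustrative assumptions, not measurements.

round_trip_ms = (120, 200)    # extra latency per request to a far-away (e.g. US-based) server
requests_per_screen = 5       # interactive screens often need several sequential round trips

for rtt in round_trip_ms:
    added_delay_s = rtt * requests_per_screen / 1000.0
    print("At %d ms per round trip, each screen refresh waits ~%.1f s longer" % (rtt, added_delay_s))
```

With five sequential round trips, every screen refresh waits roughly 0.6 to 1 extra second, a delay the users of an interactive application will definitely notice.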
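
On question 4, the 48-hour window Salesforce gives you matters more than it sounds once volumes grow. Here is a small sketch of the arithmetic; the 1 TB export size and the link speeds are assumptions for illustration only.

```python
# How long does it take to pull a data export out of a SaaS provider before it expires?
# The export size and link speeds below are illustrative assumptions.

export_size_bits = 1.0 * 8e12   # 1 TB expressed in bits
window_hours = 48               # how long the provider keeps the export files available

for link_mbps in (10, 100, 1000):
    hours = export_size_bits / (link_mbps * 1e6) / 3600.0
    verdict = "fits" if hours <= window_hours else "does NOT fit"
    print("%4d Mbit/s: ~%6.1f h, %s in the %d h window" % (link_mbps, hours, verdict, window_hours))
```

At 10 Mbit/s a single terabyte already takes over 200 hours, far beyond the 48-hour window, and that is before you spend the same effort uploading it into the next application.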
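
On question 8, the practical consequence of the AWS split is that everything above the hypervisor, including the firewall rules around your instances, is yours to get right. As a minimal illustration of where that line falls, here is a sketch using the boto Python library; the region, group name and ports are hypothetical examples, not a recommendation.

```python
# Minimal sketch: with IaaS, the security group firewall is configured by the customer,
# not by the provider. Region, group name and ports are hypothetical examples.
import boto.ec2

conn = boto.ec2.connect_to_region("us-east-1")

# Create a security group and open only what the application really needs.
sg = conn.create_security_group("web-tier", "Hypothetical web front-end group")
sg.authorize(ip_protocol="tcp", from_port=443, to_port=443, cidr_ip="0.0.0.0/0")

# Whatever is (or is not) opened here is the customer's responsibility, not Amazon's.
```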

It was not my intention to scare you, but if you decide to use public cloud services, do it with your eyes wide open. It’s all about risk management. How much risk are you prepared to take? Ask the above questions and decide for yourself. Make sure you have the appropriate information from your potential service providers prior to making your decisions. It’s like any contract; you need to think about what you hope will never happen.