Tuesday, July 21, 2009

Cloud bursting and the real world.

Cloud bursting is a hot topic right now, presented as the panacea for companies that want to provide services in house without overprovisioning for peak loads.


As always, most people talk about the different solutions for how cloud bursting should be done or will be done. But they are missing the big picture behind it. Cloud is not a technology, it's a business model. Where is the cost model?

With the current state of technology, cloud bursting is limited to very specific types of applications. On top of that, the economic model is not really clear and nobody has a real idea of the cost.

One of the main problems is that you pay on demand for your cloud servers, cloud bandwidth, etc. But on top of that you need to pay for the cloud servers to access your data and to synchronize the data back internally (not to mention the security and trust issues).

I think the real question is: how does the cloud bursting cost model stand up against classic peak provisioning, or even better, internal cloud resource repurposing?
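As a rough illustration of what such a comparison could look like, here is a toy Python sketch. Every price and workload number below is invented for the example, not a real provider rate:

```python
# Hypothetical monthly cost: bursting to the cloud at peak vs. keeping
# enough in-house servers permanently provisioned. All figures invented.
BURST_SERVERS = 20            # extra servers needed only at peak
PEAK_HOURS = 60               # hours per month the peak capacity is used
CLOUD_HOURLY = 0.50           # EUR per server-hour, on demand
EGRESS_PER_GB = 0.10          # EUR per GB to sync data back in-house
SYNC_GB = 500                 # GB moved between cloud and datacenter

INHOUSE_MONTHLY = 120.0       # EUR per month, amortised idle in-house server

burst_cost = BURST_SERVERS * PEAK_HOURS * CLOUD_HOURLY + SYNC_GB * EGRESS_PER_GB
overprovision_cost = BURST_SERVERS * INHOUSE_MONTHLY

print(f"cloud bursting:   {burst_cost:.2f} EUR/month")
print(f"overprovisioning: {overprovision_cost:.2f} EUR/month")
```

The point is not the numbers but the shape of the model: the data-synchronization term grows with how chatty the application is, which is exactly why bursting only suits very specific types of applications.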

Sunday, March 29, 2009

Amdahl's law and automation

The theory (and a little bit of practice)

Automation in datacenters, and now utility computing, is heavily used to drive down TCO. However, it is rather hard to find out what to automate in order to get the maximum benefit. Automation that seems obvious often has a low return on investment.

Fortunately, we can use (or abuse) Amdahl's law here. It is used to find the maximum expected improvement to an overall system when only part of the system is improved. Interpreted simply, Amdahl's law says: focus on improving the things that make the biggest difference overall.

Pic: Amdahl's law — overall improvement = 1 / ((1 − P) + P / S)



If we adapt this law to TCO reduction, 1 is the cost of running the system (datacenter, service, etc.) for a discrete amount of time. The new cost will be the cost of the unimproved fraction (which is 1 − P) plus the cost of the improved fraction. The cost of the improved part is its former cost divided by the automation cost factor, making the cost of the improved part P/S. The overall improvement is computed by dividing the old cost by the new cost, which is what the formula above does.
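The formula can be sketched in a few lines of Python (a toy illustration; the function name and example numbers are mine, not from any real tool):

```python
def amdahl_cost_factor(p: float, s: float) -> float:
    """Overall cost-improvement factor when a fraction p of the total
    cost is reduced by a factor s (Amdahl's law applied to TCO)."""
    return 1.0 / ((1.0 - p) + p / s)

# Even automating half of the cost "perfectly" (s very large)
# can at best halve the total cost:
print(amdahl_cost_factor(0.5, 1000.0))   # just under 2.0
```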

Applied to datacenters, cloud, or IT operations, this logic suggests that organizations should start with the automation that makes the biggest impact, particularly IT staff productivity.

The reality



However, if we look at reality, the OPEX for servers and the datacenter represents a very small part of the overall cost: according to a Google paper it varies between 7% and 9% of the total. Which means that if we still follow Amdahl's law, automation can only have a very limited impact on the overall cost, while maximising server utilisation guarantees a better return on investment (not to mention being smart with hardware acquisition).
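A quick sanity check of that argument using the 7–9% figure above, assuming automation could drive the automatable fraction's cost to essentially zero:

```python
# Upper bound from Amdahl's law: if only a fraction p of the total
# cost is addressable by automation, then even with s -> infinity
# the overall improvement factor is capped at 1 / (1 - p).
for p in (0.07, 0.09):
    limit = 1.0 / (1.0 - p)
    print(f"p = {p:.0%}: at most {limit:.3f}x overall, i.e. {p:.0%} of cost saved")
```

So even perfect automation of that slice shaves at most 7–9% off the total bill.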



But then again, there are no small savings.

Saturday, January 17, 2009

Turning off workstations: sustainability vs efficiency

People argue that by turning off unused corporate PCs during the night, companies will reduce their ecological footprint and, more importantly, save money.

Let’s do some math :

  • Average PC consumption: 89 W (it can be less, ok, but let's not argue)
  • On average ~1.42 kWh is saved per night (16 hours)
  • One kWh in the UK costs 0.166 €
  • So turning the PC off saves 1.42 × 0.166 ≈ 0.23 € per employee per night

Now, if we had left the PC turned on, there would be no waiting time to get to work. However, if you turn it off, you need to wait for it to boot:

  • Vista: 1:12 => 72 s
  • XP: 1:02 => 62 s

To be honest, I suspect the average boot time to reach a ready-to-use state (i.e. Outlook started, Firefox started, Word started) is more like:

  • around 3 to 5 mn
  • And let's take a random salary: 35 k€/year, which (with overheads) is roughly 0.01 €/s

Ok, now the table:

          Boot time    Ready to use (~3 mn)    Left running during the night
  Vista   0.72 €       1.80 €                  0.23 €
  XP      0.62 €       1.80 €                  0.23 €


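The arithmetic behind the table can be replayed in a few lines of Python (the 0.01 €/s and ~3 mn figures are my rough assumptions from above):

```python
# Per-employee, per-night comparison: energy saved by powering off
# vs. salary cost of waiting for the machine in the morning.
POWER_W = 89                 # average PC consumption
NIGHT_HOURS = 16
PRICE_PER_KWH = 0.166        # EUR, UK price
COST_PER_SECOND = 0.01       # EUR of employee time (rough figure)

energy_saved_kwh = POWER_W * NIGHT_HOURS / 1000        # 1.424 kWh
energy_saved_eur = energy_saved_kwh * PRICE_PER_KWH    # ~0.24 EUR

for os_name, boot_s in (("Vista", 72), ("XP", 62)):
    boot_cost = boot_s * COST_PER_SECOND
    ready_cost = 180 * COST_PER_SECOND                 # ~3 mn to "ready to use"
    print(f"{os_name}: boot {boot_cost:.2f}, ready {ready_cost:.2f}, "
          f"saved {energy_saved_eur:.2f} EUR")
```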


Conclusion :


I'm biased and exaggerating a little, but if you turn off your PC you save energy, yet you waste the company's money. This dilemma can easily be solved by using solutions that schedule wake and sleep, ensuring the computer is ready for the regular user or for remote system updates.

What I'm trying to demonstrate is that when it comes to green IT, people tend to forget the big picture and focus only on the positive aspect, no matter how tiny it is. Put into perspective, it often turns out to be counterproductive for companies and/or for the environment.


You have to be very careful when it comes to ecological solutions, and make sure you have looked at all the different angles before taking a decision.






Friday, January 16, 2009

Google, the kettle and a calculator

The Kettle:

Recently Google made the news with the carbon footprint of a typical Google search, which supposedly generates about 7 g of CO2, compared to boiling a kettle, which generates about 15 g.


See: Pervasive monitoring of the environmental footprint of network activity, by Alex Wissner-Gross. The CO2Stats work remains interesting, but the story ends up being somewhat diluted: it turns out to be a fabrication of sensationalist journalism and shoddy reporting.


And then Google's response. The key figures for one search from this post: 0.0003 kWh per query, or 0.2 grams of CO2.

While these numbers are a lot smaller than the previous ones, the way they are presented seems to try to minimise the ecological impact of Google searches:

  • “this amounts to 0.0003 kWh of energy per search, or 1 kJ. For comparison, the average adult needs about 8000 kJ a day of energy from food, so a Google search uses just about the same amount of energy that your body burns in ten seconds.”
  • “thus, the average car driven for one kilometre (0.6 miles for those in the U.S.) produces as many greenhouse gases as a thousand Google searches.”
  • Etc..

The calculator:


Now, I'm a man with a calculator and I'm not afraid to use it.

Let's recapitulate the facts:
1 search =

  • 0.0003 kWh
  • 0.2 g of CO2
  • 1 kJ

Number of searches done in the US in October 2008: 7,114 million.

Let's calculate the cost per month:

  • 7,114,000,000 × 0.0003 = 2,134,200 kWh = 2.1342 GWh per month
  • 7,114,000,000 × 0.2 = 1,422,800,000 g of CO2 = 1,422.8 tonnes of CO2 per month
  • 7,114,000,000 × 1 = 7,114,000,000 kJ

Which means per year (naïve approach):

  • 2.1342 × 12 = 25.6104 GWh
  • 1,422.8 × 12 = 17,073.6 tonnes
  • 7,114,000,000 × 12 = 85,368,000,000 kJ
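For anyone who wants to check the arithmetic, the monthly and yearly totals in a few lines of Python:

```python
# US-only Google search totals, October 2008 (figures from this post).
US_SEARCHES = 7_114_000_000      # searches per month
KWH_PER_SEARCH = 0.0003
CO2_G_PER_SEARCH = 0.2

monthly_gwh = US_SEARCHES * KWH_PER_SEARCH / 1e6       # kWh -> GWh
monthly_t_co2 = US_SEARCHES * CO2_G_PER_SEARCH / 1e6   # g -> tonnes

print(f"per month: {monthly_gwh:.4f} GWh, {monthly_t_co2:.1f} t CO2")
print(f"per year:  {monthly_gwh * 12:.4f} GWh, {monthly_t_co2 * 12:.1f} t CO2")
```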

Now let’s put that into perspective the same way Google did :


That’s just for US queries, now if you want numbers for world wild queries in august 2007 google did 37.1 billion search queries.

I'll let you do the math.
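Or, if you'd rather not, here is one way to do it (using the same per-query figures recapitulated earlier; the 37.1 billion is the worldwide monthly count):

```python
# Worldwide Google queries, August 2007 (figure from this post).
WORLD_SEARCHES = 37_100_000_000
KWH_PER_SEARCH = 0.0003
CO2_G_PER_SEARCH = 0.2

world_gwh = WORLD_SEARCHES * KWH_PER_SEARCH / 1e6      # ~11.13 GWh / month
world_t_co2 = WORLD_SEARCHES * CO2_G_PER_SEARCH / 1e6  # ~7,420 t CO2 / month

print(f"worldwide, per month: {world_gwh:.2f} GWh, {world_t_co2:.0f} t CO2")
```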

The dose makes the poison:


A sixteenth-century Swiss chemist named Paracelsus gave us the most basic rule of toxicology: "The dose makes the poison." Practically every substance on earth can kill you if it's concentrated enough.

We can easily transpose this principle to the ecological impact of human activity on our environment. In this case, Google downplayed the ecological cost of a single search while carefully avoiding any mention of the "dose".