It is what it is....

Wednesday, April 18, 2007

Sun's Blackbox

Get your mind out of the gutter, I'm referring to their portable datacenter. I was able to attend one of Sun's introductory briefings today in Menlo Park. When Jonathan Schwartz first announced this as a product I was very skeptical and threatened. Skeptical because these containers are 160 sq ft and can support a 200kW draw. That's 1,250 watts per square foot, albeit very isolated. And threatened because of the potential disruptive effect these new devices could have on the traditional datacenter market, my livelihood. Kinda.
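The density math is simple enough to sanity-check; a quick sketch using the figures above (both numbers are from this post, not a Sun spec sheet):

```python
# Power density of one Blackbox container, per the figures in this post.
container_draw_w = 200_000    # 200kW of supported draw
container_area_sqft = 160     # container floor area

density_w_per_sqft = container_draw_w / container_area_sqft
print(f"{density_w_per_sqft:.0f} W/sq ft")  # → 1250 W/sq ft
```

That's more than an order of magnitude denser than the traditional raised-floor numbers discussed later in this post.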

I'm still skeptical, but not as much as I was. I'm definitely not threatened, though, not because I don't believe in the viability but because the two are more complementary than exclusive.

There are a few kinks to be worked out, or shall I say, items that were quickly set aside during their presentations. But what did you expect? Marketing, marketing. Anyone know Al Hops?

Anyway, back to this Blackbox.... A couple of issues to note:

- These are NOT stand-alone units. They require:

- Multiple high-voltage power connections in a minimum N+1 config, approximately 250kW of provisioned primary power. These aren't connections you just run an extension cord for; they're serious high-voltage connections, and as such they require a serious infrastructure plant to step the power down to the voltage required by the box. You don't call PG&E up and order one of these. Typically this will be a branch on a larger power grid, and in the datacenter world it can be likened to a 12kV branch to a PDU.

- Cold water - Blackbox units require a cold water feed to cool off the payload, if you will. Supporting 200kW of draw takes approximately 30 tons of chiller capacity for these Blackboxes. The chiller doesn't come with the Blackbox and doesn't fit on or in one. In fact, a 60-ton chiller, enough capacity for three boxes, is about the size of a box itself. Chillers require power to produce cold water, and you don't just plug a chiller into your wall outlet and be on your way. It requires the same or similar connections as the Blackbox: high-voltage, high-capacity circuits.

- Water Supply - HVAC systems will lose water to condensation, evaporation, leaks, overflows, etc., and that water needs to be made up to ensure smooth sailing. Maintaining an N+1 design, you need two supplies of water from separate suppliers. One is obviously your regular water supply, but what about the second? Dig a well like most datacenters do?

- UPS systems - There aren't any. Seriously. So that should tell me who the target customer is. Someone who doesn't care about uptime? Then why the hell buy all this crap? Why not host it on Amazon S3 or MediaTemple? Who doesn't care about uptime? Google is the only company I can think of, actually Amazon too, who wouldn't care if they lost 8 racks of servers. I just don't think Sun is far enough along to have a UPS solution that doesn't make you take a step back and say, 'Wait a second, where the hell am I going to park five tractor trailers so I can operate my 24 racks?' (That's three actual Blackbox containers, one container for the generator and batteries, and one container for the chiller.)
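Pulling the list above together: a Blackbox deployment is really a small convoy of containers. Here's a rough sketch of the trailer count, using this post's own ratios (8 racks per box, and one generator/battery container plus one 60-ton chiller container per three boxes); the helper name and the exact per-three-box generator ratio are my own framing, not Sun's:

```python
import math

RACKS_PER_BOX = 8        # 24 racks across 3 boxes, per the example above
BOXES_PER_CHILLER = 3    # a 60-ton chiller covers about 3 boxes
BOXES_PER_GEN = 3        # assumption: one generator/battery container per 3 boxes

def containers_needed(racks: int) -> dict:
    """Rough container count for a given rack count (hypothetical helper)."""
    boxes = math.ceil(racks / RACKS_PER_BOX)
    chillers = math.ceil(boxes / BOXES_PER_CHILLER)
    gens = math.ceil(boxes / BOXES_PER_GEN)
    return {"blackboxes": boxes, "chillers": chillers, "generators": gens,
            "total_trailers": boxes + chillers + gens}

# 24 racks -> 3 Blackboxes + 1 chiller + 1 generator container: 5 trailers
print(containers_needed(24))
```

The point isn't the exact count; it's that the "box" is the smallest and simplest part of the parking lot.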

I sound like I'm bagging on Sun, but I'm not really. I like the idea and know it's a definite winner in niche applications: military use, natural disaster response, isolated locations where it can be airlifted in, and so on.

The thing is, even if Sun owned the entire market for those specific applications, it still wouldn't get Sun where it needs to be; that market is just too limited in size. Sun needs to find a way to make these Boxes the de facto standard choice when a company begins evaluating datacenter options. That, or sell the concept to the colo vendors by showing that the Boxes can compete economically with a standard raised-floor environment. Coincidentally, just like a regular datacenter, supporting a few of these boxes requires a significant MEP plant, which is essentially the bread and butter of a datacenter, and datacenter operators are experts at managing MEP. It's a nice fit.

I liken the potential of Blackbox-type architecture to what consumers are using Amazon's S3 grid or Google's own infrastructure (GoogleOS) for: a shared IT resource that supports unique data for each user and leverages commonalities among users. Everything is virtually connected and resources are shared, so if one goes down it doesn't matter, yet the performance benefits of close proximity remain.

Cost. The fully built-out container (without the computers, chiller, generator, and truck or helicopter to transport it) currently costs $500k to build. Sun alluded to a price point of $250k as the one they're shooting for. $250k for 200kW isn't a bad deal. Equinix spends about $25k per rack, or $1,000/sq ft, for a 2.5kW rack. In gross numbers, Sun's Box looks good at roughly $1,250/kW, while a traditional datacenter, per Equinix's rough costs, comes in at $10,000 per kW. I don't know what the chiller plant, electrical switchgear, etc. would cost, but I imagine it can't be more than 60% of the total construction cost of the traditional build. So add another $6,000 per kW and multiply that sum, $7,250, by the number of kW of draw to get your total cost for the Box and the supporting MEP gear. In this case it's roughly $1.45MM for 200kW of datacenter equivalent. For Equinix, it would cost $2MM+.
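The back-of-the-envelope comparison above, spelled out (all dollar figures are from this post; the 60% MEP adder is my guess, as noted):

```python
# Blackbox side: Sun's target price over supported draw.
box_price = 250_000                      # Sun's stated target price point
box_kw = 200
box_cost_per_kw = box_price / box_kw     # $1,250/kW

# Traditional side: Equinix's rough build cost.
rack_cost = 25_000                       # ~$25k per rack
rack_kw = 2.5                            # a 2.5kW rack
trad_cost_per_kw = rack_cost / rack_kw   # $10,000/kW

# Add the supporting MEP plant to the Box (assumed ~60% of traditional cost).
mep_per_kw = 0.6 * trad_cost_per_kw                  # $6,000/kW
box_total = (box_cost_per_kw + mep_per_kw) * box_kw  # $1.45MM for 200kW
trad_total = trad_cost_per_kw * box_kw               # $2.0MM for 200kW

print(f"Box + MEP: ${box_total:,.0f}  vs  traditional: ${trad_total:,.0f}")
```

Even with a generous MEP adder, the Box comes in well under the traditional build on this rough math.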

Lots of potential with this product, but in order to be adopted at scale it needs to demonstrate an economic benefit in addition to the obvious operational ones.
