My Questions About Project Blackbox

I got to see Sun’s Project Blackbox today, over at the University of Wisconsin – Milwaukee.

First impression: interesting idea, but it's obviously a first attempt at implementation.

Now, my questions:

1. What would the money spent on a Blackbox get you in offsite hosting?

2. If you have the space to store a 20 foot shipping container and a chiller, could you not just build a new data center in that space?

3. What keeps vandals and competitors from hijacking or severing the power, cooling, and external network connections? Is a whole new industry going to sprout up for data center trailer parks, with armed guards? It’s just a colocation facility writ large.

4. When Sun says that these things are stackable, does that mean system administrators need to sprout wings to get to the servers? Won’t that be a total pain in the duff? Space efficiency and ease of administration are mutually exclusive, in my experience (and marketers and engineers don’t have to deal with these products on a daily basis).

5. Why doesn’t this come in a 40 foot container?

6. Why doesn’t it have an integrated cooling option (perhaps in a 40 foot container)? You have to provide your own chiller, which makes the unit decidedly not standalone (beyond power requirements, obviously). Your chiller isn’t going to have the same level of imperviousness to the environment, so I see that as an Achilles heel.

7. Why don’t the racks have a better way to slide out, instead of that clumsy dolly thing? It was tough for the demoer to slide the rack out because the container wasn’t quite level. Will I have that same problem? The demoer mentioned the shipping product will have a dolly that moves in two dimensions. That sounds even worse, since I’ll have to manhandle a heavy rack in four directions, with running servers in it. I worry about the cable management.

8. Why does this thing need GPS? Does that mean it’s so easy to move that someone is going to steal it? Do we now need LoJack for our data centers?

9. Can I buy these things preconfigured, so my staff doesn’t have to spend time building and cabling them? A Blackbox preconfigured with and for VMware would be awesome. Ditto for Solaris zones. Plug it in and go. (A rough sketch of what that kind of zone preconfiguration might look like follows after this list.)

10. Why are the power and cooling connections on the side, sticking out? Does this mean I can’t park two of them side by side? Ditto for the network connections, which are on the other side, so I couldn’t even park two side by side in opposite directions. To me it seems like these should be on the short ends, and all the connections in one place.
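
To make question 9 a bit more concrete, here’s a minimal sketch of the sort of zone-preconfiguration script I’m imagining, assuming plain zonecfg and zoneadm on the Solaris hosts. The zone names and paths are invented for illustration; a real factory image would obviously also handle networking, patching, and resource caps.

```python
#!/usr/bin/env python3
# Hypothetical preconfiguration sketch: define, install, and boot a few Solaris
# zones so a host shows up ready to use. Zone names and paths are made up for
# illustration; networking, resource controls, and error recovery are omitted.
import subprocess

ZONES = {
    "web01": "/zones/web01",
    "web02": "/zones/web02",
    "db01": "/zones/db01",
}

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

for name, path in ZONES.items():
    # Define the zone configuration non-interactively.
    run(["zonecfg", "-z", name,
         "create; set zonepath={}; set autoboot=true; commit".format(path)])
    # Install the zone's root filesystem, then boot it.
    run(["zoneadm", "-z", name, "install"])
    run(["zoneadm", "-z", name, "boot"])
```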

In doing some searching now, I see Rackable has come out with their version of Blackbox, called Concentro. Now I have one more big question:

11. Why didn’t Sun go all the way on the power savings front and offer configurations that do what Rackable does with DC power?

I do call shenanigans on Rackable’s claimed 1200U of space, since it relies on half-rack servers. The half-rack layout is neat, though: it allows convenient cooling up the center of the rack column, with servers back to back. Those are proprietary systems, so they aren’t as flexible as the Blackbox for loading the container with whatever you need, but it also means Rackable can address certain problems, like power & cooling, more easily. Concentro is a 40 foot container, so per 20 feet of container it’s comparable to Blackbox (300U vs. 266U). Like Sun, they also don’t supply any sort of integrated cooling.
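
For what it’s worth, here’s the back-of-envelope math behind that comparison, using only the figures above (the 1200U claim, the half-rack caveat, the container lengths, and Blackbox’s 266U):

```python
# Rough normalization of Rackable's Concentro density claim against Blackbox.
# All inputs come from the figures quoted above; treat it as back-of-envelope.
concentro_claimed_u = 1200                         # Rackable's claim for a 40 ft Concentro (half-rack slots)
concentro_full_width_u = concentro_claimed_u / 2   # two half-rack servers share each full-width U
concentro_length_ft = 40
blackbox_u = 266                                   # Blackbox capacity in a 20 ft container
blackbox_length_ft = 20

# Normalize both to a 20 ft footprint for an apples-to-apples comparison.
concentro_per_20ft = concentro_full_width_u * (blackbox_length_ft / concentro_length_ft)
print(f"Concentro, full-width U per 20 ft: {concentro_per_20ft:.0f}")  # ~300
print(f"Blackbox U per 20 ft:              {blackbox_u}")              # 266
```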

Overall, I think Sun could volley back at Rackable pretty easily with some custom, pre-loaded configurations based on DC power and integrated KVM & systems management. Hopefully they were planning to do that anyhow and the information is just scarce. But at the same time, the demo left me with more questions than answers, which isn’t cool. To be fair, I didn’t think of a lot of these until afterwards.

My conclusion: containerized data centers are the new blades. They’re proprietary, solve some customers’ problems, and introduce their own set of security and operational problems. It’ll be interesting to see who is buying them, what problems they are solving with them, and how these units are getting deployed.

5 thoughts on “My Questions About Project Blackbox”

  1. I have only seen pics of these online, so I don’t know how practical or usable they are, but I can think of a few places where these could be deployed, mostly as temp solutions or in places lacking proper infrastructure.

    Off the top of my head I can think of places like war zones for military use, disaster relief areas like during the tsunami or Katrina, or even third-world countries where a data center might be needed temporarily in the field. Even installations like the Olympics could be viable with this; I know they always build out huge data-center-like buildings for two weeks of broadcast.

    Also, I believe they come custom installed, but I’m not sure. As for the GPS, it does seem like they are expecting these to be stolen a lot.

  2. Answers:

    1. A Sun Blackbox with 240 x64 boxes will run just shy of $2 million when you factor in a prepackaged chiller and distribution transformer. (Funny how Sun neglects to tell you their container needs hundreds of amps of 208v power, not the 480v that’s readily available straight off the pole.) Need filtered UPS power, a generator, and a transfer switch? Add another $250-500k. And where do you put the UPS? (A back-of-envelope take on the amperage is sketched after this list.)

    2. Yes, at over twice the cost per square foot. There are other benefits to being mobile, including taxes and code compliance issues.

    3. The key is the efficiencies this design can bring to the table. The largest users of massive compute capacity are already building their own facilities in modular fashion. Containerized datacenter modules can cut their costs and significantly shorten build times.

    4. It’s unlikely Sun has thought stacking through. It’s one thing at a dock with a specialized 100 ton forklift; it’s another thing altogether when you’re trying to get some locals with a crane to rig a 15 ton, $2 million load.

    5. Sun is targeting the world market, and 40 foot containers are not prevalent inland anywhere other than the US.

    6. The cooling plant needs to be matched to the environment. It’s up to the customer to choose ease of deployment (dry cooling) vs. complex deployment (an evaporative water cooler with storage tanks and economizers).

    7. Worse yet, how do you service the servers once the rack is in the aisle? Open both ends of the container so you can walk around to the other side to unplug/plug cables? What about dust and other airborne particulates while this is going on? I don’t buy Sun’s fail-in-place model. No one can afford to do that except Google, with their dirt-cheap servers.

    8. GPS is for tracking, both in transit and once stationary. GPS tied to a GSM modem is only half the solution; there still need to be additional security measures. GSM jammers are easy to purchase.

    9. That is Sun’s plan.

    10. They’re on the sides because Sun used a container with doors on both ends, which is necessitated by their service model for the racks inside. You wouldn’t want these too close together anyway, as even a good truck driver isn’t going to get them closer than 6″.

    11. Better question: why didn’t Sun tune their servers for the container and integrate UPSes or the necessary 480v-to-208v three-phase transformer? Rackable natively takes 480v and has integrated power backup.

    Sun can only do 240 1U hosts per container. Rackable can do 600 1U hosts in a 20′ container. If you need compute nodes, that’s over twice the servers per container.
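
    To put rough numbers on “hundreds of amps”: taking the 240 hosts and 208v from this comment, and assuming about 400 W per 1U server at a 0.95 power factor (my guesses, not Sun’s figures), the IT load alone works out to roughly 280 A of three-phase power, before any cooling overhead:

    ```python
    import math

    # Host count and voltage are from the comment above; per-server wattage and
    # power factor are assumed values, so treat the result as order-of-magnitude.
    hosts = 240              # 1U x64 hosts per Blackbox
    watts_per_host = 400     # assumed average draw per 1U x64 server
    voltage = 208            # three-phase service the container reportedly needs
    power_factor = 0.95      # assumed

    total_w = hosts * watts_per_host
    amps = total_w / (math.sqrt(3) * voltage * power_factor)
    print(f"IT load: ~{total_w / 1000:.0f} kW, roughly {amps:.0f} A at 208v three-phase")

    # Host-density comparison from the same comment: 600 vs. 240 1U hosts.
    print(f"Rackable vs. Sun hosts per container: {600 / 240:.1f}x")
    ```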

  3. Awesome! Thanks for the comment.

    I hadn’t thought about the in-aisle servicing. Maybe you take two people in with you, one on one side, one on the other. 🙂 (lame!)

  4. lol@
    “Do we now need LoJack for our data centers?”

    Pssst! Hey man… want to buy a datacenter…?

  5. Very nice article!

    I am registered to see the BlackBox on tour this Friday in Des Moines. I am hoping to take some pictures, but wonder if they will let me.

    I also envision the trailer park full of containers — perhaps you can request HBO as well for when you are administering servers inside. 🙂

    Look for my review/impressions on BlackBox this weekend.
    (http://datacenterlinks.blogspot.com/)

    -John
