BY Meghan Fuller Hanna
The Boston Red Sox captured their second World Series sweep in the last four years thanks to their on-field resources: Major League Baseball's only 20-game winner, Josh Beckett; American League Rookie of the Year, Dustin Pedroia; a stellar bullpen; strong defense; and a potent lineup led by Manny Ramirez and David Ortiz.
But what was going on off the field wasn't nearly so elegant.
The team's data center facilities, housed in the cramped if hallowed walls of Fenway Park, had become too unwieldy from both a power and a cooling perspective. And there simply wasn't enough space to support the team's current or future storage and networking needs.
"We were getting to the point where the equipment was literally spilling outside the computer room," recalls Randy George, senior systems and network analyst for the Boston Red Sox. Even the team's administrative offices were overflowing with data networking equipment.
"You'd laugh if you saw it," says George of the Fenway Park data center. "It was literally maybe a 40- by 20-ft room, and we just scaled way beyond what it could support in terms of power."
The Sox had previously inked a sponsorship deal with EMC Corp., provider of information infrastructure technology. As a result, the team ended up buying "a ton of storage from those guys," in George's words, which added three or four cabinets to its existing facilities. Moreover, off-season construction adjacent to the data center left the facility "chock full of drywall dust."
Retrofitting the old ballpark with the power and cooling equipment needed to support its current operations would have been cost-prohibitive, George admits. In short, the team had no choice but to look for an external collocation facility.
"This is a 100-year-old ballpark," notes George of historic Fenway Park, "so it wasn't really designed for an enterprise data center. There are other parts of the ballpark that have power issues, too, just because it wasn't really designed for the sorts of things we're doing now."
Some of the things they are doing now include complicated video analysis that the players use to chart their performance or to prepare for an opposing team, something that was certainly unheard of when the park was erected in 1912. George estimates that the team now has upwards of 20 Tbits of video "being pulled in all sorts of directions around the ballpark." And the Sox never delete any of the video they collect, which adds to their storage challenges. "We're scaling a good 5 to 10 Tbits per year," he confirms.
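Taking George's figures at face value (roughly 20 Tbits of video today, growing 5 to 10 Tbits per year, nothing ever deleted), a back-of-the-envelope projection of the archive might look like this; the linear-growth assumption is ours, not anything the team has published:

```python
# Rough, linear projection of the never-deleted video archive, using the
# figures George cites: ~20 Tbits today, growing 5 to 10 Tbits per year.
# Illustrative arithmetic only -- not an actual Red Sox capacity plan.

def projected_archive(start_tbits, growth_per_year_tbits, years):
    """Size of the archive after `years` of linear growth, in Tbits."""
    return start_tbits + growth_per_year_tbits * years

low = projected_archive(20, 5, 5)    # conservative growth over five years
high = projected_archive(20, 10, 5)  # aggressive growth over five years
print(low, high)  # 45 70
```

Even at the conservative rate, the archive more than doubles within five years, which explains the emphasis on headroom later in the story.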
The Red Sox's data center also supports the team's internal scouting system. George estimates that 200 to 300 baseball operations staff members around the country have access to databases and information, all of which are stored in the data center. The facility also supports the team's financial and business applications.
"We probably have 40 or 50 servers now and upwards of 30 Tbits of storage," he estimates. "So we needed a ton of Ethernet going to the colo[cation facility]."
The Red Sox found collocation space that fit their needs in the Markley Group's (www.markleygroup.com) 1 Summer Street facility in the Downtown Crossing area of Boston. Located at the intersection of all major fiber loops in the Boston area, the facility hosts some 35 domestic and international carriers, as well as local institutions like The Boston Globe and Harvard Medical. The team then leased dark fiber from Lightower (www.lightower.com), a dark fiber provider in the Northeast. Now, they just needed the equipment to connect Fenway to the Markley Group's facility, which is several miles from the park.
Because the collocation facility charges its customers based on how much space they occupy and how much power they consume, George says the team was looking for equipment with a compact footprint and minimal power consumption.
The Red Sox shopped several vendors before turning to their counterparts over at ESPN for some advice. That inquiry eventually led to Ekinops (www.ekinops.com), which OEMs its equipment to the company that supports ESPN's networking needs.
"It is a very nuts-and-bolts type of option for us," says George. "We're really just looking to pump 8 to 10 gig of Ethernet through the dark fiber, and they had lower-end blades that you could use to add that sort of scalability. It fit well, the price was right, and the functionality was there."
The Red Sox use the smaller of Ekinops' two chassis, a compact, 2RU version of the Ekinops 360 DWDM platform. The team has populated the chassis with two aggregation modules: the PM 1008, which supports eight channels of Gigabit Ethernet over a single 10-Gbit/sec wavelength, and the PM 1005, which supports five 2-Gbit/sec Fibre Channel links over another 10-Gbit/sec wavelength, all over a single fiber.
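The headline numbers behind those two modules work out as follows; this is a minimal sketch using the nominal client rates given above, ignoring the framing and encoding overhead that real Ethernet and Fibre Channel line rates carry:

```python
# Check that the nominal client payloads fit on a single 10-Gbit/sec DWDM
# wavelength, as described for the PM 1008 and PM 1005 aggregation modules.
# Nominal rates only; line-rate overhead is deliberately not modeled.

WAVELENGTH_GBPS = 10.0  # one 10-Gbit/sec DWDM wavelength

def fits_on_wavelength(client_rates_gbps, capacity_gbps=WAVELENGTH_GBPS):
    """Return (total client demand in Gbit/sec, whether it fits)."""
    total = sum(client_rates_gbps)
    return total, total <= capacity_gbps

ge_total, ge_fits = fits_on_wavelength([1.0] * 8)  # PM 1008: 8x GbE
fc_total, fc_fits = fits_on_wavelength([2.0] * 5)  # PM 1005: 5x 2G FC
print(ge_total, ge_fits, fc_total, fc_fits)  # 8.0 True 10.0 True
```

Both wavelengths run at or under their nominal capacity, which is why the two modules can share a single fiber pair between the park and the collocation site.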
"Aggregation is very economic," contends Jonathan Amir, vice president of the Americas region for Ekinops. Consider the PM 1005 and its ability to aggregate five separate Fibre Channel links on one wavelength. In the past, he says, "you would take an SFP and a transponder, and you would use five wavelengths to carry those Fibre Channels. What we are able to do with the aggregator is combine them all and very cost effectively put them on a 10-gig wavelength."
Running only one fiber (with another available for protection) between Fenway Park and the collocation center enables the team to lower its capital and operational SAN extension costs.
"WDM optics are the main cost element in a DWDM network," notes Amir, "and instead of using five DWDM wavelengths, you are only using one. There's a big cost savings there."
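Amir's arithmetic can be sketched directly. The helper below is hypothetical and counts only wavelengths; since each lit wavelength needs its own set of WDM optics, wavelengths saved is a stand-in for optics sets saved, with no real prices attached:

```python
# One wavelength per client channel vs. aggregating clients onto shared
# 10-gig wavelengths. The savings Amir describes come from lighting fewer
# wavelengths, and therefore buying fewer sets of WDM optics.

def wavelengths_saved(channels, channels_per_wavelength):
    """Wavelengths (and thus optics sets) avoided by aggregating clients."""
    unaggregated = channels  # one wavelength per client channel
    aggregated = -(-channels // channels_per_wavelength)  # ceiling division
    return unaggregated - aggregated

# PM 1005 case: five Fibre Channel links collapse onto one wavelength
print(wavelengths_saved(5, 5))  # 4
```

In the Red Sox deployment that means four fewer DWDM transponder/optics sets for the Fibre Channel traffic alone, on top of the single-fiber lease savings noted above.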
For his part, George is not worried about running out of capacity, even with an ever-increasing cache of video. "Just one fiber strand is going to serve us a long time," he says. "For perspective, ESPN is pumping high-def all over the place, between all of its facilities, and I think they only have redundant dark fiber links."
George is particularly proud of the Sox's aggressive relocation timeline. The team successfully moved its facilities on the weekend of March 23-24, in time for the home opener (and World Series ring ceremony) on April 8. The relocation was entirely transparent to the users, says George, who does admit that name recognition may have helped speed things up a bit.
"A lot of our vendors rolled out the red carpet," he recalls. "We are fortunate to have the name because we can say, 'Hey, our opening day is going to be ruined if you don't come through for us.' We were lucky."
Meghan Fuller Hanna is senior editor at Lightwave.