Red Sox champion DWDM technology in data center move

May 1, 2008

BY Meghan Fuller Hanna

The Boston Red Sox captured their second World Series sweep in the last four years thanks to their on-field resources: Major League Baseball's only 20-game winner, Josh Beckett; American League Rookie of the Year Dustin Pedroia; a stellar bullpen; strong defense; and a potent lineup led by Manny Ramirez and David Ortiz.

Ninety-six-year-old Fenway Park, home of the Boston Red Sox, was not designed to support the requisite heating and cooling to satisfy the team's current data center needs, necessitating a move from the ballpark to a collocation facility several miles away.

But what was going on off the field wasn't nearly so elegant.

The team's data center facilities, housed within the cramped if hallowed walls of Fenway Park, had become unwieldy from both a power and a cooling perspective. And there simply wasn't enough space to support the team's current or future storage and networking needs.

"We were getting to the point where the equipment was literally spilling outside the computer room," recalls Randy George, senior systems and network analyst for the Boston Red Sox. Even the team's administrative offices were overflowing with data networking equipment.

"You'd laugh if you saw it," says George of the Fenway Park data center. "It was literally maybe a 40- by 20-ft room, and we just scaled way beyond what it could support in terms of power."

The Sox had previously inked a sponsorship deal with EMC Corp., provider of information infrastructure technology. As a result, the team ended up buying "a ton of storage from those guys," in George's words, which added three or four cabinets to its existing facilities. Moreover, off-season construction adjacent to the data center left the facility "chock full of drywall dust."

Retrofitting the old ballpark with the heating and cooling equipment needed to support its current operations would have been cost-prohibitive, George admits. In short, the team had no choice but to look for an external collocation facility.

"This is a 100-year-old ballpark," notes George of historic Fenway Park, "so it wasn't really designed for an enterprise data center. There are other parts of the ballpark that have power issues, too, just because it wasn't really designed for the sorts of things we're doing now."

The Ekinops 360 platform provides the Red Sox with eight channels of Gigabit Ethernet on a single 10-Gbit/sec wavelength and five 2-Gbit/sec Fibre Channel links over another 10-Gbit/sec wavelength.

Some of the things they are doing now include complicated video analysis that the players use to chart their performance or to prepare for an opposing team, something that was certainly unheard of when the park was erected in 1912. George estimates that the team now has upwards of 20 Tbytes of video "being pulled in all sorts of directions around the ballpark." And the Sox never delete any of the video they collect, which adds to their storage challenges. "We're scaling a good 5 to 10 Tbytes per year," he confirms.
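
A quick back-of-the-envelope projection puts those figures in context. The short Python sketch below takes the numbers George cites, roughly 20 Tbytes of video today growing by 5 to 10 Tbytes per year, and estimates how long an assumed provisioned ceiling would last; the 100-Tbyte ceiling and the linear-growth model are illustrative assumptions, not figures from the team.

    # Illustrative storage projection using the figures quoted in the article:
    # ~20 Tbytes of video today, growing 5-10 Tbytes per year.
    # The 100-Tbyte provisioned ceiling is an assumption made for the example.
    CURRENT_TB = 20
    GROWTH_TB_PER_YEAR = (5, 10)     # low and high growth estimates
    PROVISIONED_TB = 100             # hypothetical capacity ceiling

    for growth in GROWTH_TB_PER_YEAR:
        years = (PROVISIONED_TB - CURRENT_TB) / growth
        print(f"At {growth} Tbytes/year, {PROVISIONED_TB} Tbytes lasts about {years:.0f} more years")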

The Red Sox's data center also supports the team's internal scouting system. George estimates that 200 to 300 baseball operations staff members around the country have access to databases and information, all of which are stored in the data center. The facility also supports the team's financial and business applications.

"We probably have 40 or 50 servers now and upwards of 30 Tbytes of storage," he estimates. "So we needed a ton of Ethernet going to the colo[cation facility]."

The Red Sox found collocation space that fit their needs in the Markley Group's (www.markleygroup.com) 1 Summer Street facility in the Downtown Crossing area of Boston. Located at the intersection of all major fiber loops in the Boston area, the facility hosts some 35 domestic and international carriers, as well as local institutions like The Boston Globe and Harvard Medical. The team then leased dark fiber from Lightower (www.lightower.com), a dark fiber provider in the Northeast. Now, they just needed the equipment to connect Fenway to the Markley Group's facility, which is several miles from the park.

Because the collocation facility charges its customers based on how much space they occupy and how much power they consume, George says the team was looking for equipment with a compact footprint and minimal power consumption.

The Red Sox shopped several vendors before turning to their counterparts over at ESPN for some advice. That inquiry eventually led to Ekinops (www.ekinops.com), which OEMs its equipment to the company that supports ESPN's networking needs.

"It is a very nuts-and-bolts type of option for us," says George. "We're really just looking to pump 8 to 10 gig of Ethernet through the dark fiber, and they had lower-end blades that you could use to add that sort of scalability. It fit well, the price was right, and the functionality was there."

The Red Sox use the smaller of Ekinops's two chassis, a compact, 2RU version of the Ekinops 360 DWDM platform. The team has populated the chassis with two aggregation modules: the PM 1008, which supports eight channels of Gigabit Ethernet over a single 10-Gbit/sec wavelength, and the PM 1005, which supports five 2-Gbit/sec Fibre Channel links over another 10-Gbit/sec wavelength, all over a single fiber.
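
The arithmetic behind that configuration is straightforward: each aggregation module multiplexes several lower-rate client signals onto a single 10-Gbit/sec line-side wavelength. The Python sketch below is a back-of-the-envelope check that the client payloads named in the article fit within their wavelengths; it uses nominal client rates (1 Gbit/sec per Gigabit Ethernet channel, 2 Gbit/sec per Fibre Channel link) and ignores framing and line-coding overhead, so treat it as a simplified model rather than Ekinops's actual provisioning logic.

    # Nominal capacity check for the two aggregation modules described above.
    # Real line rates differ slightly once framing and line coding are included.
    WAVELENGTH_GBPS = 10.0   # each line-side DWDM wavelength carries 10 Gbit/sec

    modules = {
        "PM 1008": {"clients": 8, "rate_gbps": 1.0},  # 8 x Gigabit Ethernet
        "PM 1005": {"clients": 5, "rate_gbps": 2.0},  # 5 x 2-Gbit/sec Fibre Channel
    }

    for name, m in modules.items():
        payload = m["clients"] * m["rate_gbps"]
        status = "fits" if payload <= WAVELENGTH_GBPS else "oversubscribed"
        print(f"{name}: {m['clients']} x {m['rate_gbps']} Gbit/s = {payload} Gbit/s "
              f"on a {WAVELENGTH_GBPS:.0f}-Gbit/s wavelength ({status})")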

"Aggregation is very economic," contends Jonathan Amir, vice president of the Americas region for Ekinops. Consider the PM 1005 and its ability to aggregate five separate Fibre Channel links on one wavelength. In the past, he says, "you would take an SFP and a transponder, and you would use five wavelengths to carry those Fibre Channels. What we are able to do with the aggregator is combine them all and very cost effectively put them on a 10-gig wavelength."

Running only one fiber (with another available for protection) between Fenway Park and the collocation center enables the team to lower its capital and operational SAN extension costs.

"WDM optics are the main cost element in a DWDM network," notes Amir, "and instead of using five DWDM wavelengths, you are only using one. There's a big cost savings there."
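
Amir's point can be made concrete with a simple count of line-side optics: in the unaggregated design he describes, each of the five Fibre Channel links needs its own transponder and DWDM wavelength, while the aggregated design puts all five on one. The sketch below uses arbitrary relative cost units; the unit prices are assumptions made up for illustration, not Ekinops or Red Sox figures, and only the structure of the comparison comes from the article.

    # Relative-cost comparison: five dedicated wavelengths vs. one aggregated
    # 10-Gbit/sec wavelength. Cost units are arbitrary and purely illustrative.
    FC_LINKS = 5
    TRANSPONDER_COST = 1.0   # assumed relative cost of one transponder + wavelength
    AGGREGATOR_COST = 1.5    # assumed relative cost of one aggregation module

    dedicated = FC_LINKS * TRANSPONDER_COST   # five transponders, five wavelengths
    aggregated = AGGREGATOR_COST              # one module, one wavelength

    print(f"Dedicated:  {FC_LINKS} transponders, {FC_LINKS} wavelengths, relative cost {dedicated}")
    print(f"Aggregated: 1 module, 1 wavelength, relative cost {aggregated}")
    print(f"Wavelengths freed up: {FC_LINKS - 1}")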

For his part, George is not worried about running out of capacity, even with an ever-increasing cache of video. "Just one fiber strand is going to serve us a long time," he says. "For perspective, ESPN is pumping high-def all over the place, between all of its facilities, and I think they only have redundant dark fiber links."

George is particularly proud of the Sox's aggressive relocation timeline. The team successfully moved its facilities on the weekend of March 23-24, in time for the home opener and World Series ring ceremony on April 8. The relocation was entirely transparent to the users, says George, who does admit that name recognition may have helped speed things up a bit.

"A lot of our vendors rolled out the red carpet," he recalls. "We are fortunate to have the name because we can say, 'Hey, our opening day is going to be ruined if you don't come through for us.' We were lucky."

Meghan Fuller Hanna is senior editor at Lightwave.
