Alternatives to the local phone loop stumble on cost, property rights, and slow PCs

Oct. 1, 1999


AT&T's cable venture looks more costly than ever; wireless carriers square off against commercial and apartment building owners; and a Canadian plan for Gigabit Ethernet to the home is stymied by slow NICs in PCs.

Years ago, when Dennis Patrick was chairman of the Federal Communications Commission (FCC), the "2-wire" model of competition was in vogue. The basic idea was that the local telephone company's dominance would be eroded if there were a second telecommunications connection reaching the home and business. Unfortunately, there isn't good news right now for companies trying to put the theory into practice.

For example, AT&T is rethinking its cable-network architecture. In May, the company announced that its trial of voice, digital TV, and high-speed Internet services in Salt Lake City would be carried out with nodes ranging in size from 50 to 75 customers, well below Time Warner's standard node size of 500 customers and a huge reduction from the nodes of 1000 to 2000 customers that were typical in cable systems before 1997. Smaller nodes result from the increasing use of fiber in cable-distribution networks aimed at raising operating frequencies from 300 MHz to more than 800 MHz.

But that observation misses the implication of AT&T's Salt Lake City trial, which suggests the company faces substantial construction expense if 75 subscribers becomes the standard for all nodes inherited from TCI and MediaOne. Having already paid from $4500 to $5400 per subscriber for its cable purchases, an additional $600 to $1000 per subscriber could make AT&T's second wire to the home far too expensive for people with ordinary incomes. Add this negative prospect to declining profit margins in the long-distance market plus the imminent long-distance service offerings by Bell Atlantic and other local companies, and the result is a second wire that may not support itself financially, a point the company's key people conceded to themselves in July during a meeting in Seattle.
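The per-subscriber arithmetic can be tallied in a short sketch. The dollar ranges are the figures reported above; the totals are simple sums, not AT&T's own projections:

```python
# Per-subscriber cost ranges cited for AT&T's cable strategy.
acquisition = (4500, 5400)    # paid per subscriber in the TCI/MediaOne purchases ($)
node_upgrade = (600, 1000)    # added per-subscriber cost to shrink nodes to ~75 homes ($)

total_low = acquisition[0] + node_upgrade[0]
total_high = acquisition[1] + node_upgrade[1]

print(f"Total outlay per subscriber: ${total_low}-${total_high}")
# -> Total outlay per subscriber: $5100-$6400
```

With both figures included, the second wire carries $5100 to $6400 of cost per subscriber before any service revenue arrives.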

AT&T's predicament, like MCI's $800-million loss on its pre-WorldCom efforts to develop local services, is testimony to the difficulty of supplanting the phone company's local loop with a second telecommunications connection. But the 2-wire theory lives on in the federal government.

In April, Joel Klein, the U.S. Justice Department's assistant attorney general for antitrust, told a meeting of antitrust lawyers that "technically, there is not a means to get multiple means of access to the home." He emphasized there would be no competition until "alternative technologies" permeate local communications markets.

Klein did not mention a wireless loop, but his choice of words fits neatly with the FCC's view of wireless technology as the second "wire." In June, the agency established docket FCC 99-141, a rulemaking meant to facilitate a wireless service provider's access to right-of-way. A month later, Dale Hatfield, the FCC's chief of engineering and technology, reiterated the agency's 2-wire theme.

Speaking at the IEEE Radio and Wireless meeting in Denver in August, he said, "If this nation is to enjoy the...benefits envisioned by...the Telecommunications Act of 1996, we need wireless systems as full-fledged competitors in the provision of local telecommunications services. We need them not only in the provision of broadband services to business customers, but also on a widespread basis to ordinary residential customers as well."

Hatfield's comments paraphrase the remarks of the docket's prime mover, Winstar, a major wireless service provider. At an FCC hearing in July 1998, the company's vice chairman, Steven Ghrust, told the agency it should direct its efforts "toward creating alternative local broadband capacity, because if we don't do that, we'll find ourselves...in an environment where DSL [digital subscriber line] will not meet the demands of the marketplace." Two years earlier, in FCC docket 96-98, Winstar told the agency that "access to roofs and risers by definition is access to critical rights-of-way" and that failure to afford such access is "unreasonable discrimination against...alternative technologies," the very thing that Klein and Hatfield say the country needs.

But if wireless loops are supposed to compete with wired loops all over the country, the FCC has to settle the issue of property owners supposedly gouging wireless companies. Winstar told the agency that building owners and managers demand fees and other charges "not imposed on incumbent carriers." In one instance, a building manager demanded a rooftop access fee of $1000 per month and a $100-per-month fee for each hookup in the building, amounting to more than $100,000 per year for Winstar. Yet, it is an open question whether property owners have a legal obligation to render wireless service providers the same treatment rendered to incumbents.
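The fee example above implies a sizable hookup count. A quick back-of-the-envelope check (the $1000 and $100 monthly fees are from the Winstar filing; the hookup count is derived here, not reported):

```python
roof_fee_monthly = 1000    # rooftop access fee ($/month)
hookup_fee_monthly = 100   # fee per hookup in the building ($/month)

def annual_cost(hookups: int) -> int:
    """Annual charge for one building at the quoted rates."""
    return 12 * (roof_fee_monthly + hookup_fee_monthly * hookups)

# Smallest hookup count at which the annual bill tops $100,000:
hookups = 0
while annual_cost(hookups) <= 100_000:
    hookups += 1
print(hookups, annual_cost(hookups))
# -> 74 100800
```

In other words, the building in question would need roughly 74 hookups for the bill to exceed $100,000 per year, which gives a sense of the scale of tenancy involved.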

Potentially, this situation could be a huge problem because there are 750,000 commercial and apartment buildings nationwide receiving telecommunications service, but only a fraction of them are covered by agreements with Winstar and other wireless providers. Any decision the FCC makes to grant easier right-of-way access to wireless companies is sure to be challenged by real-estate groups such as the National Association of Realtors, the Building Owners and Managers Association, and the National Apartment Association. These organizations have already appealed a prior FCC order, and if they resist the FCC's order in 99-141, there may be no near-term "second wire" to the home and business.

Canarie Inc. of Canada proposes a "third wire" to the home. Its director of network projects argues in a discussion paper, "Gigabit Ethernet to Every Canadian Home by 2005," that gigabit speeds are economical in a new local loop and that wireless access has too many problems.

The director, Bill St. Arnaud, says, "Wireless access may be optimum for remote and rural access, [but] the high cost of roof access, the lack of line-of-sight access, disruption from extreme weather, and interference from other radio-emitting sources will present serious challenges in obtaining full coverage in urban locations....Usable bandwidth [is limited]....Wireless data rates up to 100 Mbits/sec [are conceivable, but] it will be quite another challenge for wireless to scale to Gigabit Ethernet speeds."

St. Arnaud opines, "Voice telephony is moving to a wireless environment and cable TV may be more efficient [over] direct broadcast satellite, making the old copper and coax plants obsolete." He suggests that the Internet may be the "only wired terrestrial service" left, and it "probably does not make sense to try to integrate traditional [telephone and cable-TV] services with Internet delivery." He proposes a new network dedicated to Internet traffic running "in parallel" with the phone and cable networks, but the plan has an Achilles heel.

St. Arnaud sees the computer as "the principal appliance in the home that would connect to this network, [which] would directly connect to the high-speed...simple Ethernet interface...now readily available for most home computers." But according to tests reported by Network World magazine, home computers cannot handle gigabit speeds because the gigabit network interface card (NIC) is much too slow. For each second that 1000 Mbits went into the card, only 30 Mbits were processed. Jim Hayes, president of Fotec, explained the poor performance: "PCs use software to process incoming data, while switches use firmware and hardware. PCs cannot process data fast enough to utilize gigabit speed....It requires far more memory and integrated circuits than standard Ethernet NICs now have." He added, "If the cards are improved then you're driving up PC costs, and that cuts against consumers' preference for cheaper computers," another factor working against the commercialization of a gigabit-to-the-home network.
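The 30-to-1 gap Network World measured translates directly into transfer times. A small sketch (the 1000-Mbit/sec line rate and 30-Mbit/sec processed rate are from the test reported above; the file size is an arbitrary example):

```python
wire_rate_mbps = 1000    # Gigabit Ethernet line rate delivered into the NIC
processed_mbps = 30      # rate the PC actually processed (Network World test)

def transfer_seconds(megabits: float, rate_mbps: float) -> float:
    """Time to move a payload at a given sustained rate."""
    return megabits / rate_mbps

file_mbits = 800  # a hypothetical 100-megabyte file = 800 megabits
print(f"At line rate:  {transfer_seconds(file_mbits, wire_rate_mbps):.1f} s")
print(f"As processed:  {transfer_seconds(file_mbits, processed_mbps):.1f} s")
# -> At line rate:  0.8 s
# -> As processed:  26.7 s
```

The PC bottleneck stretches a sub-second gigabit transfer to nearly half a minute, leaving only 3% of the link's capacity usable.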

Hayes's comments are a replay of an observation made at the FCC's broadband forum held January 1997. Bell Atlantic's then vice president of technology, Pat White, told the FCC: "You have to take a really total network solution....2 or 3 kilobits per second is what you normally get when you download files from the Internet....There's almost nothing that we could do at least within the immediate future to make use of the [Internet's] bandwidth. I mean the PCs are not there." Almost three years later, they are still not.

The 2-wire theory emerged more than 15 years ago, but has yet to succeed. And if it doesn't succeed soon, it won't get a second chance.

Stephen N. Brown writes on public policy in telecommunications. He can be contacted by e-mail at [email protected] or telephone: (615) 399-1239.
