The understatement of the first half century involving optics may have come from a Ciena executive at the Deutsche Bank Technology Conference in September 2016: “[W]e think 100-gigabit is probably going to be the currency for some time still. That is a lot of capacity. I think we all get desensitized to it because everybody talks about 100-gigabit.” Over 20 years ago, this writer predicted the following as president of Trans-Formation: “Given the limitations on today’s technology, it is likely that the maximum capacity on a commercially available time-division multiplexer will remain at 10 Gbits/sec for at least 10 to 15 years.” That range turned out to be accurate, but the most revealing fact was that in 2014, about 90% of transport capacity was still at 10G. In addition, the large Web 2.0 operators, which are today either moving or in the process of migrating to 100G, represent less than 10 percent of the world’s data centers, while the rest are still predominantly dealing with lots of 1-gigs and a good number of 10-gigs. In essence, despite all the discussion for so many years about exploding bandwidth demands, it is simply unreasonable to assume that such a significant leap in capacity as 100G, which is still only in the nascent stages of becoming a universal data rate (not to mention the role that 40G will continue to play), will not last well into the 21st century.
Naturally, on the long-haul side, higher speeds, including 200G, are already in use, and the growing hype for 400G will make it appear poised to start taking over the world in the foreseeable future. Nevertheless, we have pointed out in the past other important considerations favoring staying with 100G for as long as possible, such as the fact that the overwhelming bulk of the spending on it (unprecedented in this market) was done internally by the vendors, as well as the need for service providers to avoid moving to higher-cost test sets. There is also no reduction in reach in sticking with the lower rate.
It should further be noted that the purchase of Xtera Communications’ assets by a very well-financed private equity firm may turn out to be quite important in keeping the overall business at 100G. Before declaring bankruptcy, Xtera had begun exploring a terrestrial upgrade market by adding the L band to the current C band. The notion was to swap out the amplifiers in order to increase the channel count; both types of channels would be carried in an alien-wavelength-type system. (These systems would have an impact on the big data center entities as well by allowing them simply to hook the gear up to a router without the need for a protocol or anything else.) Of course, the S band may be an option as well.
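To see why opening the L band roughly doubles or better the usable channel count, a back-of-the-envelope calculation helps. The sketch below uses the commonly cited band edges (C band roughly 1530–1565 nm, L band roughly 1565–1625 nm) and a 50 GHz DWDM grid; these figures are illustrative approximations, not any vendor’s specification.

```python
# Rough illustration of why adding the L band to the C band increases
# DWDM channel count. Band edges and grid spacing are illustrative
# approximations, not a specific system's specification.
C_MPS = 299_792_458  # speed of light, m/s

def channels(lambda_short_nm, lambda_long_nm, grid_ghz=50):
    """Approximate channel count for a band on a fixed frequency grid."""
    f_high_ghz = C_MPS / (lambda_short_nm * 1e-9) / 1e9
    f_low_ghz = C_MPS / (lambda_long_nm * 1e-9) / 1e9
    return int((f_high_ghz - f_low_ghz) // grid_ghz)

c_band = channels(1530, 1565)  # conventional band, ~87 slots at 50 GHz
l_band = channels(1565, 1625)  # long band, wider in frequency terms
print(c_band, l_band, c_band + l_band)
```

The point of the exercise is simply that the L band spans an even larger slice of frequency than the C band, so an amplifier swap that lights both bands more than doubles the channel count on the same fiber.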
Another advantage of the lion’s share of the market sticking with 100G indefinitely is that it potentially eliminates the cost and complexity of any widespread development of serial 50G or serial 100G components for the client side. The 4x25G opportunity can therefore continue to go strong for an extremely long period of time without being compelled to reduce prices as much, because the threat from higher-capacity components recedes.
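The lane arithmetic behind this point can be made explicit. The minimal sketch below (purely illustrative, with no reference to any particular module spec) shows that four parallel 25G lanes already deliver the full 100G client rate, which is why serial 50G or 100G lanes are an optimization rather than a necessity.

```python
# Purely illustrative client-side lane arithmetic: 4x25G already sums
# to 100G, so serial 50G/100G lanes are not required to reach that rate.
def aggregate_gbps(lane_count, gbps_per_lane):
    """Total client rate from parallel electrical lanes."""
    return lane_count * gbps_per_lane

print(aggregate_gbps(4, 25))   # 4x25G configuration
print(aggregate_gbps(2, 50))   # hypothetical 2x50G configuration
print(aggregate_gbps(1, 100))  # hypothetical serial 100G lane
```

All three configurations land at the same aggregate rate; the differences are in component cost and complexity, which is exactly the trade-off the paragraph above describes.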
The small number of hyperscale customers that may still be interested in the development of higher-speed componentry, in order to potentially reduce cost, can more than afford to pay for these innovations themselves, if necessary.
Please consider our reports, including Clash of Metro 100G Optical Vendors with Shifting Network Paradigm and Clash of Optical Component Vendors & Technologies in Data Center Networks.
For more frequent updates, please follow us here.
[written by Mark Lutkowitz]
For additional commentary on this article, please see: https://www.linkedin.com/pulse/100g-remain-dominant-decades-mark-lutkowitz/