When it comes to defending its overwhelmingly dominant position on the Internet or squeezing the very last penny out of its network costs, Google will play hardball with the best of them. Yet, when it comes to the quite hyperbolic attacks claiming that the company’s very existence plays a role in destroying the earth, it has almost totally capitulated (partially for political reasons), dramatically increasing the pressure on other cloud players to do likewise. This capitulation appears to be a big reason that these huge enterprises are building fewer and larger Data Centers (DCs), which will probably have a negative impact on the fiber optic industry.
Many people in the world are genuinely concerned about the environment and will support reasonable changes to protect the planet. There are others who erroneously view any kind of technological advancement, especially in western countries, as a threat to the large percentage of the world’s population living in impoverished conditions, and who, under the guise of environmentalism, will do everything possible to slow down industrial progress. Ironically, if these elitists were really just preoccupied with the effects of carbon emissions, instead of focusing so much on the Googles of the world, they would spend the bulk of their time urging nations to more quickly provide funds to the over one billion people lacking access to electricity – substantially reducing the use of primitive fuels, such as kerosene, wood, and other biomass (which produce 50% more carbon dioxide per unit of energy than coal).
Of course, right from the start of the Internet explosion, there was a steady chorus from the extremists claiming that the web was harming the environment. A big turning point was an attack by a physicist on the carbon footprint of Google searches in 2009. Although the firm countered at the time with statistics on the relatively low amount of energy utilized, it was obviously not willing to stick to its guns. (Google also realized that the number of searches was going to increase substantially – we estimate that in 2014, the corporation was approaching 600 billion watt-hours per year for searches, compared with about 290 billion watt-hours annually in 2009.)
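As a quick sanity check on those estimates, the implied growth rate can be worked out in a few lines. This is only an illustrative sketch; the two energy figures are our own estimates from the paragraph above, and the compound-growth calculation is simple arithmetic, not anything disclosed by Google:

```python
# Illustrative sketch: implied annual growth in Google's estimated
# search-related energy use, using only the figures cited above.

energy_2009 = 290e9  # our estimated annual search energy, 2009
energy_2014 = 600e9  # our estimated annual search energy, 2014
years = 2014 - 2009

growth_ratio = energy_2014 / energy_2009
annual_growth = growth_ratio ** (1 / years) - 1

print(f"Total growth, 2009-2014: {growth_ratio:.2f}x")
print(f"Implied compound annual growth: {annual_growth:.1%}")
```

In other words, these figures imply that search energy use roughly doubled over five years, a compound growth rate in the mid-teens annually.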
Still, Google’s decision to become “carbon neutral,” including through the use of “carbon offsetting,” opened up the floodgates. With these “offsetting” acts, a company can play the silly game of funding the planting of trees, which will theoretically, at some point in the future, balance out its current carbon output.
Eventually, self-appointed guardians of the environment began demanding that all large enterprises disclose their carbon emissions. Many of them have acquiesced because their talented bean counters can do wonders with carbon offsetting and even turn it into a competitive advantage, especially if it means drawing attention away from any of their actual shortcomings as a business. Still, meeting that expectation is a much harder task for any company that happens to be farming out its data center capabilities.
Ultimately, hyperscale DC operators, like Google and Microsoft, started to build enormous DCs, presumably at least in part to take full advantage of areas offering green energy, such as hydroelectric facilities. According to Google, wind farms are only economically viable in places with robust and constant winds. Be that as it may, as pointed out in a Data Center Knowledge article, many of these expenditures by the mega-DC operators on wind and solar farms are long-term power purchase agreements, which provide cash for development; however, the power to the DCs does not commonly come straight from those facilities, but from the standard grid. (The renewable energy is bought on the wholesale market, and the offsetting game mentioned above is played again, with these enterprises gaining energy credits.)
While hydropower can provide very cheap electricity, environmentalists are working hard to shut down dams in order to protect wildlife. Concerning solar power, considerably more R&D spending would be necessary to make it ready for prime time in a big way.
Despite all of the maneuvering that has occurred, the bottom line is that DCs will remain heavily dependent on fossil fuels well into the future because, in general, fossil fuels provide the cheapest and most efficient means of generating power. Any objective study of all of the data would, most charitably, suggest that the conclusions that can be drawn about climate change resulting from human-caused carbon emissions are all over the map, and at worst, would indicate the perpetration of the biggest hoax in modern civilization. In the past, environmental predictions have had a more dismal record of accuracy than the prognostications of the cheerleaders at many of the market research firms covering the telecom space.
Concerning the optical business, fewer DCs obviously mean less opportunity for interconnection. More spacious structures could be expected to result in intra-DC communications maxing out at the 100G data rate. Given that in the recent past, it was not unusual for Microsoft to have half a million 10-gig links inside the DC, a similar number of 100G connections would be a heck of a lot of capacity – and with square footage representing such a cheap component, there could be an even greater amount of deployment at that speed.
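For a rough sense of the scale involved, the aggregate capacity implied by that link count works out as follows. This is strictly a back-of-the-envelope sketch: the half-million figure is the one cited above for Microsoft, and the rest is simple multiplication:

```python
# Back-of-the-envelope: aggregate intra-DC capacity at 10G vs. 100G,
# using the half-million-link figure cited above.

links = 500_000       # approximate number of intra-DC links
rate_10g = 10e9       # 10 Gb/s per link
rate_100g = 100e9     # 100 Gb/s per link

agg_10g = links * rate_10g    # aggregate capacity with 10G links
agg_100g = links * rate_100g  # aggregate capacity with 100G links

print(f"Aggregate at 10G:  {agg_10g / 1e15:.0f} Pb/s")
print(f"Aggregate at 100G: {agg_100g / 1e15:.0f} Pb/s")
```

A tenfold jump in per-link rate across the same number of links takes the aggregate from roughly 5 Pb/s to 50 Pb/s – which is why a like-for-like swap to 100G already represents an enormous amount of headroom.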
It should be noted that some people in the industry will assert that a different phenomenon is at work with “Big Data” in a giant DC: the latency involved in gathering information from all of the computers within the DC will supposedly demand greater bandwidth. Given that latency has not historically tended to be a primary driver for jumping to a higher rate (despite hype to the contrary), we are not convinced by the argument.
[written by Mark Lutkowitz]