How Using the Internet Is Bad for the Planet

Server farms are the unsung heroes of the cloud computing revolution. They’re home to remote computers that handle digital tasks so your desktop doesn’t have to, freeing up storage and streamlining data processing for an almost infinite number of applications. However, there’s a catch – they are silently destroying the environment. Unlike cars and nuclear power plants, server farms keep a relatively low profile. Yet servers currently account for 2% of all electricity use in the United States and have a greater carbon footprint than the airline industry. According to the MIT Press Reader, a single data center can consume the equivalent electricity of 50,000 homes.

Server farms are set to keep growing. Innovations like the Internet of Things and artificial intelligence continue to drive demand for remote computing power. Additionally, surfing the web and using online applications like Spotify are popular pastimes that show no signs of slowing down. According to FinancesOnline, worldwide data traffic will increase by 150% within the next three years. With server farms growing to accommodate this surge, data storage infrastructure may be set to triple in the next decade.

What Is a Server Farm?

First created for academic and research purposes, server farms are now used to power web hosting, scientific simulations, and 3D graphics rendering. The rise of Software as a Service (SaaS) programs like Adobe Creative Cloud and Google Docs has dramatically increased the need for server farms, and people all over the world rely on them daily. Vast server farms, also known as “data centers”, are the “secret sauce” that has made the cloud and remote computing possible. Consider the following real-world example:

Nineteen-year-old Jimmy is a college student in New York City. He is working on a psychology paper, which he stores in Google Drive. When Jimmy logs on to work on the paper, his computer requests the file from a server located in a server farm in Virginia. If his computer malfunctions, he doesn’t have to worry, because his files are not stored on the computer itself; they live on the remote server. When he makes updates to his document, the changes are saved on the server for future access from any device.
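
Under the hood, that workflow boils down to a couple of network requests between Jimmy’s laptop and a remote server. The sketch below illustrates the idea in Python using the standard urllib module; the storage URL is invented for the example and does not reflect Google Drive’s actual API.

    import urllib.request

    # Hypothetical cloud-storage endpoint (real services like Google Drive use
    # a similar but authenticated HTTP API; this URL is made up for illustration).
    DOC_URL = "https://storage.example.com/files/psych-paper.docx"

    # 1. Opening the paper: the laptop downloads the latest copy from a server
    #    sitting in a data center hundreds of miles away.
    with urllib.request.urlopen(DOC_URL) as response:
        paper = response.read()

    # 2. Saving an edit: the new version is uploaded back, so the authoritative
    #    copy always lives on the remote server, not on the laptop.
    edited = paper + b"\nA new paragraph for the psychology paper."
    update = urllib.request.Request(DOC_URL, data=edited, method="PUT")
    with urllib.request.urlopen(update):
        pass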

How Server Farms Work

A server farm typically mounts computers, power supplies, routers, and related electronics on 19-inch racks in a data center or server room. Computers in server farms can be connected together for maximum computing power, and keeping them close together makes them easier to maintain. Machine failure is common, so server farms keep “backup servers” on hand to prevent service disruptions. The upside of backup servers is that consumers never have to worry that the website they want to visit will be unavailable. The downside is that inactive servers gobble up energy just to “stand by” while not being used. Cisco reports that its idle servers still draw 40% of their power.
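
To get a feel for what that idle draw means, here is a back-of-the-envelope calculation. The 40% figure comes from the Cisco report above; the 500 W peak draw is an assumption chosen purely for illustration.

    # Rough energy cost of a single idle backup server over a year.
    peak_watts = 500          # assumed full-load draw of one rack server (illustrative)
    idle_fraction = 0.40      # idle servers still use ~40% of their power (Cisco)
    hours_per_year = 24 * 365

    idle_kwh_per_year = peak_watts * idle_fraction * hours_per_year / 1000
    print(f"One idle server: ~{idle_kwh_per_year:,.0f} kWh per year")
    # ~1,750 kWh -- a noticeable slice of a typical household's annual
    # electricity use, spent on a machine doing no useful work.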

The Scope of Server Farms

Though invisible in our daily lives, the pollution caused by server farms is far from a niche issue. With the rise of cloud computing, cryptocurrency, and 5G, the US International Trade Commission reports that there are over 8,000 data centers around the globe. The US leads the pack with 2,670 data centers, while the UK comes in a distant second with 452. Because the internet never sleeps, server farms must run around the clock, every day of the year, to accommodate global data needs. While your local coffee shop may be closed at 4 a.m. on Christmas Day, your email account is always accessible.

Hyperscale Data Centers

A small server farm may only have a few machines. However, these micro server farms are quickly falling out of fashion and giving way to “hyperscale” data centers run by major companies. Tech titans like Microsoft, Amazon, and Google together own over 50% of all hyperscale centers. These behemoths can be the size of football fields and house tens of thousands of racked servers. Their size creates economies of scale that allow for superior cooling and management technology. Microsoft’s data center in Illinois is one of the largest in the world at 700,000 square feet. As of 2021, there were 600 hyperscale data centers in the world, twice as many as there were five years prior.

The Energy Demands of Server Farms

The biggest expense for server farms is not the equipment itself but the energy required to run it. Energy can account for more than 50% of the total cost of operating a data center. Overheating is a major concern that can lead to data loss or shutdowns, so cooling alone is responsible for up to 40% of all energy consumption. Worldwide, servers use as much energy as 30 power plants, more than entire countries such as Argentina, Egypt, and South Africa consume.

Data centers also rely on diesel backup generators, which can emit 150 thousand pounds of carbon dioxide annually. Though cleaner forms of power are available, data centers tend not to use them on the grounds that they are less reliable than traditional sources; instead, North American data centers draw their energy from electricity grids. The biggest offender is Virginia’s “data center alley,” which handled 70 percent of the world’s internet traffic in 2019. Because of backup servers, it’s not unheard of for 90% of the energy pulled from the grid to be wasted. From an energy engineering perspective, the efficiency of a server farm is captured by its PUE (Power Usage Effectiveness), the ratio of the facility’s total energy consumption to the energy that actually reaches its IT equipment.
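
A PUE of 1.0 would mean every kilowatt-hour goes to the servers themselves; anything above that is overhead such as cooling and power conversion. The quick calculation below illustrates the ratio with made-up numbers, not measurements from a real facility.

    # Power Usage Effectiveness = total facility energy / IT equipment energy.
    # Figures are illustrative only.
    total_facility_kwh = 1_500_000   # everything the site pulls from the grid in a month
    it_equipment_kwh = 1_000_000     # what the servers, storage, and network gear use

    pue = total_facility_kwh / it_equipment_kwh
    print(f"PUE = {pue:.2f}")
    # 1.50 -> for every kWh of useful computing, another half kWh goes to
    # cooling, power distribution losses, lighting, and other overhead.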

CO2 and Environmental Impact

In addition to wasting energy, data centers are known for having a negative impact on air quality. Many have appeared on the Toxic Air Contaminant Inventory, which records the worst polluters in the tech-savvy Silicon Valley area. According to ComputerWorld magazine, data centers are on track to account for 3.2% of worldwide carbon emissions by 2025, and 14% by 2040. Energy use by data centers has consistently doubled every four years, giving servers the largest carbon footprint of any modern technology.

Wasting Water

Many data centers lower the temperature of their servers with chilled water systems. The U.S. National Security Agency’s (NSA) Utah Data Center requires seven million gallons of water to operate and has caused water and power outages for the nearby community of Bluffdale. The Uptime Institute has shown that a 1 MW data center using traditional cooling methods consumes around 25 million liters of water per year.

The Impact of Electronic Waste

Electronic waste is another major downside of relying on server farms. Data centers must replace their equipment every five years. And while most people associate carbon emissions with exhaust fumes, a single server produces more carbon dioxide than a car. The equipment a server farm uses is not limited to the servers themselves; it also includes:

  • Cables
  • Computer room air conditioners (CRACs)
  • Computer room air handlers (CRAHs)
  • Batteries
  • Uninterruptible power supplies (UPS)
  • Power distribution units (PDUs)
  • Transformers 

Electronics are abandoned when they fail to meet the standards of reliability and redundancy set by organizations like the Uptime Institute, the standard-bearer for digital infrastructure performance. According to Greenpeace, less than 16% of all electronic waste is recycled. This is especially concerning because the batteries and other hardware used by server farms depend on mined metals and hazardous materials such as:

  • Mercury
  • Gold
  • Lead
  • Silver
  • Palladium
  • Cadmium
  • Brominated Flame Retardants

In addition to wasting valuable resources, getting rid of outdated equipment runs the risk of contaminating the environment. According to the MIT Press Reader, “Some of these components have toxic polychlorinated biphenyls (PCBs) and must be disposed of rather than reused.” 

Solution A: Better Server Farms

Because data center cooling takes a major environmental toll, one solution is to put data centers in naturally cold climates like Canada or Iceland, where low outdoor temperatures can take the place of air conditioners and chilled water systems. Other potential ways to reduce energy usage include:

  • Using recycled or reclaimed water 
  • Evaporative air solutions
  • Independent fresh air cooling
  • Solar power

Major tech companies have recognized the harmful impact that data centers can have on the environment. From 2018 to 2020, Microsoft’s Project Natick ran servers underwater in an attempt to reduce the environmental impact of cooling. Google, Facebook, and Amazon have committed to becoming carbon neutral through carbon offsetting and investment in renewable energy infrastructure. Google plans to make its data centers carbon-free by 2030. On the policy side, the EU has a goal of making all data centers climate neutral by 2030.

Solution B: Grid Computing and ByteNite

Grid computing offers the same benefits as cloud computing on remote servers, but without the overhead of server farms. Rather than export work to thousands of interconnected servers that require cooling and energy management, systems like ByteNite rely on independently operated devices like laptops, tablets, and phones. Because these devices are already powered on for everyday use, the additional energy they consume is minimal. And since they are geographically dispersed, their heat is spread out as well, with no need for extra fans or water cooling systems.
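
For a sense of what this looks like in practice, here is a minimal sketch of the grid computing idea: one job split into independent chunks that are handed out to whichever devices are available. The worker pool below is a stand-in for real consumer devices, and none of the names reflect ByteNite’s actual API.

    from concurrent.futures import ThreadPoolExecutor

    def process_chunk(chunk):
        # Stand-in for real work, e.g. encoding one slice of a video.
        return sum(x * x for x in chunk)

    # Split one large job into independent chunks.
    job = list(range(1_000_000))
    chunk_size = 100_000
    chunks = [job[i:i + chunk_size] for i in range(0, len(job), chunk_size)]

    # Each "device" (here, a worker thread) picks up chunks as it becomes free,
    # so the workload spreads across many small machines instead of one rack
    # of hot servers in a data center.
    with ThreadPoolExecutor(max_workers=4) as devices:
        partial_results = list(devices.map(process_chunk, chunks))

    print("Combined result:", sum(partial_results))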


Rather than increase electronic waste, grid computing services have the potential to lower it. ByteNite lets users put their older devices to work, so they can be reused rather than sent to a landfill. With approaches like this, it’s possible to keep driving technological progress while also protecting the health of the planet.
