TCO Analysis of a Traditional Data Center vs. a Scalable, Containerized Data Center

Posted on April 29, 2012

White Paper 164

Power and cooling systems available now are more modular, more standardized, and more efficient than those installed in the majority of data centers today. Whether upgrading an existing data center or building a new one, data center managers will minimize both capital and operating expenses by specifying physical infrastructure with the following attributes:

  • Standardized, pre-assembled, and integrated components
  • Modular infrastructure that can scale as the load increases over time
  • Efficient power and cooling components
  • Cooling design with integrated economizer mode
  • Pre-programmed controls

White Paper 163, “Containerized Power and Cooling Modules for Data Centers,” describes how standardized, pre-assembled, and integrated modules (sometimes referred to as containers) save deployment time and upfront cost compared to the same electrical and mechanical infrastructure implemented in a “stick built” manner with custom engineering and considerable onsite work.

However, significant additional savings can be achieved. The modular nature of facility modules enables scaling and rightsizing to actual data center loads. This, in combination with current power and cooling distribution technologies, results in a TCO savings of nearly 30% over a traditional data center (27.2% capital cost and 31.6% operating cost).
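As a rough sanity check on how the two percentages blend into the overall figure, the sketch below weights the capital and operating savings by their assumed shares of total cost of ownership. The 50/50 capex/opex split is an illustrative assumption, not a figure from the white paper:

```python
# Savings figures quoted in the paper; the capex/opex split below is assumed.
capex_savings = 0.272
opex_savings = 0.316

def blended_tco_savings(capex_share):
    """Weighted average of capex and opex savings over total cost of ownership."""
    opex_share = 1.0 - capex_share
    return capex_share * capex_savings + opex_share * opex_savings

# With a roughly even capex/opex split, the blended saving lands near 30%.
print(f"{blended_tco_savings(0.5):.1%}")  # 29.4%
```

Any plausible capex/opex weighting between the two inputs yields a blended saving between 27.2% and 31.6%, consistent with the paper's "nearly 30%" headline figure.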

“TCO Analysis of a Traditional Data Center vs. a Scalable, Containerized Data Center” Full White Paper (Download It Here)

Executive Summary:

Standardized, scalable, pre-assembled, and integrated data center facility power and cooling modules provide a “total cost of ownership” (TCO) savings of 30% compared to traditional, built-out data center power and cooling infrastructure. Avoiding overbuilt capacity and scaling the design over time contributes to a significant percentage of the overall savings. This white paper provides a quantitative TCO analysis of the two architectures, and illustrates the key drivers of both the capex and opex savings of the improved architecture.


  • Cost Comparison
  • Assumptions


Traditional designs almost always incorporate excess capacity upfront, because subsequent expansion of power and cooling capacity in a production data center is extremely difficult and costly. This encourages overly conservative capacity planning, which results in higher upfront capital costs and a chronically inefficient data center. Properly deployed facility modules, on the other hand, eliminate this wasteful oversizing tendency, because their standardized, modular architecture makes it much easier to add or reduce capacity to meet real-world, dynamic demand. This, in conjunction with efficient, integrated power and cooling technologies, results in a TCO savings of 30% compared to a typical oversized data center operating today.
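The oversizing effect can be illustrated with a toy model. All numbers here are hypothetical, not from the white paper: a traditional build deploys full capacity on day one, while a modular build adds capacity in module-sized steps as the load grows, stranding far less idle capacity along the way:

```python
# Hypothetical numbers for illustration; the white paper's model differs in detail.
load_kw = [150, 380, 560, 790, 1000]    # assumed IT load ramp over five years
module_kw = 200                          # assumed facility-module increment

traditional = [1000] * len(load_kw)      # full 1 MW built on day one
modular = [-(-l // module_kw) * module_kw for l in load_kw]  # round up to modules

# Idle (stranded) capacity, summed in kW-years across the ramp:
idle_trad = sum(c - l for c, l in zip(traditional, load_kw))
idle_mod = sum(c - l for c, l in zip(modular, load_kw))
print(idle_trad, idle_mod)  # 2120 120
```

Under these assumed numbers, the day-one build carries more than seventeen times the stranded capacity of the modular build, which is the capex and efficiency waste the modular architecture avoids.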

White Paper #164 Written By:

Wendy Torell

Universal Networking Services is proud to partner with Datapod™ to deliver a unique alternative to the traditional bricks-and-mortar data center installation. With Datapod we can provide the data center community an alternative solution that maximizes their investment and increases the reliability and availability of their mission-critical facility. Datapod is a unique, modular data center system that incorporates innovative design and cutting-edge mechanical and electrical engineering. It has extended the concept of containerized data centers to include critical site infrastructure such as modular generators, chillers, and deployment services, thereby providing a complete infrastructure solution for data centers. By enabling data center users to deploy when they like, where they like, and for as long as they like, the Datapod system offers performance superior to that of a “bricks and mortar” data center facility, and deploys faster at a more cost-effective price point.

Please feel free to contact Waite Ave, or click on contact us to learn more.
