What are some common misconceptions about data center infrastructure and how do they impact the business?


The biggest misconception, and pitfall, is managing and provisioning data center infrastructure independently of the applications that run on it. Given the scale, complexity, and optimization demands of today’s computing infrastructure, organizations cannot afford to generously over-provision their facilities; doing so has a direct cost impact on the business.

Furthermore, applications and facilities are becoming ever more intertwined: critical infrastructure works best when it is aware of the workloads running atop it, and applications work best when they are aware of the infrastructure on which they run.

There are numerous examples, including:

  • Using predicted infrastructure failure to pre-emptively move workloads
  • Managing a facility (server utilization, thermal set points, etc.) based on actual workloads, so facilities are not running full tilt while applications sit idle
  • Using “end-to-end” information, from facilities to users, to detect application and security anomalies
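The first scenario above, using predicted failures to pre-emptively move workloads, can be sketched as a simple planning loop. This is a minimal illustration, not any vendor's implementation: the host names, risk scores, and capacity numbers are invented, and the failure-risk values are assumed to come from some predictive model outside the sketch.

```python
# Hedged sketch: plan pre-emptive workload migrations off hosts whose
# predicted failure risk exceeds a threshold. All names and numbers are
# illustrative assumptions, not a real DCIM API.

from dataclasses import dataclass, field

RISK_THRESHOLD = 0.7  # assumed cutoff for "likely to fail soon"

@dataclass
class Host:
    name: str
    failure_risk: float            # 0.0-1.0, from a predictive model (assumed)
    capacity: int                  # free workload slots on this host
    workloads: list = field(default_factory=list)

def plan_migrations(hosts):
    """Return (workload, source, destination) moves off at-risk hosts."""
    at_risk = [h for h in hosts if h.failure_risk >= RISK_THRESHOLD]
    # Prefer the healthiest destinations first.
    healthy = sorted(
        (h for h in hosts if h.failure_risk < RISK_THRESHOLD),
        key=lambda h: h.failure_risk,
    )
    moves = []
    for src in at_risk:
        for workload in list(src.workloads):
            dst = next((h for h in healthy if h.capacity > 0), None)
            if dst is None:
                break  # no spare capacity: flag for operator attention
            moves.append((workload, src.name, dst.name))
            src.workloads.remove(workload)
            dst.capacity -= 1
    return moves

hosts = [
    Host("rack1-a", 0.85, 0, ["billing-api", "cache"]),  # predicted to fail
    Host("rack2-b", 0.10, 2, ["web"]),                   # healthy, has room
]
print(plan_migrations(hosts))
# → [('billing-api', 'rack1-a', 'rack2-b'), ('cache', 'rack1-a', 'rack2-b')]
```

A production system would of course layer on live telemetry, affinity constraints, and migration cost, but the core decision, correlating facility-level predictions with application placement, is exactly the infrastructure/application awareness described above.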

Several business impacts follow from the above: the cost of wasted energy; reduced application availability, with consequences for revenue, SLAs, and customer satisfaction; and reduced agility due to the inability to place workloads intelligently.


Enzo Greco
Chief Strategy Officer at Nlyte Software

Enzo Greco is Chief Strategy Officer for Nlyte Software, the leading data center infrastructure management (DCIM) solution provider, which helps organizations automate and optimize the management of their computing infrastructure. Greco sets Nlyte’s strategy and direction based on market trends and dynamics, partnerships, and adjacent markets. He has deep knowledge of software and the data center market; his current focus is on colocation providers, hybrid cloud implementations, and applying analytics across the critical infrastructure, data center, and application performance management markets.