

What are some common misconceptions about data center infrastructure and how do they impact the business?

By Enzo Greco | Last updated: November 1, 2018

The biggest misconception and pitfall is to manage and provision data center infrastructure and the applications that run on it independently. Given the scale, complexity and required optimization of today's computing infrastructure, organizations cannot afford to generously over-provision their facilities. This has a direct cost impact on the business.

Furthermore, applications and facilities are becoming ever more entwined: Critical infrastructure works best when it is aware of the workloads running atop it, and applications work best when they are aware of the infrastructure upon which they run.

There are numerous examples, including:

  • Using predicted infrastructure failure to pre-emptively move workloads
  • Managing a facility (server utilization, thermal set points, etc.) based on actual workloads, so facilities are not running full tilt while applications sit idle
  • Using “end-to-end” information, from facilities to users, to detect application and security anomalies
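The first bullet, pre-emptively moving workloads off at-risk hardware, can be sketched in a few lines. This is a minimal illustration, not code from any DCIM product: the host names, risk scores and threshold are all hypothetical, and a real system would feed predicted failure probabilities from monitoring telemetry rather than a hard-coded dictionary.

```python
# Hypothetical sketch: migrate workloads off hosts whose predicted
# failure probability exceeds a threshold. All names are illustrative.

RISK_THRESHOLD = 0.7  # migrate when predicted failure risk exceeds this

def plan_migrations(hosts, failure_risk):
    """Return (workload, source, target) moves for at-risk hosts.

    hosts: dict of host name -> list of workload names
    failure_risk: dict of host name -> predicted failure probability (0..1)
    """
    safe_hosts = [h for h in hosts if failure_risk.get(h, 0.0) <= RISK_THRESHOLD]
    moves = []
    for host, workloads in hosts.items():
        if failure_risk.get(host, 0.0) > RISK_THRESHOLD and safe_hosts:
            # Spread displaced workloads round-robin across healthy hosts.
            for i, workload in enumerate(workloads):
                target = safe_hosts[i % len(safe_hosts)]
                moves.append((workload, host, target))
    return moves

hosts = {"rack1-a": ["web-1", "db-1"], "rack1-b": ["web-2"], "rack2-a": []}
risk = {"rack1-a": 0.85, "rack1-b": 0.10, "rack2-a": 0.05}
print(plan_migrations(hosts, risk))
```

In practice the placement step would also weigh capacity, thermal headroom and affinity constraints, which is exactly where the workload-aware facility management described above comes in.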

There are several business impacts related to the above: cost of wasted energy, reduced application availability with revenue/SLA/customer satisfaction impact, and reduced agility due to the lack of flexibility regarding intelligent workload placement.




Written by Enzo Greco | Chief Strategy Officer at Nlyte Software


Enzo Greco is Chief Strategy Officer for Nlyte Software, a leading data center infrastructure management (DCIM) solution provider that helps organizations automate and optimize the management of their computing infrastructure. Greco sets Nlyte's strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the data center market; his current focus is on colocation providers, hybrid cloud implementations and applying analytics across the critical infrastructure, data center and application performance management markets.
