From GIGO to Digital Twin: How DCIM G2 Cleans Up Your Data Center Data
“Garbage in, garbage out.” Everyone who has ever worked in computing or another data-adjacent field has heard this adage at least once. The phrase, often abbreviated GIGO, captures a fundamental principle of both computing and data governance: the quality of your data determines the quality of your results, whether your focus is IT or OT.
What The Uptime Institute Thinks of Data Quality and DCIM As a Digital Twin
In the final installment of his series on digital twin software, Uptime Institute’s Senior Research Analyst for Cloud and Software Automation, John O'Brien, evaluates DCIM software as a potential data center digital twin through the lens of GIGO. He suggests that although current DCIM tools can capture large volumes of data, ensuring the quality of that data remains a challenge for data center operators: “Operators often manage tens of thousands of data points across BMS, environmental, cooling, and IT power systems. This data can generate valuable information, but only if it is accurately captured, measured, stored, and made accessible. Over time, data quality can degrade due to poor configuration, missed system updates, and undocumented changes between IT and facilities teams.” The data quality challenges that limit the effectiveness of legacy DCIM tools in creating a digital twin data center include:
- Data degradation over time caused by configuration challenges, missed updates, and undocumented changes (illustrated by the sketch after this list).
- A lack of common nomenclature across disparate multi-vendor systems, which obstructs integration.
- Persistent data silos that impede a true single-pane-of-glass view and restrict data sharing between IT and OT, further limiting what operators using legacy DCIM tools can access.
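To make the first of these failure modes concrete, here is a minimal sketch of the kind of staleness and plausibility check a data-quality pipeline can run over incoming readings. It is illustrative only: the Reading fields, the point_id format, the plausibility ranges, and the 15-minute threshold are all assumptions, not any DCIM product's API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical telemetry record; the field names are illustrative,
# not any DCIM vendor's API.
@dataclass
class Reading:
    point_id: str        # e.g., "pdu-12/outlet-3/active-power"
    value: float         # reported measurement
    unit: str            # e.g., "kW" or "degC"
    timestamp: datetime  # when the reading was captured

# Assumed plausibility ranges per unit, and a staleness threshold.
PLAUSIBLE_RANGES = {"kW": (0.0, 30.0), "degC": (10.0, 45.0)}
MAX_AGE = timedelta(minutes=15)

def quality_issues(reading: Reading, now: datetime) -> list[str]:
    """Return data-quality flags for a single reading."""
    issues = []
    if now - reading.timestamp > MAX_AGE:
        issues.append("stale")          # point may be misconfigured or offline
    lo, hi = PLAUSIBLE_RANGES.get(reading.unit, (float("-inf"), float("inf")))
    if not lo <= reading.value <= hi:
        issues.append("out-of-range")   # likely a sensor or mapping error
    return issues

# A 250 kW outlet reading captured months ago fails both checks.
old = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(quality_issues(Reading("pdu-12/outlet-3/active-power", 250.0, "kW", old),
                     datetime.now(timezone.utc)))  # ['stale', 'out-of-range']
```

In a real pipeline, flagged points would typically be routed to an audit workflow rather than dropped, so misconfigured or disconnected sensors get fixed instead of silently degrading the twin.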
O’Brien also suggests, “True end-to-end DTs differ in that they are large-scale, often industry platforms involved in the design and development of products, systems and built environments. They are increasingly used in retrofits supporting AI and high-density IT, as well as the associated power, cooling, circuitry and cabling. In this context, DCIM's domain specialization in data center IT-OT, combined with its visualizations using curated asset data, could be valuable in an emerging digital twin ecosystem that requires granular, micro-level data for specific applications and use cases (see Digital twins: reshaping AI infrastructure planning and Digital twins: the role of simulations).”
O'Brien concludes his analysis with the observation that, “DCIM could — and should — provide valuable physical data on IT and OT assets and connected systems, supporting both historical trend analysis and time-delayed operational analysis. At the same time, operators are demanding more real-time data streaming capabilities to model live environments. Achieving this will require suppliers to adopt common standards and terminology to simplify data sharing and reduce the barriers to system interoperability. At the same time, operators should look for ways to modernize their internal policies to break down data siloes wherever practical.”
O’Brien’s assessment of legacy DCIM tools indicates that they are insufficient to create the digital twin data centers modern IT and facilities teams need to serve customers at the speed of business.
If you would like to access the full report and other insights from Uptime, you can request an evaluation of Uptime Intelligence.
How Second-Generation DCIM Surpasses Legacy DCIM to Create Digital Twin Data Centers
Like any good digital twin data center software, second-generation DCIM (DCIM G2) does more than just capture large volumes of real-time data from the intelligent PDUs and other physical assets in your data center. Its other benefits include:
- Driving data quality through automation, data validation, and data governance.
- Creating 2D and 3D data center simulations and dashboards in an open, vendor-agnostic, and integrated environment that reduces the likelihood of data quality degradation and data silos (a normalization sketch follows this list).
- Encouraging IT and OT to align their data on a single system for simplified metrics and scalable, modern, data-driven analysis and decision-making.
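To picture how a vendor-agnostic layer counters the nomenclature problem, here is a minimal sketch that maps vendor-specific field names and units onto one common schema before data is stored. The vendor field names, canonical keys, and conversions below are all hypothetical.

```python
# Illustrative only: a tiny normalization layer that translates
# vendor-specific metric names into one common nomenclature.
VENDOR_ALIASES = {
    "ActivePwr": "active_power_kw",     # hypothetical vendor A (kW)
    "outlet_watts": "active_power_kw",  # hypothetical vendor B (watts)
    "InletTemp_F": "inlet_temp_c",      # hypothetical vendor C (Fahrenheit)
}

# Unit conversions keyed by the vendor field they apply to.
CONVERSIONS = {
    "outlet_watts": lambda w: w / 1000.0,         # W -> kW
    "InletTemp_F": lambda f: (f - 32.0) * 5 / 9,  # F -> C
}

def normalize(vendor_record: dict) -> dict:
    """Translate one vendor payload into the common schema."""
    out = {}
    for key, value in vendor_record.items():
        canonical = VENDOR_ALIASES.get(key)
        if canonical is None:
            continue  # unknown fields are dropped (or could be flagged)
        convert = CONVERSIONS.get(key, lambda v: v)
        out[canonical] = convert(value)
    return out

print(normalize({"outlet_watts": 4200, "InletTemp_F": 77.0}))
# {'active_power_kw': 4.2, 'inlet_temp_c': 25.0}
```

Once every source speaks the same schema, dashboards and simulations can aggregate across vendors without per-vendor special cases, which is what makes a single IT-and-OT view practical.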
DCIM As a Digital Twin in Action
Sunbird recently helped three data center operators create digital twin data centers that reduce their reliance on legacy tools.
A colocation data center and cloud services provider wanted a vendor-agnostic, accurate, and secure way to let each customer view only that customer’s own facilities and assets, down to rack-level granularity. With the real-time data collected from PDUs, second-generation DCIM could easily create dashboards and simulations that gave customers the visibility they needed to manage their assets.
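One way to picture this tenant-scoped visibility is a simple ownership filter over the asset inventory. The sketch below assumes a toy data model; it is not Sunbird's schema or API.

```python
# A minimal sketch of tenant-scoped visibility: each asset carries an
# owner, and a customer's view is filtered to their own cabinets and
# devices. The records and customer names are hypothetical.
ASSETS = [
    {"id": "cab-101", "type": "cabinet",  "site": "NY1", "customer": "acme"},
    {"id": "pdu-7",   "type": "rack-pdu", "site": "NY1", "customer": "acme"},
    {"id": "cab-204", "type": "cabinet",  "site": "NY1", "customer": "globex"},
]

def customer_view(assets: list[dict], customer: str) -> list[dict]:
    """Return only the assets a given customer is allowed to see."""
    return [a for a in assets if a["customer"] == customer]

for asset in customer_view(ASSETS, "acme"):
    print(asset["id"], asset["type"])  # cab-101 cabinet / pdu-7 rack-pdu
```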
Another large data center operator was struggling to manage multiple sites with multiple spreadsheets per site. DCIM enabled them to leave their spreadsheets behind and build digital twin data centers with 3D visualizations that let them see where each asset was inside each cabinet at every site.
A third customer, a large pharmaceutical company, managed thousands of cabinets across more than 50 sites. DCIM not only enabled this company to improve data quality through integration, but also to leverage that high-quality data to create dashboards and 2D and 3D data center simulations for faster, smarter planning and analysis. It also helped the company’s data center managers align on a single source of IT and OT truth.
As more companies look to modernize their approach to IT and OT to address AI infrastructure challenges with digital twin data centers, it’s easy to see why so many facilities teams are turning to second-generation DCIM to create their digital twins. Want to see how Sunbird DCIM enables a digital twin of your data center? Get your free test drive now.