Opinion: 5 data center ideas that would never have been imagined 50 years ago

28 Jun 18

Article by Schneider Electric global marketing vice president Abby Gabriel

If you could go back in time, how would you possibly explain modern data center concepts to a late-1960s “data processing programmer”? In those days it was not uncommon to see a paper sign posted in the data center that read “The computer is down today”. Computer systems were big and finicky, and not many people knew how to run or fix them. In fact, the building that housed THE computer was not called a data center at all; “processing center” was the more familiar nomenclature.

Most of these processing centers were owned, operated and maintained by big banks. Trucks loaded with paper would arrive in the evening, and the data would be “crunched” overnight in the processing center. Printouts would be created and sent back to bank branches the next day. State of the art. Right?

Given the realities of that bygone era, how would our late-1960s processing operator react to the following list of modern data center “unimaginables”?

1. “You don’t need the utility to keep your data center running.”

Data centers used to rely on utility power alone; if utility power failed, you were out of luck. Now, in addition to elaborate power backup plans (supported by UPS systems and generators), power devices within racks are modular and hot swappable. If one fails, the remaining power modules take on the added load, and the failed module can be replaced by simply sliding it out and inserting a new one. All without interruption, and invisible to the end user who may be sitting thousands of miles away.

2. “You need to cool your data center? Use outside air.”

Computer equipment was highly sensitive in the old days, and internal environmental conditions had to be precisely controlled. Today, eco-mode cooling techniques that deploy a variety of economizer technologies allow data center owners to save significant money by harnessing outside air to cool their facilities, without risking downtime.

3. “If you’re short of qualified staff, just outsource.”

The concept of outsourcing was virtually unknown in the late 1960s; trusting your computer operations to an outside organization was unthinkable. Today, even the most specialized aspects of data center operations can be outsourced to any number of highly qualified experts.

4. “If you’re short on space, just order a data center in a box.”

Today, the popularity of pre-fab data centers is on the rise. The power, cooling, and racks required are all pre-configured and preassembled for rapid delivery, enabling immediate “plug and play” upgrades or the quick commissioning of “edge” data centers that support bandwidth-intensive remote applications.

5. “To avoid downtime due to component failure, practice predictive maintenance.”

As the “Internet of Things” (IoT) revolution accelerates into full swing, it is now possible to gather far more precise data on data center and facility equipment performance, and to analyze that data to predict failures before they occur. Such practices can save millions of dollars each year compared with the break/fix and preventative approaches many businesses still rely on to maintain their data centers.

Nostalgia is nice, but in this day and age, technology advancements are too good to ignore.

Data centers have come a long way, especially in the area of integrated data center architectures.

Schneider Electric’s EcoStruxure IT architecture, for example, can be delivered to end users through reference designs, pre-configured solutions, and prefabricated solutions. It can be configured as an entire data center or it can start out as an infrastructure product, like an Uninterruptible Power Supply (UPS) that is managed through the cloud and supported with a 24/7 service bureau.

It can be deployed all at once or built in stages. EcoStruxure IT consists of three layers (connected products, edge control, and analytics) that are integrated to facilitate IoT connectivity and mobility, cloud analytics, and cybersecurity.

What is the benefit of embracing such an open data center architecture? Your next big idea can be delivered over a compressed time period, helping your data center produce business value and drive your organization’s competitive advantage.
