Data centers are complicated, fluid, and sometimes risky places. Risky in the sense that while they are relentlessly asked to be faster, more agile, and more reliable, they are also burdened with an ever-increasing array of equipment, applications, processes, and requirements. In the IT world, complexity breeds points of possible failure, and increased risk is the result.

Recent McKinsey & Company research underscores this challenge in its “Enterprise IT Infrastructure Agenda for 2014” paper. As more business value migrates online and business processes become more digitized, IT infrastructure inevitably becomes a bigger source of business risk. “Even after years of consolidation and standardization, which have led to huge improvements in efficiency and reliability, most infrastructure leaders work in environments that they believe are too inflexible, provide too few capabilities to business partners, and require too much manual effort to support,” says McKinsey principal Bjorn Munstermann in the report. “Addressing these problems to create more scalable and flexible next-generation infrastructure will require sustained actions in multiple dimensions.”

Data centers, McKinsey & Company says, are a key focus of these risky, inflexible processes. In the drive to reduce risk and increase dependability, most data centers employ workflow automation solutions. Unfortunately, many IT organizations have an automation strategy that is fragile, unable to scale, and not built for change. To truly reduce risk through improved IT automation, managers must shift their thinking from an elemental “one-off” approach to something much more robust: the architectural approach.

THE ELEMENTAL APPROACH TO IT AUTOMATION

In most data centers, when a task or process needs to be automated, the traditional response is to bolt a fix onto the existing system. For example, a database administrator might need to automate backups, or an integration architect might want to automate ETL processes on a specific system. A script is written or a job is scheduled for the task, usually on a platform-specific tool such as Windows Task Scheduler or SQL Server Agent.
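To make the pattern concrete, the sketch below shows what such a one-off job often looks like: a small standalone script that lives on a single host and is wired to that host's own scheduler. The database name, backup path, and the use of pg_dump are illustrative assumptions, not drawn from any particular environment or product.

```python
# A hypothetical one-off backup script of the kind described above: it knows
# nothing about other jobs, runs on a single host, and is typically registered
# with Windows Task Scheduler or cron on that same machine. All names and
# paths here are illustrative assumptions.
import datetime
import pathlib
import subprocess

DB_NAME = "orders_db"                      # hypothetical database
BACKUP_DIR = pathlib.Path("/var/backups")  # hypothetical local path


def run_backup() -> None:
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_DIR / f"{DB_NAME}_{stamp}.dump"
    # Shell out to a backup utility (pg_dump, purely for illustration).
    # If this host is down or the command fails, nothing else in the
    # environment ever learns that the backup was missed.
    subprocess.run(["pg_dump", "--file", str(target), DB_NAME], check=True)


if __name__ == "__main__":
    run_backup()
```

The important point is not the script itself but where it lives: its schedule, error handling, and visibility all stop at the edge of the one server it runs on, which is exactly the failure mode described later in this article.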

This “elemental approach,” involving some combination of coding, a new application, hardware, or simply one more addition to the processing schedule, is understandable. And it’s outdated. While point solutions for point problems are expedient in the short term, over time they create significant cost and complexity. Within a few years the IT organization’s landscape is a mix of patched-together automation solutions that are unwieldy, outmoded, and insufficient for the current environment.

When an IT organization implements an automation strategy without considering the cross-departmental automation requirements at play, it builds silos of automation that block the integration of business and IT operational processes, processes that often depend on one another. Moreover, the “point” scheduling solutions that proliferate and clutter system after system are temporary fixes that become outdated or insufficient within a few years. They increase IT complexity, drive up costs, and consume additional resources.

Such patchworks inevitably lead to failures that directly affect the business, and the IT organization’s reputation suffers as a result. For example, if the server that a script is stored on is down for maintenance, the script doesn’t run, and IT discovers the problem only when a business unit reports that it hasn’t received a file on time. Headaches like these represent “the automation tipping point,” the moment when an IT organization realizes it must consolidate multiple automation solutions to improve monitoring and governance, and to pass data and manage dependencies between systems and applications more effectively.

At some point, scripts and platform-specific scheduling tools amount to nothing more than a fragmented mix of automation solutions implemented independently of one another. They require constant revision and resynchronization in the face of change. In an era when data centers are more diverse than ever, this fragmentation puts the business at risk by blocking agility.

Furthermore, an elemental automation strategy is not designed for change. That philosophy is reflected in the very nature of scripting. Scripts are like concrete: solid at first, but prone to cracking and becoming brittle over time. Writing and updating scripts consumes time and resources, and when it comes time for an update, the developer who wrote the script has often moved on.

THE ARCHITECTURAL APPROACH TO DATA CENTER AUTOMATION

The more modern alternative to the elemental approach is the “architectural approach.” This notion, which crosses the boundaries of physical assets, technologies, geographies, and work methods, puts processes and systems first. Nimbly responding to business requirements, policies, SLAs, and standards, the architectural approach goes beyond point solutions by placing individual tasks in the context of the broader computing landscape.

An intelligent workload automation solution, geared to the architectural approach, lays the foundation for a policy-driven automation strategy that delivers governance, visibility, and control, allowing IT to respond more quickly to the demands of the business when something does break, rather than hunting for a script running on some far-flung server.

The architectural approach to workload automation also provides an enterprise-grade automation strategy. It allows the data center to accommodate change easily and, in doing so, to use agility itself as a risk mitigator. Managers can now embrace complexity and change, two terms that have never been particularly popular with CIOs.

AN AUTOMATION SOLUTION DESIGNED FOR CHANGE

Modern IT automation solutions must be designed for change, going beyond the role of a “script wrapper” to provide scalability and extensibility. Many of today’s script-driven legacy job schedulers require a developer to hard-code common IT and business workflows, as well as associated job properties, constraints, triggers, flow control, and workflow logic. This is not a formula for flexibility.

According to recent research by Gartner and Forrester Research, the modern workload automation solution should take a different approach by providing a library of production-ready job steps for common IT functions, applications, data sources, and platforms. Ideally, the library should be combined with a dynamic workflow designer built on a drag-and-drop GUI and an automation architecture, allowing the end user to focus on driving innovation through IT automation rather than writing and QA’ing lines of code. In this way, the burden of building and maintaining scripts is taken off the shoulders of the end user and placed squarely with the vendor. The functionality of intelligent workload automation should also extend to Web services, APIs, and other connectors that let the solution easily “plug in” to the custom and third-party applications and data sources that make up the batch processes an organization is looking to automate.
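One way to picture that shift, sketched below under stated assumptions, is a workflow composed from reusable, prebuilt job steps with dependencies handled by the engine rather than by hand-written glue code. The Step and Workflow classes and the step names are hypothetical illustrations, not any vendor’s actual API.

```python
# A minimal, vendor-neutral sketch: job steps supplied as a library and
# composed declaratively into a workflow. The engine, not the end user,
# resolves dependencies and ordering. All class and step names are
# hypothetical illustrations.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Step:
    name: str
    action: Callable[[dict], dict]           # stands in for a prebuilt, tested job step
    depends_on: List[str] = field(default_factory=list)


@dataclass
class Workflow:
    steps: Dict[str, Step] = field(default_factory=dict)

    def add(self, step: Step) -> "Workflow":
        self.steps[step.name] = step
        return self

    def run(self) -> None:
        # Run steps in dependency order; a real engine would add retries,
        # alerting, and cross-system triggers.
        done: Dict[str, dict] = {}
        pending = dict(self.steps)
        while pending:
            ready = [s for s in pending.values()
                     if all(d in done for d in s.depends_on)]
            if not ready:
                raise RuntimeError("circular or unmet dependency")
            for step in ready:
                upstream = {d: done[d] for d in step.depends_on}
                done[step.name] = step.action(upstream)
                del pending[step.name]


# Each lambda stands in for a production-ready step (file transfer, SQL load,
# web-service call, notification) that the vendor maintains.
flow = (Workflow()
        .add(Step("fetch_file", lambda up: {"rows": 100}))
        .add(Step("load_table",
                  lambda up: {"loaded": up["fetch_file"]["rows"]},
                  depends_on=["fetch_file"]))
        .add(Step("notify",
                  lambda up: print("loaded", up["load_table"]["loaded"]),
                  depends_on=["load_table"])))
flow.run()
```

The design point is that the workflow is data the engine interprets, not code the end user must rewrite whenever a constraint, trigger, or dependency changes.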

A perfect real-world example of how this works is the concept of “agile BI.” The democratization of BI and reporting solutions throughout the modern enterprise is placing increasing pressure on IT organizations to build or continually update the underlying data integration processes that feed these reporting solutions. Financial service companies, for instance, constantly seek to automate overnight batch processes that pull data from third-party data sources (such as Reuters or Bloomberg), process and format such data, and update their trading desk systems/applications by the start of the next business day.
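The sketch below outlines the shape of such an overnight batch in plain Python. The feed URLs, column handling, and the downstream load step are placeholder assumptions; a real implementation would use the data vendors’ authenticated APIs and would rely on the workload automation tool for scheduling, retries, and alerting rather than a single script.

```python
# A hypothetical outline of the overnight batch described above: pull feeds,
# normalize them, and hand them to the trading desk system before the start
# of the business day. URLs and the load step are placeholders only.
import csv
import io
import urllib.request

SOURCES = {
    # Placeholder endpoints; real market-data feeds (e.g., Reuters or
    # Bloomberg) require authenticated, vendor-specific clients.
    "eod_prices": "https://example.com/feeds/eod_prices.csv",
    "fx_rates": "https://example.com/feeds/fx_rates.csv",
}


def pull(url: str) -> list:
    """Download one feed and parse it into rows."""
    with urllib.request.urlopen(url, timeout=60) as resp:
        text = resp.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))


def transform(rows: list) -> list:
    """Trim whitespace and drop empty rows; a real feed needs richer mapping."""
    return [{k.strip(): (v or "").strip() for k, v in row.items()}
            for row in rows if any(row.values())]


def load(name: str, rows: list) -> None:
    """Hand the normalized rows to the downstream system (stubbed here)."""
    print(f"{name}: loaded {len(rows)} rows into the trading desk application")


def nightly_run() -> None:
    for name, url in SOURCES.items():
        load(name, transform(pull(url)))


if __name__ == "__main__":
    nightly_run()
```

When a new feed or report is requested, the change amounts to adding a source and a mapping, which is precisely the kind of update that must be fast for “agile BI” to live up to its name.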

As new data sources are identified, or as traders and analysts require access to new data sets and reports, the ability of the IT organization to quickly update underlying data integration processes becomes more than just a task that requires completion — it becomes a competitive advantage for the firm. The ability to quickly manage these individual processing tasks in the context of the broader computing landscape saves time, reduces cost, and removes complexity. Importantly, it also allows IT to be more proficient at reporting on the value of services it provides.

With complexity continually adding risk to data center performance, the architectural mindset toward workload automation is the “Occam’s Razor response,” validating the notion that the simplest alternative is often the best. The architectural approach allows enterprises to remain nimble, cost-effective, and reliable in the face of change. It reduces manual labor and helps maintain a broader perspective on the issues affecting data center operations. As data centers are called upon to bring new products and services to market quickly and dependably, the architectural approach to workload automation is more than a back-office tool; it is a key component of business success.