The Symbiosis Between Automation and Engineering

A little while ago Steve Beaver wrote a post titled “Is Automation Killing The Engineering?” In the post, Steve ponders whether the increased use of automation in today’s data centers is killing engineering knowledge. The argument, as I understand it, says that because tasks are becoming increasingly automated, data center professionals are increasingly less knowledgeable about how things actually work. But is this a valid argument? Is automation killing engineering?

I think there are at least two aspects to this idea that are worth exploring:

  1. On one hand, increased attention to and use of automation is quite likely enabling some IT professionals to do things they weren’t able to do manually. For example, I might not know very much about GlusterFS, but using a Puppet module for GlusterFS I could get it installed and configured without actually having to learn how it is done. (I think this is the trend that Steve is picking up on in his post.)

  2. On the other hand, an increased focus on automation and configuration management is enabling IT professionals to more quickly accomplish things that otherwise would have taken more time and effort. Using the same example from before, while I might know exactly how to configure GlusterFS and get it up and running manually, using a Puppet module to do so makes the process quicker, more scalable, and the results more consistent. (A rough sketch of what such a declaration might look like follows this list.)
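
To make the GlusterFS example slightly more concrete, here’s a rough sketch of the kind of desired state a Puppet manifest declares. This is not the interface of any particular GlusterFS module on the Puppet Forge; it uses only Puppet’s core package and service resource types, and the package and service names are assumptions that vary by distribution.

    # Minimal, hypothetical sketch using core Puppet resource types only;
    # package/service names (glusterfs-server, glusterd) vary by distro.
    package { 'glusterfs-server':
      ensure => installed,
    }

    service { 'glusterd':
      ensure  => running,
      enable  => true,
      require => Package['glusterfs-server'],
    }

The value isn’t in these few lines themselves; it’s that the same declaration can be applied to one node or to a hundred, and each node converges to the same state.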

It’s up to us, as an industry and a profession, to ensure that we find—and maintain—the right balance between depth of engineering knowledge and extent of automation in our data centers. The meme says “Automate all the things!”, but we have to ask ourselves what actually makes sense to automate. Further, we have to ensure that we are holding ourselves accountable for knowing how the automation works. See, the pendulum can swing too far in either direction. We can carefully engineer and hand-craft our solutions (snowflake servers, anyone?), but that’s not the most scalable approach. We can also blindly use automation tools without understanding how things work—in which case the solution might be up and running, but is it running well? Is it optimized? Is it actually meeting the requirements it needs to meet? We don’t know, because we’ve allowed the pendulum to swing too far in the opposite direction.

I believe there is a careful symbiosis between engineering and automation that we must maintain. Yes, we should use automation—it’s a force multiplier that brings scalability and consistency. However, in order to ensure that these consistent configurations are the correct configurations, we need the right depth of engineering knowledge. We must understand both how the automation tools work and what those tools are actually doing.

In other words, we have to live by this phrase I posted to Twitter some time ago (the individual tweet is long gone in the “stream of consciousness” that is Twitter, so I can’t include a link):

“You can’t automate something if you don’t understand it.”

So, my advice to you: Start with automating what you know. Codify your existing engineering knowledge. Then, use the time freed up by those efforts to expand your knowledge, allowing you to automate new things once you understand them. This, I think, allows us to strike the correct balance between automation and engineering.
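
As a small, hedged illustration of “codify what you know”: suppose you already understand exactly how NTP should be configured in your environment. Capturing that knowledge in a manifest along these lines (the file path, package, and service names are assumptions and vary by platform) turns knowledge you already have into something repeatable, and the time it saves can go toward learning the next thing.

    # Hypothetical example of codifying existing knowledge: you already know
    # ntpd must run and read /etc/ntp.conf, so declare exactly that.
    package { 'ntp':
      ensure => installed,
    }

    file { '/etc/ntp.conf':
      ensure  => file,
      source  => 'puppet:///modules/ntp/ntp.conf',
      require => Package['ntp'],
      notify  => Service['ntpd'],   # restart ntpd when the config changes
    }

    service { 'ntpd':
      ensure => running,
      enable => true,
    }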

Disagree with what I’ve said here? That’s OK—share your views, perspectives, or thoughts in the comments below. All courteous comments are welcome!


  1. Andy Hill

    John Allspaw has written quite a bit on this topic here: http://www.kitchensoap.com/2012/09/21/a-mature-role-for-automation-part-i/

    To summarize: he is in favor of “designing and implementing automation while keeping an eye on both its limitations and benefits.”

  2. slowe

    Yes, I’ve read some of John Allspaw’s writing, including that specific post. Very good stuff!

  3. Daniel Lord

    As an electrical engineer and software developer who has on occasion been a systems engineer and IT administrator, I can draw a parallel between the advantages and caveats of IT automation and a similar dynamic in software development. GUI programming environments that piece together pre-programmed modules have been around for some time and are often cited as great tools for developing your own software without having to learn powerful programming languages or the nuances of system operation and access. Apple OS X even has Automator for combining OS X applications with AppleScript glue.

    The one common characteristic of all such tools is that they are blunt instruments: they trade away the flexibility and fine-grained control of complex languages, tools, and system knowledge for the simplicity of pre-programmed, relatively inflexible modules. Simple chores become easy for neophytes, but complex creative tasks are impossible, and the lack of underlying knowledge of the systems and what goes on underneath can quickly lead a neophyte into deep water. So it is with IT automation: it holds great promise for leveraging labor in the command and control of vast virtual environments and warehouse-sized arrays of blade servers, but there will be a temptation to hire less experienced (i.e., less expensive) IT labor to staff their use. The phrase “a little knowledge can be dangerous” comes to mind.

    As with automation in most things: care must be taken with such leverage, and it is important to understand the system underneath to avoid the dangers of unintended consequences.

  4. Andy Konecny

    Packet analysis queen Laura Chappell often states that “auto configuration is evil” when referring to some poorly built automated configurations (20 years ago it was frame types; now it is UPnP and similar security holes).
    Automation automates the bad as well as the good. It certainly wouldn’t be a good thing to autodeploy many of those snowflake servers.
    Also, automation doesn’t scale down very well, such as when you are deploying systems in smaller organizations where you might build only one or two a year.
    Everything has its place, automation is no exception.

  5. Jeff Ely

    You hit upon a key point that I have mentioned to people who are caught up in the automate-everything craze: “You can’t automate something if you don’t understand it.” I think this is a key issue that gets lost in the craze. I like to think of automation as part of a bigger workflow orchestration.

  6. Kurt Bales (@networkjanitor)

    I agree with everything in this post (surprise, surprise), but I think the key with automation is that it gives you the ability to “discover correct” more quickly.

    Yes, you can automate badly (and we see plenty of examples of this), but when working with highly distributed systems, automation with tools such as Puppet or Chef allows us to iterate quickly over each config tweak.

    “OK, now that I have it ‘somewhat working,’ let’s tweak it till it’s better.” Now multiply that by every machine, and most people will give up.

    Use automation to iterate quickly to awesome!

    In a blog post I wrote recently (http://www.network-janitor.net/2013/06/increase-the-awesome/), I paraphrased it like this:
    1. Remove the mundane
    2. Increase the awesome.

    This is what we need to be using automation for. Free up our time to deliver better services to our end users.

    Just my $0.02,

    Kurt