
Master the seven key DevOps engineer skills for 2018

We all know DevOps is hard, but complex new technologies are going to make the development methodology even more challenging. Expert Bob Reselman outlines the skills you'll need.

This year will be an exciting one in DevOps. Cloud-based technologies will continue to grow in 2018, as will the use of AI in day-to-day operations. We're going to see a renewed focus on the role of hardware in both cloud and on-premises installations. Also, quantum computing will become a regular part of commercial computing.

All of these trends will require developers and operations professionals to acquire new DevOps engineer skills to adapt to this evolving landscape. Below, you'll find 2018's technological trends and the skills DevOps pros will have to develop to be viable in the coming year.

Serverless computing is here, so get used to it

The big three web service providers -- Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform -- now provide serverless computing environments: AWS has Lambda, Azure has Functions and Google has Cloud Functions. These technologies were significant investments; they are not going away. In fact, the big three are promoting serverless computing as a first-order way to develop for the web, particularly around the internet of things (IoT).

So, moving into 2018, key DevOps engineer skills will include understanding the basic concepts of serverless computing in terms of architecture, version control, deployment and testing. There are still outstanding problems to be solved, particularly around real-world unit testing of serverless functions in a continuous integration and continuous delivery (CI/CD) pipeline.
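To make this concrete, below is a minimal sketch of a serverless function and a unit test that exercises it locally, with no cloud dependency, as one step in a CI/CD pipeline. The handler follows AWS Lambda's Python signature, but the event shape, file names and greeting logic are illustrative assumptions, not a prescribed API.

```python
# handler.py -- a minimal function using AWS Lambda's Python signature.
# The event shape is an illustrative assumption.
import json

def handler(event, context):
    """Return a greeting for the 'name' field in the incoming event."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello, " + name}),
    }

# test_handler.py -- exercises the function locally; because the handler
# is a plain function, a CI job can run this without deploying anything.
import unittest

class HandlerTest(unittest.TestCase):
    def test_greets_by_name(self):
        result = handler({"name": "DevOps"}, None)
        self.assertEqual(result["statusCode"], 200)
        self.assertEqual(json.loads(result["body"])["message"], "Hello, DevOps")

if __name__ == "__main__":
    unittest.main()
```

The hard part this sketch sidesteps is everything the function touches in production -- event sources, permissions, downstream services -- which is exactly where the outstanding CI/CD testing problems live.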


Get with IoT, or IoT will get you

IoT is on pace to eat the internet. Back in 2014, Business Insider predicted IoT would become the internet's predominant technology.
 
This year, we'll see even more activity. IoT will have a growing impact in two significant areas: processing and security. In terms of processing, IoT devices will emit enormous volumes of data, all of which needs to be processed, and the increased demand will put a burden on infrastructure. Understanding how to accommodate the increase in volume due to IoT devices is going to be an important DevOps engineer skill in 2018.
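As a small illustration of absorbing that volume, here is a hedged sketch of a consumer that buffers incoming telemetry and writes it downstream in batches instead of one reading at a time. The queue source, batch size and flush interval are illustrative assumptions; in production the readings would arrive from a broker such as MQTT or a cloud ingestion stream.

```python
# A minimal batching consumer for high-volume telemetry (sketch).
import queue
import time

BATCH_SIZE = 500       # flush after this many readings (assumption)
FLUSH_INTERVAL = 2.0   # ...or after this many seconds (assumption)

def write_batch(batch):
    """Stand-in for a bulk write to storage or a stream processor."""
    print("persisted %d readings" % len(batch))

def consume(telemetry, run_seconds=3.0):
    batch = []
    last_flush = time.monotonic()
    deadline = time.monotonic() + run_seconds
    while time.monotonic() < deadline:
        try:
            batch.append(telemetry.get(timeout=0.1))
        except queue.Empty:
            pass
        stale = batch and time.monotonic() - last_flush >= FLUSH_INTERVAL
        if len(batch) >= BATCH_SIZE or stale:
            write_batch(batch)
            batch = []
            last_flush = time.monotonic()
    if batch:
        write_batch(batch)  # final flush

if __name__ == "__main__":
    q = queue.Queue()
    for i in range(1200):  # simulate a burst of device readings
        q.put({"device": i % 10, "temp": 20.0})
    consume(q)
```

Batching like this trades a little latency for far fewer downstream writes, which is often the difference between infrastructure that absorbs IoT volume and infrastructure that falls over.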

In terms of security, new practices still need to be adopted. One type of consumer hazard is home invasion, in which a nefarious agent takes over household appliances. Imagine an attacker turning off the heating system in a house during a Boston winter. After a few hours, all the pipes in the house burst. The damage would be significant. In the commercial enterprise, things can get much worse -- think nuclear reactor.

Given the risks at hand, DevOps personnel need to get a firm understanding of the intricacies of IoT. The technologies go beyond the familiar practices of running a standard data center. The risks are real, and the consequences of not being well-informed are significant.

[Figure: IoT smart home -- Security in your smart home, and all your IoT devices, will become even more essential as code-crunching quantum computers become more readily available.]

Get ready for the resurrection of hardware

The days of using any old type of hardware to run a cloud-based VM are coming to a close, particularly as more applications are used in life-and-death situations -- driverless vehicles, for example. The most telling change is the rise of the GPU as the processor of choice for AI and machine learning computation. Hardware is indeed making a comeback.

Cloud providers are listening. Amazon allows you to attach GPUs to cloud instances, as do Azure and Google Compute Engine. Along with this GPU rise, you are also going to see companies going back to "down to the metal" installations. There are providers out there, such as Packet.net, BareMetalCloud and Storm, that offer hourly rates on actual hardware.
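For a taste of what that looks like in practice, here is a hedged sketch of launching a GPU-backed instance on AWS with boto3. The AMI ID is a placeholder, and the instance type is just one of the GPU families Amazon offers; check current offerings and pricing before running anything like this.

```python
# Launch a GPU-backed EC2 instance with boto3 (sketch).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder; use a real AMI for your region
    InstanceType="p3.2xlarge",        # one of AWS's GPU instance types
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```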

As specialized big data processing becomes more a part of the everyday computing workload, alternatives to multicore commodity hardware will become essential. This hardware resurrection will have a definite impact on DevOps engineer skills and practices. DevOps personnel will need to know the basics of chip architecture -- for example, how is a GPU different from a CPU? We're going to have to refresh our understanding of network hardware and architecture.
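One way to internalize the CPU-versus-GPU difference: a CPU core is built to race through varied operations one at a time, while a GPU applies the same simple operation across thousands of data elements simultaneously. The sketch below uses NumPy vectorization on the CPU as a stand-in for that data-parallel style -- an analogy for the shape of the computation, not actual GPU code.

```python
# Contrast scalar, one-at-a-time work with data-parallel,
# same-operation-on-everything work. NumPy vectorization is used
# here as a CPU-side analogy for the GPU's execution model.
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar loop: one multiply per step, like a single core grinding through work.
start = time.perf_counter()
out_loop = [a[i] * b[i] for i in range(n)]
loop_time = time.perf_counter() - start

# Vectorized: one operation expressed over all elements at once --
# the pattern GPUs are built to execute across thousands of cores.
start = time.perf_counter()
out_vec = a * b
vec_time = time.perf_counter() - start

print("scalar loop: %.3fs, vectorized: %.4fs" % (loop_time, vec_time))
```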

Put learning about RPA firmly on your roadmap

Robotic process automation (RPA) is the practice of applying robotic technology to do physical work within a given workflow. In other words, RPA is about teaching robots to do work with, or instead of, humans.

Over the last few years, RPA has become a standard discipline on the factory floor, and it's getting more prominent in general IT. Short of a Luddite revolution, RPA is not going away. A quote in the Institute for Robotic Process Automation primer is quite telling:  "Though it is expected that automation software will replace up to 140 million full-time employees worldwide by the year 2025, many high-quality jobs will be created for those who are able to maintain and improve RPA software."

As hard as it is to imagine today, teaching robots is going to become an essential DevOps skill. It makes sense in a way. We've been automating since Day One. Applying robotic technology to physical work in physical locations such as a data center is a natural extension of DevOps activity.

Prepare for the impact of quantum computing on your security infrastructure

Quantum computing is no longer a science-fiction fantasy. It's here. IBM has a quantum computer available for public use via the cloud. D-Wave Systems is selling quantum computers commercially. They go for around $10 million each. Google and Lockheed Martin are already customers.


The key benefit of quantum computing is speed. There are still problems out there that take classical computers -- computers that use standard binary processors -- billions of years to solve. Decoding encrypted data is one such problem. Such complex code breaking can be done by a quantum computer in a few hundred seconds.
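To put rough numbers behind that claim, here is a back-of-the-envelope calculation. It assumes a brute-force search of a 128-bit keyspace at a billion guesses per second, and applies the quadratic speedup of Grover's algorithm for the quantum case; both rates are illustrative assumptions, and attacks such as Shor's algorithm against public-key encryption are stronger still.

```python
# Back-of-the-envelope: brute-forcing a 128-bit key.
GUESSES_PER_SECOND = 1e9             # assumption: a billion guesses per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

classical_tries = 2 ** 128           # exhaustive search of the keyspace
quantum_tries = 2 ** 64              # Grover's algorithm: quadratic speedup

classical_years = classical_tries / GUESSES_PER_SECOND / SECONDS_PER_YEAR
quantum_years = quantum_tries / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print("classical: about %.1e years" % classical_years)  # on the order of 10^22 years
print("quantum:   about %.0f years" % quantum_years)    # centuries, not billions of years
```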

The impact of quantum computing on security practices is going to be profound. At the least, quantum computing is going to allow any text-based password to be deciphered in seconds. Also, secure access techniques, such as fingerprint and retinal scans, will be subject to hacking. Quantum computing will allow malicious actors to perform highly developed digital impersonation in cyberspace.

To paraphrase Alan Turing in The Imitation Game, "The only way to beat a machine is with another machine."

It'll be the same with quantum computing. DevOps security pros -- whose primary concern is providing a state-of-the-art security infrastructure -- will do well to start learning how to use quantum computing. Quantum computing will provide the defensive techniques required to ensure the safety of the digital enterprise as we move into the era of infinite computing.

Get good at extreme edge cases

In the old days, we needed a human to look over reports from automated system agents and figure out how to address anomalies. Now, with the growth of AI and machine learning, technology can identify more anomalies on its own. The more anomalies AI experiences, the smarter it gets. Thus, the number of anomalies that require human attention -- the true edge cases -- is going to diminish. AI will have it covered.

But the cases that do warrant human attention are going to be harder to resolve. The people needed to address those edge cases will have to be very smart and very specialized, to the point that only a few people on the planet will have the qualifications necessary to do the work.

In short, AI is going to continue to grow. But there will be situations in which human intelligence is required to address issues AI can't. Resolving these edge cases is going to require a very deep understanding of a very precise knowledge set, coupled with highly developed analytical skills. If part of your job is troubleshooting, start developing expertise in a well-defined specialty to a level of understanding that only a few will have. For now, keep your day job. But understand that super-specialization and extreme analysis are going to be a DevOps skill trend in the future.

Relearn the 12 Principles of Agile Software

The Agile Manifesto, released in 2001, describes a way of making software that's focused on getting useful, working code into the hands of users as fast as possible. Since the Manifesto's release, the market has filled with tools that support the philosophy. There have been arguments at the process level, and there are a number of permutations of Agile among project managers. Still, the 12 principles listed in the Manifesto are as relevant today as when they first appeared.

Sometimes, we in DevOps get so bogged down in the details of our work that we lose sight of the essential thinking that gave rise to our vocation. Reviewing the 12 Principles of Agile Software isn't a DevOps skill exactly, but it's a good investment of time -- not only to refresh your sense of how DevOps came about, but also to recommit yourself to the essential thinking that makes DevOps an important part of the IT infrastructure.
