
12/13/2017 10:30 AM
Gary Golomb
Commentary

Automation Could Be Widening the Cybersecurity Skills Gap

Sticking workers with tedious jobs that AI can't do leads to burnout, but there is a way to achieve balance.

According to Cybersecurity Ventures, the cybersecurity skills shortage is now expected to hit 3.5 million positions by 2021 — a huge jump from current estimates of 1 million job openings.

To help compensate for the growing shortage of talent, the cybersecurity industry is embracing artificial intelligence and automation to fill the gap. But can automation actually make the skills gap even greater? Unfortunately, yes — but security can still find a balance.

The Leftover Principle of Automation
The concept of mechanizing human tasks to drive efficiency has been studied since the advent of industrial automation. The primary goal is to automate as much as possible, eliminating human decision making because human decisions are often the most frequent source of error in a process. Any task not assigned to machines is "left over" for humans to carry out.

The problem with this theory, especially in cybersecurity, is that only very well-understood (relatively simple) processes can be automated, meaning the tasks left for security teams are the hard tasks that can't be automated. These difficult tasks require security professionals who have experience and deep domain knowledge. 

This is exacerbating the vicious cycle of security analyst burnout we currently face: 

  • Tasks that provide a sense of completion/satisfaction are automated.
  • Security analysts are increasingly asked to work on tedious, arduous tasks that lead to burnout.
  • Analysts leave to find greener pastures, leaving the security operations center shorthanded.
  • Company struggles to find talent to fill the gap.
  • When security management finally hires someone, the new employee is given the same tedious, arduous tasks that lead to burnout.
  • Wash. Rinse. Repeat.

Lessons from the '90s and the IT Community
This isn't the first time this phenomenon has reared its head in the technology world. We saw a similar cycle in the IT/sysadmin world 25+ years ago. The sysadmin of the '90s was nearly omniscient when it came to domain knowledge of technology and IT systems. This was driven by need: IT professionals had to be able to fix every problem across the technology infrastructure, and that infrastructure was nowhere near as reliable and interoperable as it is today.

As technology advanced, the need for all-knowing IT admins lessened. This, in turn, reduced the experience and accumulated knowledge gained from fixing systems and making sure they work together.

Today's IT professionals no longer implicitly acquire deep domain expertise on IT infrastructure in the same ways; however, the analogy also ends here for two significant reasons:

  1. While admins always have to contend with users who break systems unintentionally, they are not faced with armies of users distributed around the world with the sole intention of sabotaging their systems. Simple repetitive tasks can be automated. Accurately discerning behavior and intention within environments that are difficult or impossible to accurately model in the first place is a fool's quest.
  2. Automation of IT infrastructure (DevOps) has led to many positive outcomes, such as requiring fewer people to manage more systems. This works for knowledge domains that slowly evolve and/or are hyper-focused on a specific component of a system. In security, however, the knowledge domain is not dictated by just "security practices" (quite limited), but rather the security professional must be knowledgeable about how technologies are abused across all the legitimate technologies and architectures adopted in the enterprise, most of which evolve extremely rapidly.

Compensating for Automation
Where does this leave the security industry? Is it possible to find a balance? The offshoot of the Leftover Principle is called the Compensatory Principle. This theory says that there are tasks that humans do well that machines don't. People and machines should focus on what they do well, compensating for each other's shortcomings. 

Attempting to automate humans out of cybersecurity is detrimental to our industry and destined to fail, primarily because we're not facing a tech opponent — we're facing human adversaries who go to great lengths to find weaknesses to exploit. Because so much is automated now, security analysts simply aren't required to go to the same depths, which is creating an even wider and more detrimental gap between attackers and defenders.

What's an example of "leftover" work today? The work that nowadays we call hunting — the responsibility of the team to compensate for the ineffectiveness of automated systems — is one example. The inability of most teams to hunt has created a perception that work isn't getting done because there's no talent to do it. The reality is that automation is making matters worse in this context, because effective hunting is based on the analyst having learned the more fundamental techniques while completing more "simple" tasks.

What's the solution? How do we embrace machine learning and automation without making our situation worse?

Organizations need to focus automation on the tedious and error-prone tasks that drive security analyst burnout, while leaving jobs that require more discernment to analysts.

For instance, automating parts of the alert investigation process can have a huge impact on security analyst productivity. Automating tasks such as tracking a device as it moves across the network, and identifying infected devices by their human owners and behaviors rather than by ephemeral identifiers like IP addresses (which require more human work to identify the owner), can be enormously helpful for analysts.
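To make the idea concrete, here is a minimal sketch of that kind of enrichment, assuming Python and hypothetical log sources and field names (DHCP lease records plus an asset inventory that maps devices to owners). It resolves an alert's ephemeral IP address to the device, and the human owner, that held that address at the time of the alert, so the analyst starts from a stable identity rather than an IP.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional

    # Hypothetical records: in practice these would come from DHCP logs,
    # an asset inventory, and a directory service export.
    @dataclass
    class DhcpLease:
        ip: str
        mac: str
        start: datetime
        end: datetime

    @dataclass
    class Asset:
        mac: str
        hostname: str
        owner: str  # the human user the device is assigned to

    def resolve_alert_identity(alert_ip: str,
                               alert_time: datetime,
                               leases: List[DhcpLease],
                               assets: List[Asset]) -> Optional[Asset]:
        """Map an alert's ephemeral IP to the device (and owner) that
        held that IP when the alert fired, not at lookup time."""
        # Find the lease that covered the alert's IP at the alert time.
        lease = next((rec for rec in leases
                      if rec.ip == alert_ip and rec.start <= alert_time <= rec.end),
                     None)
        if lease is None:
            return None  # no lease record: fall back to manual triage
        # Pivot from the (stable) MAC address to the asset/owner record.
        return next((a for a in assets if a.mac == lease.mac), None)

    # Usage: an alert on 10.0.4.23 at 09:12 resolves to "jdoe-laptop / jdoe",
    # even if that IP address has since been reassigned to another device.

This is only an illustration of the enrichment step, not a full pipeline; the point is that the machine handles the identifier bookkeeping while the analyst applies judgment to the resolved entity.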

Like many of the overhyped features we've seen over the past couple of decades, from anomaly detection (early 2000s) to analytics (late 2000s), automation is not a cure-all for today's cybersecurity woes. Worse, without a clear understanding of, and strategy for, how automation will improve the work of your employees, it may exacerbate some of your challenges in ways that could be difficult to compensate for later.

Gary Golomb has nearly two decades of experience in threat analysis and has led investigations and containment efforts in a number of notable cases. With this experience, and a track record of researching and teaching state-of-the-art detection and response ...
 
