Amid the perennial debates around the skills gap that has plagued the information security space, the topic of cybersecurity education usually makes an appearance. But just how severe is the shortage of infosec professionals? At present, there is a 25 percent gap between the demand for cyber talent and the existing supply.
According to a report by Peninsula Press, in 2016 more than 209,000 cybersecurity jobs in the U.S. were unfilled, "and postings are up 74 percent over the past five years." Within months, global demand for cybersecurity professionals is projected to reach approximately six million.
How did we get here? In the late eighties, the Internet was still so new that most people had not even heard of it, but plenty of early adopters saw the potential in connecting the world's computers. Most of these pioneers homed in on the positive, productivity-enhancing aspects of the new technology, while others set out to test its limits. On November 2, 1988, a programmer named Robert Tappan Morris released one of the first malware programs in history, reportedly out of curiosity about how quickly self-replicating code could spread.
The worm, eventually dubbed the Morris Worm, infected nearly 10 percent of the computers connected to the internet within a span of days. The result was major damage, widespread outages, the first conviction under the Computer Fraud and Abuse Act for Morris, and the creation of the CERT Coordination Center, funded by the Defense Advanced Research Projects Agency (DARPA). The center gave experts a central point for coordinating responses to network emergencies, laying the foundation for modern information security.