The title of this piece is quite obvious, but it is also an unappreciated fact. Consider for a moment the change we have seen over the last 30 years: access to cyberspace was scarce, often limited to enterprise users such as governments, educational institutions, and the largest corporations, whereas today, billions of users treat the Internet as a basic need for living – just like electricity – and the access points into this domain continue to grow. The entire Internet of Things (IoT) wave may very well clobber us as, even in 2017, we cannot figure out whether there will be 20, 30, or 50 billion devices by 2020. (Don’t believe me? Just do a quick Internet search. Reports published over the last 18 months can’t make up their minds.) And we are not making our lives any easier when 99 percent of computers are considered vulnerable and attackers are just plain better and faster than defenders’ ability to protect a network.

In brief, technological advancement does not seem to be the problem; we are pretty good at that (like 3D printing an ICBM, for example). Dealing with the technologies we create is the bigger problem. (The thought of your next-door neighbor having the capability to “print up” a ballistic missile delivery vehicle should worry you.) So, to better address this problem, we need to ask: how do we use our technology? And perhaps more specifically: how much do we rely on our technology?

Consider this: up until the mid-2000s, we used our “cellular” phones to make calls, maybe send text messages, and little else. (By the way, bonus points if you know the difference between cellular, mobile, and network, and very impressive if you know what “handy” means!) Today, a smartphone allows you to place calls, send multimedia messages, take pictures, watch videos, listen to music, make financial transactions, understand your voice, tell you what your heart rate is, and so much more. Smartphones can even be used to hack networks. (Long gone are the days when you were a cool geek amongst your friends because you knew a few GSM network codes and could do some funny things on your phone.)

It is important to note that there is something much more valuable than money: our information (remember from the previous piece: network security + information security = data security). And yet a paradox exists: we would rather not give up this valuable currency, but we continue to do so as if addicted to a bad habit. I would suggest a main reason for this is that the general public – and perhaps even so-called “experts” – do not have a uniform level of understanding of “cyber” issues. This lack of uniform understanding helps explain why human error is still responsible for 95 percent of cyber incidents and why, for some time now, malicious actors have shifted away from exploiting system vulnerabilities toward exploiting users. And remember, if you cannot get at your target directly, you can always take a different route, such as going through a third party that has trusted access – a tactic we are seeing more often as cyber incidents attributed to business partners are significantly on the rise. (This is largely because both individuals and organizations do not know the details of the cyber policies in place at the third party.) So, let’s think about that for a moment: we don’t really know what we’re doing and we know that we have problems, but now you’re telling us that we have to worry about our third party’s problems too?! Yes.
This comment goes back to the title of this piece: technology changes much faster than humans do. For well over 30 years, we have been trying to address our cyber problems through a patchwork of technical solutions, failing to appreciate the legal and social frameworks that have been in place for hundreds of years and failing to recognize that most of the cyber challenges we face are just an extension of some pre-existing conflict already happening in the physical domain. Furthermore, the mass hysteria over “cybersecurity” now in 2017 requires some context. If one examines the core of the issues we face today, such as networks being inherently vulnerable, they are not all that different from the ones professionals faced in the 1980s, except that many of the past lessons have been ignored and the magnitude and complexity of today’s challenges are that much more overwhelming.

Therefore, while most humans are busy adapting to the Internet by changing our attitudes towards shyness, confidence, knowledge, imagination, and connections to people, or how we consume news, or by feeding new vices like gambling and social media, malicious actors are having a field day taking advantage of all these psychological changes while still having the added benefit of inherently vulnerable networks on their side. Politely put, things are a mess in cyberspace right now.

So, what is the solution to this mess? Slowing down operations, taking the time to sift through our networks, and figuring out what is going on is one solution. But I am a realist: we are not going to slow down operations despite all the social talk about “leisure” being needed in life. If anything, I would suggest we are adding more and more onto our plates. (Totally unscientific study: ask yourself if you have more or less leisure time in your life as time passes.)

My solution begins here: take seriously the “people factor” in the cybersecurity equation. I get mortified when I hear people say, “There is no point in training our people because they will just click random links anyways.” NO! All that means is that somebody is not taking personnel training seriously. That comment is no different than saying, “There is no point in teaching my kid anything because kids will be kids.” How well would that approach turn out for you? How well does that approach turn out for society? I am not suggesting that you treat your personnel as children, but I am suggesting you support them through the necessary cyber education, which in turn supports your enterprise (and is probably cheaper in the long term).

The daunting challenge we face is this: technology changes wickedly fast, but humans do not. We readily accept the benefits of technology but rarely have a clue about the associated consequences, which leaves us struggling to cope when things go wrong (and “wrong” feels like a permanent state right now). Remember, humans are still at the core of most of the decisions being made, whether that decision is to click a link or to set a national cybersecurity policy. This simple fact is all the more reason why people need to be informed and educated across the board. In the next piece, I will take a look at some of the macro-challenges we face, some of which began in 1648.
About the Author: George Platsis has worked in the United States, Canada, Asia, and Europe as a consultant and an educator and is a current member of the SDI Cyber Team (www.sdicyber.com). For over 15 years, he has worked with the private, public, and non-profit sectors to address their strategic, operational, and training needs in the fields of business development, risk/crisis management, and cultural relations. His current professional efforts focus on human factor vulnerabilities related to cybersecurity, information security, and data security by separating the network and information risk areas.

Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor and do not necessarily reflect those of Tripwire, Inc.