By Lionel Snell
The battle between organisation and chaos rages across the cyber universe. Whose side are you on? Business and government will of course insist that they are the forces of stability and organisation, and that the cyber criminals and saboteurs are the forces of chaos. But the irony is that in many ways it is business and government that are in chaos, while cybercrime grows increasingly organised – as we shall see.
Let’s start with data from a supposedly reliable source: the World Economic Forum. In 2018 it estimated the global cost of cybercrime at $600 billion, and predicted it would reach $3 trillion by 2020. Meanwhile Gartner estimates that total global spending on cybersecurity in 2019 was $124 billion. Compare how much we are spending with how much we are losing, and it looks like a very bad bet. Or, as Vikram Phatak, founder of NSS Labs, points out from an R&D perspective: “The bad guys are able to fund their research at a rate about five times of what the good guys are. This does not bode well for the future.”
Phatak also describes a serious skills shortage (heck! have the good ones gone off to work for the bad guys?) and how, for all the good intentions of DevOps, there are people doing rapid coding via Google search, “grabbing an open source repository that may or may not have been backdoored by the North Koreans, Chinese, Russians, Iranians – pick your adversary. We’re literally embedding the next attack vector in the code that we’re developing today.” The upshot: “What’s our current posture? If you ask most CSOs where do you stand, they probably can’t tell you.” And that is before he has even got on to the perils of IoT, 5G and threats against the physical world of infrastructure.
Have you heard of the Dunning-Kruger Effect? People who are incompetent are so incompetent that they don’t know they’re incompetent: they lack the tools to judge their own capabilities. The data bears this out. Performers in the bottom quartile did terribly, yet rated themselves as having done rather well – see diagram.
Conversely, people who are highly competent assume that, if things are easy for them, they will be easy for others. “This happens in Silicon Valley all the time, with software products in general – right?” The best minds overestimate the ability of others.
It was not all bad news, however. Versa Networks’ FlexVNF did achieve the NSS Labs’ coveted “Recommended” rating on the strength of its 99% Exploit Block Rate and high scores in both SSL/TLS Functionality and Total Cost of Ownership criteria.
In summary Vikram Phatak says: “We’re falling behind, and the bad guys have got the upper hand. We keep on making things worse. The tools that we rely upon are incredibly complex and are not getting any easier. Hopefully, with some of the AI stuff it will be, but that’s no guarantee”.
As in any detective work, it helps to start with motive: what are the bad guys wanting to achieve? If you ask someone from the US Secret Service – part of the Department of Homeland Security – you might expect them to major on political threats and cyber war. But Tom Edwards insists that money is still the biggest driver: “the Secret Service has a dual mission. Not only do we protect our president and vice president, family and foreign heads of state, most people don’t know that we also investigate cyber-enabled financial crime. We started investigating financial crime in 1865 with counterfeit currency, and we have evolved with the criminals all the way till today.”
Tom says: “The cyber actors that we deal with are organised groups all over the world, and it’s profit driven, whether public or private sector, through ransomware, through business email compromise. They’re going after the money, they’re going after credit card data, they’re going after personal identifiable information, and then turning that into profit. The next thing would be credential theft. They get into cloud-based servers and steal that data so they can monetise it on the dark web or in other places.”
Ted Ross, CEO and founder of SpyCloud, knows all about this. He has a team of researchers that interacts with criminals, social-engineering data out of them as they steal it, so that SpyCloud can turn it around to its customers and prevent account takeovers. “We started this company three years ago; we have over 13,000 breaches in our database right now. Almost 80 billion assets. Basically, if you have an online identity, we probably have it in our database – which means the criminals have it. They’re highly organised. They build their own databases”.
People underestimate criminals’ ability to be creative. They assume that, once stolen information reaches the deep and dark web, companies like Akamai can detect it and protect their customers. But in fact the criminals may keep analysing that data for up to two years before it surfaces: “First thing, they’ll separate the data into high-value targets – a special category for very sophisticated targeted attacks. They leave all the rest for later, and run automated attacks.” One of his customers, a financial organisation that deals with a lot of cryptocurrencies, said that 10 per cent of the attacks on their network are targeted – and cause 80 per cent of their losses.
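SpyCloud’s methods are proprietary, but the underlying idea – checking a credential against a corpus of breached data without exposing it – can be sketched using the public Have I Been Pwned “Pwned Passwords” range API (a different, free service, used here purely for illustration). Its k-anonymity scheme means only the first five characters of the password’s SHA-1 hash ever leave your machine:

```python
import hashlib
import urllib.request


def sha1_split(password: str):
    """Return the 5-char hash prefix sent to the API and the suffix kept local."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]


def match_count(range_body: str, suffix: str) -> int:
    """Scan a 'SUFFIX:COUNT' response body for our suffix; 0 means not found."""
    for line in range_body.splitlines():
        candidate, _, count = line.strip().partition(":")
        if candidate == suffix:
            return int(count)
    return 0


def breach_count(password: str) -> int:
    """How many times this password appears in known breaches (network call)."""
    prefix, suffix = sha1_split(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        return match_count(resp.read().decode("utf-8"), suffix)
```

A non-zero `breach_count` is a signal to force a password reset before the automated account-takeover attacks described above can use the credential.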
So what technical solutions point the way forward? Yes, there is much talk about AI and Machine Learning techniques for automating, accelerating and refining attack detection – but remember what Phatak said about R&D resources. The bad guys are probably already applying these techniques. Maybe a more obvious approach is to focus on visibility? This is a natural human instinct: if we hear a suspicious noise in our house in the middle of the night, most people instinctively switch on the light, even though it could make them visible to the invader. Once humans can see what is happening, we can act fast and often very effectively.
Paul Kraus, Vice President of Engineering for Cybersecurity at NetScout Systems, makes this point: “How do you know what to monitor, or how do you put value on the assets in your organisation, if you don’t even know they’re there? The first aspect is gathering up the inventory of actually what you have. Second of all, can you actually take statistics? Can you look at the changes? Can you monitor to the level that your organisation can accept the risk for that asset being compromised? … I asked a group of security engineers ‘when the dev ops team in your organisation moves something to the cloud, are you involved?’ Out of 300 people, a half dozen raised their hand. Does the security team even understand what’s out there? Does the IT team even understand what’s out there? Let’s go back to visibility. That’s the key bit for understanding your risk posture. Without that visibility, you really have no way.”
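Kraus’s point reduces to a simple set comparison: the assets the security team tracks versus the assets actually deployed. A toy sketch (asset names invented for illustration):

```python
def inventory_gap(deployed: set, tracked: set) -> dict:
    """Compare what is actually running against what security knows about."""
    return {
        # Deployed but untracked: invisible attack surface, the risk Kraus describes.
        "unmonitored": deployed - tracked,
        # Tracked but gone: stale records that distort the risk picture.
        "stale": tracked - deployed,
    }
```

In practice the `deployed` set would come from continuous discovery (network scans, cloud-provider APIs) rather than a static list – the hard part is keeping it current, not the comparison itself.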
This is a subtle shift in focus: from putting all our efforts into software or devices to protect us, to thinking more about what humans need to fight back. For a soldier, binoculars with night vision might be worth more than a gun. Ted Ross takes this up: “When you install a network, you install security with it. It’s part of the install. Hopefully, that’s happening in most places. When you start to actually implement a security practice, I think most people forget about the human element. The weakest link is the human. I think we are all wired to start with the devices, but I think we should also be starting with human training and educating them on concepts like zero trust, and not to click on an email attachment. If you get an email from the CEO asking you to buy gifts at Apple, call your CEO. Just pick up the phone and call him, and see if he actually sent it. Really basic things”.
Phatak adds: “Cybersecurity has become like gym membership. People buy it, it makes them feel good, but they’re not really deploying and using it properly. They’re not actually going in and doing what they need to do.”
Michael Levin, formerly of the United States Secret Service and Deputy Director at the US Department of Homeland Security, is now CEO of the Centre for Information Security Awareness, helping organisations build a culture of security. He emphasises the social-engineering aspect: “This is one of the things that we educate companies and employees on. It’s not just the phishing email, it’s the different ways employees can be socially engineered. It could be over the phone, it could be in person, it could be through social media. You have to come up with mechanisms and reminders for the employees in the organisation on a daily basis to be on the lookout”.
This all paints a depressing picture. If we cannot offer cheer, we can at least offer advice:
- Forget the old idea that hackers’ fondness for low-hanging fruit means they are lazy. As Michael Levin says: “what is the first thing that a smart car burglar should do? Check the door handle to see if it’s unlocked. Many hackers are doing very good reconnaissance, finding all the open windows on our networks.” They are not only educated, they are highly motivated. For most employees security is a secondary issue; for hackers it’s their job.
- Think about the relationship between your social media and your business. One company CEO tweeted to his friends that he was heading to Beijing for a few days. A few days later, his CFO got an email – purportedly from him, saying he’d been arrested by the Chinese government and needed money. They sent the money because there was enough detail on social media to make it credible.
- Be suspicious of any sense of urgency. As Levin says: “Nine times out of 10, it forces people to make decisions very quickly and it often results in a fraud.” If a message says “hurry, hurry”, don’t just click. Phone and check, or go direct to the original website.
- “Support an open culture about cyber threats,” says Tom Edwards. If employees are too scared of the boss to confess to clicking on a phishing email, then the company is in trouble. Michael Levin agrees: “There should be some way for them to come out and let the organization know they might have done something wrong without being penalized.”
- If your security thinking is focused only on technology, you will be thinking mechanically. Think instead about the human factors, the hacker’s motivation and the victim’s weak points, and you will have a much broader view of the battlefield.
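Levin’s warning about urgency can even be roughly automated. A deliberately crude, hypothetical filter – the keyword list is invented for illustration; production mail-security systems use far richer signals – might flag pressure-tactic language like this:

```python
import re

# Illustrative-only cues for the classic social-engineering pressure tactics:
# urgency, secrecy, and untraceable payment methods.
URGENCY_CUES = [
    r"\burgent(ly)?\b",
    r"\bimmediately\b",
    r"\bright away\b",
    r"\bwithin \d+ (minutes|hours)\b",
    r"\bdo not tell\b",
    r"\bgift ?cards?\b",
    r"\bwire transfer\b",
    r"\baccount (suspended|locked)\b",
]


def urgency_score(message: str) -> int:
    """Count how many distinct pressure cues appear; higher = more suspicious."""
    text = message.lower()
    return sum(1 for pattern in URGENCY_CUES if re.search(pattern, text))


def looks_suspicious(message: str, threshold: int = 2) -> bool:
    """Flag a message once it stacks multiple pressure tactics."""
    return urgency_score(message) >= threshold
```

A filter like this only buys time for the human check that the experts above recommend: picking up the phone and verifying the request.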