Raising cybersecurity awareness – how to enrich the experience

Delivering cybersecurity awareness at the most teachable moment toughens defences against cybercriminals. Tools can catch deepfakes too.
16 November 2022

Helping hand: once again, AI is boosting cybersecurity. Image credit: Shutterstock.

Have you ever received cybersecurity training? Was it a one-off event or is the process ongoing? The type of training employees receive can have a profound effect on the overall level of cybersecurity awareness within a firm. Adversaries are becoming far more targeted in their approaches, and to stay secure, companies need to step up their game.

“Criminals are looking for more information; they use social media channels and can spend eight hours or more preparing an attack,” Daniel Hofmann, CEO and co-founder of Hornetsecurity, told TechHQ. Each year, the firm – which grew out of an office in Hanover, Germany, in 2007 and today helps more than 50,000 companies across the world – presents its findings on the current cyberthreat landscape.

Hornetsecurity processes more than two billion emails every month as part of its cybersecurity operations, which adds weight to the analysis. And while it’ll be little surprise to many to see that phishing remains top of the list of email attack methods, the observation that cybercriminals are investing more time in crafting a single campaign is noteworthy.

And once they have gathered their information, the next step for bad actors is to identify a way in – or more likely, several. Hofmann points out that it’s straightforward to find out whether a target company is using Zoom, Salesforce, Microsoft 365, and other popular business communications tools. Attackers will strike up conversations across multiple channels in an attempt to wrong-foot their prey.

M&A concerns

Organizations are particularly vulnerable to phishing attacks, and social engineering more generally, during mergers and acquisitions (M&As). Adversaries will track information in press releases and note whether companies have engaged consultants to help with the transition. Firms can easily be caught off guard in this situation, as M&A activity legitimately brings new faces into communications.

Cybercriminals will gladly exploit this period of uncertainty and deploy their psychological tricks. Adversaries will also probe business operations to discover how payment processes work. It’s critical that finance teams, in particular, follow good working practices such as independently verifying invoice details and requiring multiple approvals.
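To make that advice concrete, here is a minimal sketch of what an independent verification check could look like in code. The vendor records, IBANs, and two-approver threshold are illustrative assumptions, not a description of any particular finance system.

```python
from dataclasses import dataclass

# Hypothetical vendor master record for illustration only;
# in practice this data lives in an ERP or payments system.
VENDOR_MASTER = {"ACME GmbH": {"iban": "DE89370400440532013000"}}

@dataclass
class Invoice:
    vendor: str
    iban: str
    amount: float
    approvers: set[str]

def payment_allowed(inv: Invoice, min_approvers: int = 2) -> bool:
    """Block payment unless bank details match the record on file
    and enough people have independently approved the invoice."""
    on_file = VENDOR_MASTER.get(inv.vendor)
    if on_file is None or on_file["iban"] != inv.iban:
        return False  # unknown vendor or changed bank details: verify out-of-band
    return len(inv.approvers) >= min_approvers

# An invoice with unfamiliar bank details and a single approver is rejected.
suspicious = Invoice("ACME GmbH", "DE44500105175407324931", 98500.0, {"cfo"})
print(payment_allowed(suspicious))  # -> False
```

The point of the guardrail is that a convincing email alone can never trigger a payment; the request has to survive a check against data the attacker does not control, plus a second pair of eyes.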

The application of time pressure, stressing the need for confidentiality, and telling an employee that they have made a mistake (and need to correct it), are all signs that a campaign could be in progress. But unless staff members have strong cybersecurity awareness, they may not recognize those tells in the moment. That is why the policy guardrails touched on earlier need to hold firm, and it points to a weakness in conventional cybersecurity training.
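Those pressure tactics are simple enough that even crude automation can surface them for a second look. The sketch below is a toy keyword heuristic; the phrase lists and threshold are assumptions for illustration, not how any commercial detection engine works.

```python
import re

# Illustrative phrase lists only; real engines use far richer signals.
URGENCY = [r"\bimmediately\b", r"\burgent\b", r"\bwithin the hour\b", r"\bright away\b"]
SECRECY = [r"\bkeep this confidential\b", r"\bdo not discuss\b", r"\bbetween us\b"]
BLAME = [r"\byou made (a|an) (mistake|error)\b", r"\bcorrect (it|this) now\b"]

def social_engineering_score(body: str) -> int:
    """Count how many classic pressure tactics appear in an email body."""
    text = body.lower()
    hits = 0
    for group in (URGENCY, SECRECY, BLAME):
        if any(re.search(pattern, text) for pattern in group):
            hits += 1
    return hits  # two or more tactics in one message deserves a closer look

email = ("Please keep this confidential. You made a mistake on the last "
         "invoice and need to correct it immediately.")
print(social_engineering_score(email))  # -> 3
```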

“You need to change the mindset and deliver security awareness at the most teachable moment,” said Hofmann. “And that’s what we do with our simulations.” Hofmann is a big believer in raising cybersecurity awareness as employees go about their normal working routine.

Awareness engine

Hornetsecurity has developed a suite of tools that gives SecOps teams a way to quantify the level of cybersecurity awareness within their organization. What’s more, managers can benchmark performance against scores within their industry or sector. The platform has the capability to send simulated emails to employees – generated by an educational spear phishing engine – which sit alongside legitimate messages.

The AI-powered approach, fed with OSINT and information from other sources, replicates realistic scenarios. The system builds scores based on whether staff report the rogue messages to their IT department or fail to spot them and click. When an employee does click, the tool delivers timely advice on what to look out for – educating staff at exactly the right moment, when the lesson is most likely to stick.
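Hornetsecurity has not published its scoring mechanics, but as a rough sketch, a per-employee awareness score might reward reported simulations and penalise clicks more heavily than messages that were simply ignored. The formula and weights below are assumptions for illustration.

```python
def awareness_score(reported: int, clicked: int, ignored: int) -> float:
    """Toy awareness score in [0, 1]: share of simulated phish reported,
    with clicks penalised more heavily than messages simply ignored."""
    total = reported + clicked + ignored
    if total == 0:
        return 0.0
    return max(0.0, (reported - 2 * clicked) / total)

# Eight reports, one click and one ignored message out of ten simulations.
print(awareness_score(reported=8, clicked=1, ignored=1))  # -> 0.6
```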

As cybersecurity awareness scores rise and staff become less susceptible to attacks, the AI-scheduler can dial down the frequency of the simulations. But it will never completely turn off, ensuring that the company’s defences remain strong. The tools provide a forward-thinking approach to protecting firms against phishing campaigns and other messaging attacks.
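The scheduling behaviour described here can be pictured as a simple mapping from score to interval: the gap between simulations stretches as awareness improves, but never goes to infinity. The interval bounds below are illustrative assumptions, not the product's actual settings.

```python
def days_until_next_simulation(score: float,
                               min_interval: int = 7,
                               max_interval: int = 60) -> int:
    """Stretch the gap between simulations as awareness improves
    (score in [0, 1]), but never switch them off entirely."""
    score = min(max(score, 0.0), 1.0)
    return round(min_interval + score * (max_interval - min_interval))

for s in (0.1, 0.5, 0.9):
    print(s, days_until_next_simulation(s))  # roughly 12, 34 and 55 days
```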

Deepfake detector

Another concern, alongside persistent email threats, is the emergence of deepfakes, which can tip the balance in favour of cybercriminals. Synthetic video and audio content designed to mislead is becoming more commonplace. But here too there are tools that could help to protect employees. This week, Intel announced that it had ‘productized FakeCatcher, a technology that can detect fake videos with a 96% accuracy rate’.

The system, which has been optimized using deep learning, works by looking for subtle changes in the colour of pixels within image frames that correspond to blood flow under the skin. Deepfakes are then revealed as the signals are translated into spatiotemporal maps. Only real human subjects exhibit genuine blood-flow signals, which are picked up from video using a technique known as photoplethysmography (PPG).
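FakeCatcher's implementation is not public, so the sketch below only illustrates the underlying idea of remote photoplethysmography: average the green channel over a face crop in each frame and band-pass the result around plausible heart rates. It uses NumPy and SciPy, assumes face detection and frame extraction have already happened, and leaves out the spatiotemporal maps and deep-learning classifier that Intel describes.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rppg_signal(face_frames: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Crude remote-PPG estimate: mean green intensity of a face crop per frame,
    band-pass filtered to typical heart-rate frequencies (0.7-4 Hz).

    face_frames: array of shape (n_frames, height, width, 3) in RGB order.
    """
    green = face_frames[..., 1].mean(axis=(1, 2))   # one value per frame
    green = green - green.mean()                    # remove the DC component
    low, high = 0.7, 4.0                            # roughly 42-240 beats per minute
    b, a = butter(3, [low, high], btype="band", fs=fps)
    return filtfilt(b, a, green)

# A genuine face should show a periodic pulse in this signal; frames from a
# deepfake tend to lack a physiologically consistent rhythm.
```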

“There are several potential use cases for FakeCatcher,” writes Intel. “Social media platforms could leverage the technology to prevent users from uploading harmful deepfake videos. Global news organizations could use the detector to avoid inadvertently amplifying manipulated videos. And nonprofit organizations could employ the platform to democratize detection of deepfakes for everyone.”