AI intervenes as cyber attackers hit soft-target schools

Schools are soft targets for scammers, but AI is changing that.
Jul 26, 2021
AI
The technology is always looking for anomalies

Education has one of the highest click rates for spam of any sector, which means schools are among the most vulnerable to nasties like denial-of-service attacks, ransomware, or being co-opted into a bitcoin-mining botnet.

Why is that? Predictably, it has a lot to do with time-poor staff trying to do too many things at once, combined with the growing sophistication of the scammers.

“The reality is that too often teachers, staff and pupils around the world, busy doing their jobs or learning, are faced with very convincing and sophisticated social engineering email attacks. The people on the receiving end of these emails will not make every decision with security in mind. Training on how to spot them is important, but some malicious emails are now virtually indistinguishable from genuine communication and there are no hard and fast rules for how to identify them,” says Hayley Turner, Director of Industrial Security, APAC at Darktrace.

Attackers view the public sector as lagging in cyber maturity and suffering from a cyber skills shortage. Schools are also culturally places of open collaboration, sharing ideas and innovation, making them soft targets.

In the case of Girton Grammar, the school’s IT team of only four people oversaw all users, devices, and systems across the entire institution. This is not uncommon in the education sector. 

The school deployed self-learning AI technology across its entire digital business to unify its security strategy and provide full visibility over its digital world.

Using unsupervised machine learning, the AI learns what is ‘normal’ across the school’s infrastructure and uses this understanding to spot threats as they emerge.
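The general idea can be illustrated with a minimal sketch: train an anomaly detector on routine activity only, then flag behaviour that deviates from that baseline. The feature names and figures below are hypothetical, and the model (scikit-learn's IsolationForest) is a stand-in for illustration, not Darktrace's actual system.

```python
# Illustrative only: "learn what is normal, flag deviations" with a generic
# unsupervised anomaly detector. All data here is synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical per-device traffic features:
# [connections per minute, MB uploaded, distinct hosts contacted]
normal_traffic = rng.normal(loc=[20, 5, 8], scale=[4, 1, 2], size=(500, 3))

# Fit on routine activity alone -- no labelled attacks needed.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

# A device suddenly contacting many hosts and pushing out far more data
# (ransomware-like lateral movement) deviates sharply from the baseline.
suspicious = np.array([[180.0, 60.0, 95.0]])
print(model.predict(suspicious))  # -1 marks an anomaly, 1 marks normal
```

In practice the "normal" model would be learned continuously per user and per device, but the principle is the same: no signature of a known attack is required, only a deviation from the learned baseline.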

“Self-learning AI technologies act as a force multiplier to the school’s team, handling the complexity and network noise, so limited security staff can focus on the most pressing security incidents or strategic activities like staff training. This is what robust cyber security looks like.

 “In an ideal world, organisations are equipped with technology that interrupts attacks as they emerge – before harm can be done. We are in a new era of cyber-threat, where attacks move too quickly for human teams to contain. Once ransomware is able to spread, it’s usually too late. That’s why thousands of organisations today have handed over the keys to AI to respond to emerging attacks on their behalf,” says Turner.  

In April 2020, Darktrace’s AI stopped attackers who were trying to encrypt digital PhD research papers. 

A university in the UK was hit with an attempted ransomware attack – a kind of malware attack that strikes in seconds and blocks access to a computer system until a sum of money is paid to hackers. Hackers tried to gain access to staff and students’ computers via an externally facing server – using a mechanism typically used by IT teams to remotely diagnose and resolve problems on employees’ computers. 

Once the attackers gained a foothold, Darktrace's AI instantly spotted them attempting to move laterally and log into devices in order to encrypt files, later revealed to be PhD students' research papers. Having identified the abnormal behaviour associated with the ransomware attack, Darktrace's AI interrupted the malicious activity precisely, without disrupting the university's normal practices.

“The education sector is a challenging industry to secure, particularly when the physical wall around these networks has dissolved in the global transition to remote learning. While the volume of traffic and the complexity of the digital environment that needs to be monitored and protected rivals many large businesses, the resourcing for IT security in schools is much more limited.”

Facilitating remote learning has resulted in a widespread adoption of new collaboration tools – meaning new data movements and new holes in security defences. Cyber-criminals are opportunistic and have taken advantage of this shift, disrupting and extorting schools with fast-moving ransomware that infects their digital systems in a matter of seconds. At a time when schools cannot afford any disruption to learning, hackers know they are more likely to pay the ransom.

“At some education institutions, tackling this challenge head on requires a mindset change. That’s why a growing number of schools and universities are investing in artificial intelligence to keep pace with fast-evolving changes in their digital systems, and using AI to interrupt fast-moving attacks like ransomware,” Turner says. 

There are some safety steps that schools could take immediately and cheaply, like adopting the Australian government's Essential Eight security hygiene practices. Anyone with legitimate access to the school's network, such as teachers and students, needs to understand basic cyber hygiene to avoid accidentally exposing the network to risk. Running cyber security awareness training sessions for teacher and student groups is always good practice.