City, University of London

EMPHASIS researchers at City, University of London:


Prof Tom Chen is a Professor in Cyber Security in the Department of Electrical and Electronics Engineering. His research interests are in malware, network security, and cyber-terrorism.



Dr Ali Raheem is a Post-doctoral Research Fellow in the Department of Electrical and Electronics Engineering at City, University of London. His research interests are in Security Protocols: Internet of Things (IoT), Wireless Sensor Networks, Cloud Computing, Wireless Networks, M2M systems, Pervasive healthcare applications and systems, Peer-to-Peer Networks, Long Term Evolution (LTE), Locator/ID Separation Protocol (LISP), Malware, Virtualisation and Verification.

GDPR-based extortion is a dangerous myth

After the third body, the series of murders becomes less “mysterious”: we start getting hints of a dodgy business deal (a house, or a painting) or a guilty family secret from over 20 years ago. Someone has been blackmailing someone else over this, rather than resolving the issue or going to the police. This detective-story trope must be plaguing the minds of some GDPR story writers at the moment – forgetting that the blackmailer is usually the third, if not already the second, murder victim.

Looking to feed the nervous sentiment over the upcoming GDPR, or hunting for follow-up threats to crypto-ransomware, the information security trade press is currently producing many stories about GDPR extortion. These stories do not just present extremely unlikely scenarios; by their prominence, they actually introduce an unnecessary risk.

Emphasis member Eerke Boiten responds to these stories with an article in Computing.

The Kent Cyber Security Centre

Academics from the School of Computing:

  • Julio Hernandez-Castro is a Professor in the School of Computing. His main research areas are Computer and Network Security, Cryptography and Cryptanalysis, Steganography and Steganalysis, Data Loss Prevention and RFID Security.
  • Budi Arief is a Senior Lecturer in the School of Computing. His research interests include cybercrime (especially in understanding the human aspects involved), computer security, and the Internet of Things, with a strong overarching element of interdisciplinary research.

We have two Research Associates working on the EMPHASIS project: Orcun Cetin and Osama Abu Oun.

Orcun Cetin is a Research Associate at the University of Kent’s School of Computing. His research focuses on the economics and human aspects of cybersecurity, where he uses qualitative and quantitative methods to answer questions related to cybersecurity policies and cybercrime victimisation.

Osama Abu Oun is a research associate in the School of Computing at the University of Kent. His research interests include technical aspects of cybersecurity, virtualisation, fog/edge computing and internet of things.

Two PhD students are also working on topics related to ransomware: Jamie Pont and Calvin Brierley.


WannaCry report shows NHS chiefs knew of security danger, but management took no action

supimol kumying/Shutterstock

Eerke Boiten, De Montfort University and David S. Wall, University of Leeds

A report from the parliamentary National Audit Office into the WannaCry ransomware attack that brought down significant parts of Britain’s National Health Service in May 2017 has predictably been reported as blaming NHS trusts and smaller organisations within the care system for failing to ensure that appropriate computer security measures such as software updates and secure firewalls were in place.

But the central NHS IT organisation, NHS Digital, provided security alerts and the correct patches that would have protected vulnerable systems well before WannaCry hit. This was not a failure of cybersecurity in practice, but a failure of cybersecurity management at the top level.

Despite the extensive news coverage it received, WannaCry was a major wake-up call for the NHS rather than a downright disaster. It wasn’t a sophisticated attack. But any attack based on an actual zero-day exploit – a software flaw creating a security hole that is not yet known to the manufacturer or has not been made public, and so no defence or patch exists to prevent the attack succeeding – could hit the NHS much harder than WannaCry did.

Given the lessons learned discussed in the NAO report, hopefully the NHS will be better prepared next time. And as there will definitely be a next time, the NHS had better have learned its lessons, because the implications of not doing so could be much greater.

Failing to plan is planning to fail

As it happened, much of the damage caused by WannaCry – including many of the more than 19,000 missed appointments – did not relate directly to the attack. The NAO report makes it clear that the NHS as a whole lacked a proper response to a national cybersecurity incident. The business continuity plan had not been tested against such a serious attack. Although only a relatively small number of NHS organisations were actually infected by WannaCry, other parts of the NHS shut down their systems as a precaution to prevent WannaCry spreading until they were sure what to do. Email systems were switched off without first establishing alternatives, leading to improvisation by telephone and WhatsApp.

More broadly, it has become clear that decentralisation has left NHS cybersecurity very exposed when under attack. NHS Digital provides alerts and patches, of course, but there appears to be no mechanism for anyone to check, let alone enforce, that they are implemented. In any case, security alerts run a risk of being drowned in the stream of “cry wolf” messages from the cybersecurity industry. The NHS trust boards take little ownership of cybersecurity matters, and are not being held accountable because the Care Quality Commission, the NHS regulator, has not included it in their inspections.

The official reaction from NHS Digital to the report was brief – no wonder, as it emerges from the affair having performed what was expected of it. NHS Digital carried out on-site cybersecurity assessments at 88 NHS trusts in the years before the WannaCry incident, all of which failed. But without powers of enforcement, it was unable to press for the changes and preventative measures required to improve security. NHS Digital’s own review of the WannaCry incident (as mentioned in the NAO report) had established that most trusts did not even think that cybersecurity was a risk to patient outcomes – a naive and dangerous view in an organisation heavily dependent on integrated digital systems.

The decentralisation of the NHS means that no one is in charge of enforcing the cybersecurity practices that would have prevented WannaCry.

No one left holding the reins

The NAO report acknowledges that NHS trusts could not be blamed for some of the missing software updates. Some medical instruments such as MRI scanners are controlled by software written for old and unsupported versions of Windows, for example, or in some cases by companies that have since gone out of business. Decoupling these machines from the network would solve the most immediate cybersecurity problems, but at the expense of complicating their use and increasing the chance of human error. Neither the NAO nor NHS Digital appear to have a solution yet.

For small NHS organisations, such as individual GP practices, there is likely to be an issue of resources. Who will have the time, and at what point in their already full working day, to ensure computers are updated? Should the many NHS receptionists wait for their Windows updates to complete at the start of their day, or help their patients?

If the lack of resources doesn’t already point to government underfunding of the NHS, the report certainly points to failures at the national level, at NHS England and the Department of Health. Both were provided with cybersecurity recommendations by the National Data Guardian and the Care Quality Commission by July 2016, yet neither body responded until July 2017, months after WannaCry. The urgent need for effective, national-level cybersecurity incident planning in a system as decentralised as the NHS must be clear by now.

The NHS was spared the full impact of a cyber-attack this time, mainly because the technical solution – a “kill-switch” in the ransomware – was quickly discovered by MalwareTech researcher Marcus Hutchins. Next time the NHS might not be so lucky, but new research has been commissioned to this end: projects such as the EPSRC-funded EMPHASIS will look not only at the technical aspects of ransomware attacks, but also at their economic, psychological and social aspects, to obtain a more rounded understanding of ransomware.
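The kill-switch Hutchins found worked by checking whether a hard-coded domain name resolved: if the lookup succeeded, the malware stopped before encrypting anything, so registering that domain halted the outbreak for every infection that performed the check. A minimal sketch of that logic (the domain here is a placeholder, not WannaCry’s real kill-switch domain, and the resolver is injectable purely for illustration):

```python
import socket

def kill_switch_tripped(domain, resolve=socket.gethostbyname):
    """Return True if the kill-switch domain resolves.

    WannaCry-style logic: a successful DNS lookup means the
    kill-switch has been "tripped" and the malware aborts;
    a failed lookup means it would proceed to encrypt files.
    """
    try:
        resolve(domain)
        return True   # domain registered: abort, do not encrypt
    except OSError:   # socket.gaierror subclasses OSError
        return False  # lookup failed: malware would proceed

# Before the kill-switch domain was registered, every infected
# machine's lookup failed and encryption went ahead; once it was
# registered, the same check made new infections abort.
```

This also explains why the fix was fragile: machines behind proxies or without DNS access could not perform the lookup, so the check offered no protection there.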

Not only will this interdisciplinary approach increase our understanding of ransomware attacks, but it will also help us to quickly ascertain whether an attack is socially engineered – triggered by users opening attachments or clicking on infected websites – or spread through technological means such as a worm, as was the case with WannaCry and NotPetya – the latter seeking to disrupt and destructively wipe data without even attempting to extort money. It is also important to understand the new means of payment via cryptocurrencies such as bitcoin, because ransomware is usually a crime of extortion. With a better understanding of our attackers and their motivations we will be better placed to defend against them.

Eerke Boiten, Professor of Cyber Security, School of Computer Science and Informatics, De Montfort University and David S. Wall, Professor of Criminology, University of Leeds

This article was originally published on The Conversation. Read the original article.