By David Soto Dalmau (ERNI Spain)
We are all aware of the importance of data in the business context. However, when it comes to data in MedTech, the security requirements and regulations go much further.
Despite the complexity of these regulations, data and software specialists play a crucial role in providing all parties with solutions for better treatment, diagnosis and patient care. Therefore, the question remains: how can we safely connect hospitals and laboratories to make the most of the available data?
Developed countries have the advantage of having this kind of connectivity in place – with data travelling from laboratory to the cloud and from the cloud to the hospital and all the way back – and the benefits are amazing: instant results, global information availability between healthcare systems, and at the centre of everything, the patient.
Nevertheless, because of the nature of the information processed by the hospitals and laboratories that make up the health system, a security breach affecting the confidentiality or integrity of the data can cause severe harm to patients, making this an extremely critical scenario.
With data protection regulations around the world imposing high fines for data leakages on the one hand, and the patient-safety risk of exposed health information on the other, many countries and hospitals approach data connectivity with trepidation and hesitate to connect the two systems. However, technological solutions now exist to make it both possible and safe.
This is where secure software design comes into play. Hardware isolation and physical locks are no longer dependable enough for these scenarios. The responsibility shifts to the software, and the software has all the necessary tools to make it happen.
If we embrace this make-it-happen mindset, we can think of a viable technical solution putting cybersecurity as a central pillar of the development.
Let’s have a look at some of the challenges and the complications that a software solution has to overcome in order to make it feasible:
Infrastructure security
Once the lab or the hospital is connected to the internet, it is exposed to attacks. It is mandatory to protect the internal infrastructure with minimum exposure and the best hardening for the connections.
That means that only the communication service should be present on the internet, and the network access should be secured through hardening best practices, secure virtual private networks and demilitarised zones for the exposed sides. Strong firewall restrictions are necessary for achieving good results, and a strict “Zero trust” policy must be followed.
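As an illustration of the "Zero trust" idea at the level of the single exposed communication service, the sketch below builds a hardened TLS context with Python's standard `ssl` module: legacy protocol versions are refused and every connecting gateway must present a client certificate. The function name and the certificate-file parameters are hypothetical placeholders, not part of any specific product.

```python
import ssl

def build_exposed_service_context(cert_file=None, key_file=None,
                                  client_ca_file=None):
    """Hardened TLS context for the single internet-facing service.

    Illustrative sketch: the file paths are placeholders for certificates
    issued by the organisation's own PKI.
    """
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # Refuse legacy protocol versions outright.
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    # Zero trust: every client (hospital or lab gateway) must present a
    # certificate signed by our CA -- anonymous connections are rejected.
    context.verify_mode = ssl.CERT_REQUIRED
    if cert_file and key_file:
        context.load_cert_chain(cert_file, key_file)
    if client_ca_file:
        context.load_verify_locations(cafile=client_ca_file)
    return context
```

Mutual TLS of this kind pairs naturally with the firewall and DMZ restrictions above: even a connection that reaches the exposed service is rejected unless it can prove its identity.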
Software protection
All the software involved in the process must follow the “Secure by design” principle and pass all necessary checks to ensure there are no vulnerabilities that can allow an attacker to gain unauthorised access to the data or the hosting systems.
User interaction has to be restricted with authentication systems, strong password policies and two-factor authentication (2FA) in order to verify each user's identity, and privileged access must be managed separately.
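To make the 2FA point concrete, here is a minimal standard-library implementation of a time-based one-time password (TOTP), the mechanism behind most authenticator apps, following RFC 6238. This is a sketch of the algorithm, not a recommendation to hand-roll authentication in production.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: derive a short-lived one-time code from a shared secret."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step          # 30-second time window
    key = base64.b32decode(secret_b32, casefold=True)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F               # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code is derived from both a shared secret and the current time window, a stolen password alone is not enough to log in.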
A secure software development life cycle must be maintained throughout the development.
Vulnerability assessments and checks should be done often during the development period to address and fix unexpected issues in the early stages.
Runtime Application Self-Protection (RASP) systems would also be a good asset to be included in order to prevent unexpected threats and behaviours.
Data security in travel
To protect patients' medical data, ensuring the confidentiality of the data while it travels between laboratory and hospital is a must. To do so, strong end-to-end encryption is needed, and a secure communication channel must be guaranteed.
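On the client side of that channel, the transport layer is the first line of defence. The sketch below (function name hypothetical) shows how a laboratory gateway pushing results out might configure Python's `ssl` module so that server certificates and hostnames are always validated and old protocol versions are refused; application-level end-to-end encryption of the payload would sit on top of this.

```python
import ssl

def build_upload_context():
    """TLS context for outbound result uploads (illustrative sketch).

    ssl.create_default_context() already enables certificate validation
    and hostname checking; we additionally pin the minimum TLS version.
    """
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    return context
```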
Non-repudiation principle
In order to ensure the identity of the laboratory or hospital, all data communication should be digitally signed.
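The principle is that a message is signed with the sender's private key and verified by anyone holding the matching public key, so the sender cannot later deny having sent it. The toy example below uses deliberately tiny textbook RSA numbers purely to illustrate that asymmetry; real systems use a vetted cryptographic library and 2048-bit or larger keys.

```python
import hashlib

# Deliberately tiny textbook RSA -- for illustrating the principle ONLY.
# Never use parameters like these in practice.
P, Q = 61, 53                        # toy primes
N = P * Q                            # public modulus
E = 17                               # public exponent
D = pow(E, -1, (P - 1) * (Q - 1))    # private exponent (kept secret)

def sign(message):
    """Sign a hash of the message with the sender's *private* key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(h, D, N)

def verify(message, signature):
    """Anyone holding the *public* key (E, N) can check authorship."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == h
```

Because only the laboratory holds the private exponent, a valid signature ties the result to that laboratory, which is exactly the non-repudiation property required here.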
Data security at rest
It is often said that nobody can guarantee 100% protection against unauthorised access, so ensuring the data is unreadable once stored is critical. Encryption at rest is needed to protect data confidentiality on both sides, again using strong encryption and storing the encryption keys in a secure vault.
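A common pattern for this is to encrypt each record with its own data-encryption key and keep that key only in the vault, so a stolen database dump is unreadable on its own. The sketch below mocks the vault with an in-memory class and uses a toy HMAC-based stream cipher to keep the example standard-library-only; a production system would use an authenticated cipher such as AES-GCM and a real secrets manager or HSM.

```python
import hashlib
import hmac
import os

def _keystream_xor(key, nonce, data):
    """Toy stream cipher (HMAC-SHA256 in counter mode) -- illustration only.
    Production code should use an authenticated cipher such as AES-GCM."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

class InMemoryVault:
    """Stand-in for a real secrets manager or hardware security module."""
    def __init__(self):
        self._keys = {}
    def store(self, key_id, key):
        self._keys[key_id] = key
    def fetch(self, key_id):
        return self._keys[key_id]

def encrypt_record(vault, key_id, plaintext):
    dek = os.urandom(32)          # fresh data-encryption key per record
    nonce = os.urandom(16)
    vault.store(key_id, dek)      # only the vault ever holds the key
    return nonce + _keystream_xor(dek, nonce, plaintext)

def decrypt_record(vault, key_id, blob):
    dek = vault.fetch(key_id)
    nonce, ciphertext = blob[:16], blob[16:]
    return _keystream_xor(dek, nonce, ciphertext)
```

With this separation, revoking or rotating a key in the vault is enough to cut off access to the stored records, without touching the data store itself.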
As a side note, after discussing confidentiality, integrity and non-repudiation, there is another principle to consider: the availability of the data. A ransomware attack or a service disruption that leaves the hospital or laboratory unable to read results can lead to a catastrophic situation. Therefore, the usual protective measures must be in place on all workstations and servers included in the workflows.
In conclusion, I want to emphasise that connecting hospitals and laboratories in a safe way is a reality, and we at ERNI have been specialising in it for several years. The first steps towards such complex ecosystems are a change of mindset and finding the right people with the technical capabilities. We know for a fact that connecting labs and hospitals has a positive impact on the quality of patient care, and we expect to see this trend at the global level in the near future.