Do No Harm With the NHS Digital Health Guidelines

“Do no harm.”

This is a phrase often attributed to the Hippocratic Oath – the ethical standards physicians swear to. While the text does not actually include this exact phrase, it is full of similar language that forces these medical professionals to constantly evaluate whether the care they provide could cause more harm than good.

As organizations around the world explore new ways to deliver healthcare, they are making the same evaluation, and when it comes to the use of medical data, the same concerns apply.

That’s why, following the implementation of the GDPR in mid-2018, England’s National Health Service (NHS) began consulting with industry experts, academics, regulators and patient representative organizations to develop a code of conduct for data-driven health and care technology.

The code of conduct, updated in July 2019, is meant to “create a trusted environment that supports innovation of data-driven technologies while being the safest in the world, appropriately responsive to progress in innovation, ethical, legal, transparent, accountable, evidence-based, and collaborative.”

Included in the code are 10 principles that any new data-driven NHS initiative is expected to adhere to:

  1. Understand users, their needs, and the context
  2. Define the outcome and how the technology will contribute to it
  3. Use data that is in line with appropriate guidelines for the purpose for which it is being used
  4. Be fair, transparent and accountable about what data is being used
  5. Make use of open standards
  6. Be transparent about the limitations of the data used and algorithms deployed
  7. Show what type of algorithm is being developed or deployed, the ethical examination of how the data is used, how its performance will be validated and how it will be integrated into health and care provision
  8. Generate evidence of effectiveness for the intended use and value for money
  9. Make security integral to the design
  10. Define the commercial strategy

All of these principles are important. Return on investment and commercial strategies can help with the sustainability of your program. Constant evaluation of your data usage and a focus on security will ensure those who entrust you with their data don’t have cause to regret it. However, three of these principles in particular stood out to us:

Understand the users, their needs, and the context

At Dimagi, our mantra is to “design under the mango tree.” Our field managers are taught app design principles that focus on the benefits and experience of the application for those who will be using it. That means understanding their use cases, building for the available infrastructure, and always testing in the field to ensure the app actually delivers on the needs users have while on the job.

Check-ins with these users and an evaluation of their circumstances should happen at every stage of the process. You need their input to accurately identify your project objectives: What problems are affecting their community the most? The infrastructure of their catchment area will tell you whether to design your application with offline capabilities. Their experiences and responsibilities will inform each of the user stories around which you design the app. And of course, you need their feedback from testing to understand whether the user experience will allow them to actually use the app the way you intend.

Define the outcome and how technology will contribute to it

There are two discrete instructions in this principle: defining the outcome you hope to achieve and understanding the role of technology within your plan. Inherent in this principle is the idea that technology itself is not the objective or the outcome. It is a tool that will hopefully allow you to achieve your objective more efficiently, but it alone is not a solution. We have spoken about the process for defining your project objectives, and nowhere in there do we speak about mobile devices or application development.

Mobile tools are just one method of data collection, and their strengths and weaknesses should be considered as part of your overall data collection plan. But the core objective of your program, whether it’s decreasing maternal mortality rates or improving health outcomes for those with chronic diseases, doesn’t care whether you’re using a mobile phone or a pad of paper. It’s the technology that has to adapt to the objective – not the other way around.

Be transparent about the limitations of the data used and algorithms deployed

As we begin to explore the capabilities of artificial intelligence in the programs we work on, it has become clear that AI alone will never accomplish your goal. Just like a mobile app, AI is a tool that can help you achieve your objectives. However, it requires a solid foundation and culture of data-driven decision making. We call this the “data curve,” where a base of consistent data use and the support of simple analytics hold up the capabilities of AI.

In order to take advantage of the natural language processing, image classification, and knowledge representation that AI and machine learning can offer, you need reliable data collected consistently and accurately. None of this is possible without data, so establishing a strong mechanism for collecting those data, and making sure everyone in the organization is aware of it, is paramount. It can’t just be the tech or M&E teams who know what data you collect, from whom, and how. Next, you must set up ways to understand what those data are telling you. This involves examining the data for correlations, outliers, and other indicators, and creating pathways that turn that analysis into actionable insights for your team. Once your team is in the habit of receiving, trusting, and acting upon these data-driven insights, you can begin to explore how AI might help you perform deeper analysis more quickly to improve the outcomes of your program even further.
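To make this concrete, here is a minimal sketch of what the “simple analytics” step might look like, written in Python with pandas. The file name (visits.csv) and column names (worker_id, visit_duration_min, forms_submitted) are hypothetical placeholders for whatever routine data your program actually collects; this illustrates checking correlations and flagging outliers, not a prescribed pipeline.

```python
# A minimal sketch of the "simple analytics" step described above.
# The CSV path and column names are hypothetical, not taken from any
# real CommCare or NHS dataset.
import pandas as pd

# Load routine program data exported from your mobile data collection tool.
df = pd.read_csv("visits.csv")

# 1. Look for correlations between numeric indicators,
#    e.g. does longer visit duration track with more forms submitted?
print(df[["visit_duration_min", "forms_submitted"]].corr())

# 2. Flag outliers: workers whose average visit duration is more than
#    two standard deviations from the program-wide mean.
per_worker = df.groupby("worker_id")["visit_duration_min"].mean()
mean, std = per_worker.mean(), per_worker.std()
outliers = per_worker[(per_worker - mean).abs() > 2 * std]
print("Workers to follow up with:", outliers.to_dict())
```

Even a simple check like this can feed the “actionable insights” habit: the output is something a program manager can read, question, and act on long before any machine learning is involved.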

Cultivating this trust in data and building a culture of data-driven decision making rely on clarity around the limitations of those insights. Knowing what to expect will improve your team’s receptiveness to this information, and transparency in this area is mandatory. The possibilities that AI can offer are impressive, but the last thing you want to do is overpromise and underdeliver.

Reviewing the principles

These are just three of the ten principles that stood out to us from the code of conduct for data-driven health and care technology developed by the NHS. Understanding the opportunities presented by the future of technology in these sectors is exciting, but it’s vital to also understand their potential consequences when it comes to the privacy, security, and respect their beneficiaries deserve.

To learn more about the advice and implications of the other seven principles, read the guidelines here.
