Do No Harm With the NHS Digital Health Guidelines
“Do no harm.”
This is a phrase often attributed to the Hippocratic Oath – the ethical standards physicians swear to. While the text does not actually include this exact phrase, it is full of similar language that forces these medical professionals to constantly evaluate whether the care they provide could cause more harm than good.
In exploring new ways to deliver healthcare, organizations around the world are making the same evaluation, and when it comes to the use of medical data, the same concerns apply.
That’s why, following the implementation of the GDPR in mid-2018, England’s National Health Service (NHS) began consulting with industry experts, academics, regulators and patient representative organizations to develop a code of conduct for data-driven health and care technology.
The code of conduct, updated in July 2019, is meant to “create a trusted environment that supports innovation of data-driven technologies while being the safest in the world, appropriately responsive to progress in innovation, ethical, legal, transparent, accountable, evidence-based, and collaborative.”
Included in the code are 10 principles that any new data-driven NHS initiative is expected to adhere to:
- Understand users, their needs, and the context
- Define the outcome and how the technology will contribute to it
- Use data that is in line with appropriate guidelines for the purpose for which it is being used
- Be fair, transparent and accountable about what data is being used
- Make use of open standards
- Be transparent about the limitations of the data used and algorithms deployed
- Show what type of algorithm is being developed or deployed, the ethical examination of how the data is used, how its performance will be validated and how it will be integrated into health and care provision
- Generate evidence of effectiveness for the intended use and value for money
- Make security integral to the design
- Define the commercial strategy
All of these principles are important. Return on investment and commercial strategies can help with the sustainability of your program. Constant evaluation of your data usage and a focus on security will ensure those who entrust you with their data don’t have cause to regret it. However, three of these principles in particular stood out to us:
Understand the users, their needs, and the context
At Dimagi, our mantra is to “design under the mango tree.” Our field managers are taught these app design principles, which involve focusing on the benefits and experience of the application for those who will be using it. That means understanding their use cases, building for the available infrastructure, and always testing in the field to ensure the app actually delivers on the needs they have while on the job.
Check-ins with these users and an evaluation of their circumstances should happen at every stage of the process. You need their input to accurately identify your project objectives: What problems are affecting their community the most? The infrastructure of their catchment area will tell you whether to design your application with offline capabilities. Their experiences and responsibilities will inform each of the user stories around which you design the app. And of course, you need their feedback from testing to understand whether the user experience will allow them to actually use the app the way you intend.
Define the outcome and how technology will contribute to it
There are two discrete instructions in this principle: defining the outcome you hope to achieve and understanding the role of technology within your plan. Inherent in this principle is the idea that technology itself is not the objective or the outcome. It is a tool that will hopefully allow you to achieve your objective more efficiently, but it alone is not a solution. We have spoken about the process for defining your project objectives, and nowhere in that process do we mention mobile devices or application development.
Mobile tools are just one method of data collection and their strengths and weaknesses are considered as part of your overall data collection plan. But the core objective of your program, whether it’s decreasing maternal mortality rates or improving health outcomes for those with chronic diseases, doesn’t care whether you’re using a mobile phone or a pad of paper. It’s the technology that has to adapt to the objective – not the other way around.
Be transparent about the limitations of the data used and algorithms deployed
As we begin to explore the capabilities of artificial intelligence in the programs we work on, it has become clear that AI alone will never accomplish your goal. Just like a mobile app, AI is a tool that can help you achieve your objectives. However, it requires a solid foundation and culture of data-driven decision making. We call this the “data curve,” where the base of consistent data use and the support of simple analytics hold up the capabilities of AI.
In order to take advantage of the natural language processing, image classification, and knowledge representation that AI and machine learning can offer, you need reliable data collected consistently and accurately. None of this is possible without data, so establishing a strong mechanism for those data to be collected and ensuring that everyone in the organization is aware of them is paramount. It can’t just be the tech or M&E teams who know what data you collect, from whom, and how. Next, you must set up ways to understand what those data are telling you. This involves examining the data for correlations, outliers, and other indicators and creating pathways that turn that analysis into actionable insights for your team. Once your team is in the habit of receiving, trusting, and acting upon these data-driven insights, you can begin to explore how AI might help you perform deeper analysis more quickly to improve the outcomes of your program even further.
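To make the “simple analytics” step concrete, here is a minimal sketch of the kind of outlier check a team might run before trusting a dataset. The visit-duration numbers and the two-standard-deviation threshold are illustrative assumptions, not data or methods from any actual program.

```python
# Illustrative sketch: flag unusual values in a small, hypothetical dataset
# of home-visit durations (in minutes) before acting on the numbers.
from statistics import mean, stdev

visit_durations = [12, 14, 13, 15, 11, 45, 14, 13]  # hypothetical minutes per visit

mu = mean(visit_durations)      # average visit duration
sigma = stdev(visit_durations)  # sample standard deviation

# A common rule of thumb: values more than 2 standard deviations from the
# mean deserve a closer look before the data feeds any deeper analysis.
outliers = [d for d in visit_durations if abs(d - mu) > 2 * sigma]
print(outliers)  # the 45-minute visit stands out
```

Even a check this simple builds the habit the paragraph above describes: the team reviews what the data say (and where they look suspect) before layering on more sophisticated analysis.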
Cultivating this trust in data and building a culture of data-driven decision making relies on clarity around the limitations of its insights. Knowing what they can expect will improve your team’s receptiveness to this information and transparency in this area is mandatory. The possibilities that AI can offer are impressive, but the last thing you want to do is overpromise and underdeliver.
Reviewing the principles
These are three of the ten principles in the NHS code of conduct for data-driven health and care technology that stood out to us. The opportunities that technology presents in these sectors are exciting, but it’s vital to also understand the potential consequences for the privacy, security, and respect their beneficiaries deserve.
To learn more about the advice and implications of the other seven principles, read the guidelines here.