A new code of conduct for artificial intelligence and other data-driven technologies has been launched to ensure systems are safe for the NHS.
The code has been drawn up with support from industry, academics and patient groups to provide a gold standard set of principles to help tackle areas such as dementia, obesity and cancer.
It is hoped the code will help provide evidence of what good looks like and ensure products are suitable for the future.
AI technology is already being used across the NHS to improve the early diagnosis of heart disease and lung cancer, reduce the number of unnecessary operations caused by false positives, assist research by better matching patients to clinical trials, and support the planning of care for patients with complex needs. Examples include:
- Moorfields/DeepMind – 1 million anonymised eye scans were shared with DeepMind under a research agreement that began in mid-2016. DeepMind’s algorithm is designed to find early signs of age-related macular degeneration and diabetic retinopathy.
- John Radcliffe Hospital – worked with its partner Ultromics to use AI to improve the detection of heart disease and lung cancer
- Imperial College London – developed a new AI system that can predict the survival rates for patients with ovarian cancer
The code is made up of 10 principles:
1. Understand users, their needs and the context
Understand specifically who the innovation or technology will be for, what problems it will solve for them and what benefits they can expect. Research the nature of their needs, how they are currently meeting those needs and what assets they already have to solve their own problems. Consider the clinical, practical and emotional factors that might affect uptake, adoption and ongoing use.
2. Define the outcome and how the technology will contribute to it
Understand how the innovation or technology will result in better provision and/or outcomes for people and the health and care system. Define a clear value proposition with a business case highlighting outputs, outcomes, benefits and performance indicators.
3. Use data that is in line with appropriate guidelines for the purpose for which it is being used
State which good practice guideline or regulation has been adhered to in the appropriate use of data, such as the Data Protection Act 2018. Use the minimum personal data necessary to achieve the desired outcomes of the user’s needs and the context.
4. Be fair, transparent and accountable about what data is being used
Utilise data protection-by-design principles with data-sharing agreements, data flow maps and data protection impact assessments. Ensure all aspects of the Data Protection Act 2018 have been considered.
5. Make use of open standards
Build current data and interoperability standards into the product or innovation so that it can communicate easily with existing national systems. Programmatically build data quality evaluation into AI development so that harm does not occur if poor data quality creeps in; a minimal sketch of one such check appears after the list of principles.
6. Be transparent about the limitations of the data used and algorithms deployed
Understand the quality of the data and consider its limitations when assessing whether it is appropriate for the users’ needs and the context. When building an algorithm, be clear about its strengths and limitations, and give clear evidence of whether the algorithm you have published is the one that was used in training or in deployment (a sketch of one possible provenance check also follows the list of principles).
7. Show what type of algorithm is being developed or deployed, the ethical examination of how the data is used, how its performance will be validated and how it will be integrated into health and care provision
Demonstrate the learning methodology of the algorithm being built. Aim to show in a clear and transparent way how outcomes are validated.
8. Generate evidence of effectiveness for the intended use and value for money
Generate clear evidence of the effectiveness and economic impact of a product or innovation. The type of evidence should be proportionate to the risk of the technology and its budget impact. An evidence-generation plan should be developed using the evidence standards framework published by NICE.
9. Make security integral to the design
Keep systems safe by safeguarding data and integrating appropriate levels of security into the design of devices, applications and systems, keeping in mind relevant standards and guidance.
10. Define the commercial strategy
Purchasing strategies should take account of commercial and technology aspects and contractual limitations. Consider only entering into commercial terms in which the benefits of partnerships between technology companies and health and care providers are shared fairly.
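To make principle 5 concrete, the following is a minimal sketch of how data quality evaluation might be built programmatically into an AI pipeline. It is illustrative only: the column names, plausible value ranges and missingness threshold are assumptions made for the example, not requirements of the code of conduct.

```python
# Minimal sketch (not part of the code of conduct): one way to build data quality
# evaluation into an AI pipeline, as principle 5 suggests. Column names, ranges and
# thresholds below are illustrative assumptions, not NHS-defined standards.
import pandas as pd

# Illustrative expectations for an incoming dataset of patient observations.
EXPECTED_COLUMNS = {"patient_age": "int64", "systolic_bp": "float64", "scan_date": "object"}
VALUE_RANGES = {"patient_age": (0, 120), "systolic_bp": (50.0, 260.0)}
MAX_MISSING_FRACTION = 0.05  # refuse to train or score if >5% of any column is missing


def check_data_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality problems; an empty list means the batch passes."""
    problems = []

    # Schema check: every expected column must be present with the expected dtype.
    for col, dtype in EXPECTED_COLUMNS.items():
        if col not in df.columns:
            problems.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            problems.append(f"{col}: expected dtype {dtype}, got {df[col].dtype}")

    # Completeness check: too many missing values signals a broken upstream feed.
    for col in df.columns:
        missing = df[col].isna().mean()
        if missing > MAX_MISSING_FRACTION:
            problems.append(f"{col}: {missing:.1%} missing values")

    # Plausibility check: values outside clinically plausible ranges.
    for col, (lo, hi) in VALUE_RANGES.items():
        if col in df.columns:
            bad = ((df[col] < lo) | (df[col] > hi)).sum()
            if bad:
                problems.append(f"{col}: {bad} values outside [{lo}, {hi}]")

    return problems


if __name__ == "__main__":
    batch = pd.read_csv("observations.csv")  # hypothetical input file
    issues = check_data_quality(batch)
    if issues:
        # Fail loudly before poor-quality data reaches training or deployment.
        raise ValueError("Data quality checks failed:\n" + "\n".join(issues))
```

A check like this would typically run before every training run and before each batch of data is scored in deployment, so that degraded upstream data fails loudly rather than silently degrading the model.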
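Similarly, for principle 6, one possible way to evidence that the published algorithm is the one used in training and deployment is to publish a cryptographic fingerprint of the trained model artefact and verify it at deployment time. The sketch below is an assumption about how this might be done, not a mandated method; the file names and manifest format are hypothetical.

```python
# Minimal sketch for principle 6 (illustrative, not a required method): record a
# cryptographic fingerprint of the trained model artefact so reviewers can verify
# that the published and deployed algorithms are the same file.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hash the model artefact in chunks so large files do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def write_manifest(model_path: Path, manifest_path: Path) -> None:
    """Publish the fingerprint alongside the model at training time."""
    manifest = {
        "model_file": model_path.name,
        "sha256": sha256_of(model_path),
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))


def verify_deployment(model_path: Path, manifest_path: Path) -> bool:
    """At deployment, confirm the shipped model matches the published fingerprint."""
    manifest = json.loads(manifest_path.read_text())
    return sha256_of(model_path) == manifest["sha256"]


if __name__ == "__main__":
    model = Path("retinal_model.onnx")          # hypothetical trained artefact
    manifest = Path("retinal_model.manifest.json")
    write_manifest(model, manifest)             # done once, at the end of training
    assert verify_deployment(model, manifest)   # repeated in the deployment pipeline
```

Recording the fingerprint at training time and re-checking it in the deployment pipeline gives reviewers a simple, auditable link between the algorithm that was evaluated and the one that is actually running.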