Disruptive innovation and health inequalities

Introduction

Since the Covid-19 pandemic, the adoption of AI, digital and technology solutions within healthcare has accelerated at an impressive pace. However, evidence shows that when these disruptive innovations are commissioned, the risk of worsening health inequalities is not always considered. The following report details research undertaken in 2022 and outlines the implicit bias, under-representation and inequalities that can be built into innovations designed using datasets or medical tests that historically favour men and white ethnicities. The research also covers digital exclusion within healthcare and suggests ways to reduce the impact on digitally excluded patients. The aim is for this research to inform the design, development and commissioning of disruptive innovations so that diverse groups of patients and service users do not face unequal access, treatment and experience.

Medical Devices

Disparities across racial groups are present for a range of health outcomes. Studies have found limited evidence that genes drive racial and ethnic health disparities; rather, racial disparities in health are better explained by structural racism than by genetic differences[1]. This is relevant to ensuring fairness and equity in the use of medical devices, precision medicine and other disruptive innovations, and to recognising any implications of bias. One form is interpretation bias, where clinicians apply unequal, race-based standards to their interpretation of medical tests. It has been suggested that any published work on new medical devices should include a ‘fairness statement’ outlining how the device performs across different populations[2].

The Secretary of State for Health and Social Care, Sajid Javid, has ordered an independent review into whether systematic race and gender bias exists within medical devices, and how to address it; the findings and recommendations from the independent panel’s call for evidence are awaited[3]. A rapid review published in April 2021 by the NHS Race and Health Observatory on the accuracy of pulse oximeters for people from black, Asian and minority ethnic backgrounds suggested there are issues with poor pulse oximeter readings in people with dark skin, meaning clinical decisions could be based on readings that may not be accurate. One of the recommendations from the review is that clinicians using pulse oximeters must take skin pigmentation into account when considering their effectiveness in clinical decision-making[4]. As with the implementation of all innovation, care must be personalised and medical devices like pulse oximeters must be appropriate for the individual.

Another type of medical device that could create racial disparities is the wearable device (e.g., the smart watch), which monitors a range of vital signs including:

 ·       Heart rate

·       Continuous respiratory rate

·       Oxygen saturation

·       Core temperature

·       Absolute blood pressure

·       Electrocardiogram (ECG) monitoring

·       Sleep tracking

Many of these wearables also use photoplethysmographic (PPG) green light signalling. PPG is a non-invasive technology that uses a green light source and a photodetector at the surface of the skin to measure the volumetric variations of blood circulation, and is used for heart rate estimation and pulse oximetry readings. Green light PPG sensors are commonly used in consumer devices as they are cheaper and less sensitive to motion errors than the infrared lights used in hospital-grade trackers.
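To make the mechanism concrete, the sketch below shows the basic principle behind optical heart rate estimation: sample the photodetector output, detect one peak per cardiac cycle, and convert the interval between peaks into beats per minute. It is illustrative only, using a synthetic signal and an assumed 50 Hz sampling rate; a weaker signal-to-noise ratio, as reported for green light sensors on darker skin, makes the peak detection step less reliable.

```python
# Minimal sketch of PPG-based heart rate estimation (synthetic signal;
# real devices add filtering and motion-artefact rejection).
import numpy as np
from scipy.signal import find_peaks

fs = 50                                    # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)               # 30 seconds of samples
true_bpm = 72
# Synthetic PPG: a pulse wave at 72 beats/min plus sensor noise. A lower
# signal amplitude (e.g., more green light absorbed by melanin) shrinks
# the peaks relative to the noise, making detection less reliable.
ppg = np.sin(2 * np.pi * (true_bpm / 60) * t) + 0.2 * np.random.randn(t.size)

# Detect one peak per cardiac cycle; the minimum spacing rules out any
# "heart rate" above 200 bpm so noise cannot double-count beats.
peaks, _ = find_peaks(ppg, distance=fs * 60 / 200)
intervals = np.diff(peaks) / fs            # seconds between beats
print(f"Estimated heart rate: {60 / intervals.mean():.1f} bpm")
```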

There are concerns that wearables using PPG technology may only be accurate for people with lighter skin tones, as there is some evidence that these devices are less accurate for people with darker skin tones, likely because darker skin contains more melanin, which absorbs more green light than lighter skin. In one study, inaccurate PPG heart rate measurement occurred 15% more often in dark skin than in light skin. As with most innovations, there is a lack of diversity in validation studies.

These studies are often undertaken with individuals who lack a diversity of skin tones, and objective measurement of skin pigmentation is rarely reported. Inaccurate signal detection, and a lack of validation across diverse skin tones to identify how wearables perform, may reinforce existing healthcare disparities for those with darker skin tones. If these wearables are used to support clinical decisions, they may provide false assurances about the effectiveness of monitoring a patient. If such wearables receive FDA or CE regulatory approval for use in clinical practice, they could worsen health inequalities for people from ethnic minority groups with darker skin tones[5].

Disruptive Innovation

Medical Tests

Some disruptive innovations, particularly digital solutions, incorporate medical tests into their design, and ethnic inequalities have been identified within some clinically validated medical tests and clinical criteria. For example, until August 2021 NICE guidance on kidney tests recommended applying an ethnicity correction factor to estimated glomerular filtration rate (eGFR) equations for people of African-Caribbean or African family origin. NICE removed this inappropriate adjustment from the guidance because it led to under-diagnosis of chronic kidney disease and underestimation of disease severity amongst Black people in the UK, reducing their access to care[6][7]. One study found that removing this ethnicity adjustment significantly reduced bias and improved the accuracy of the eGFR equation by 30%, and subsequently the diagnosis of chronic kidney disease[8]. The healthcare system needs to ensure that when medical tests with inbuilt racial disparities are digitalised, these biases are not carried forward or allowed to worsen health inequalities.
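To illustrate how such a correction factor operates, the sketch below uses the published four-variable MDRD study equation (the NICE guidance referenced the CKD-EPI equation, but a race multiplier works the same way in principle); the patient values are hypothetical.

```python
# Illustrative eGFR calculation using the four-variable MDRD study equation,
# showing how an ethnicity multiplier inflates the estimate. Hypothetical
# patient values; not clinical advice.
def egfr_mdrd(scr_mg_dl: float, age: int, female: bool,
              apply_ethnicity_factor: bool) -> float:
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if apply_ethnicity_factor:
        egfr *= 1.212  # the kind of adjustment NICE removed in August 2021
    return egfr

# Same blood result, same patient: the multiplier raises the estimate by
# about 21%, which here lifts the result above the 60 mL/min/1.73m2
# threshold and could mask stage 3 chronic kidney disease.
print(f"With factor:    {egfr_mdrd(1.3, 60, False, True):.0f} mL/min/1.73m2")
print(f"Without factor: {egfr_mdrd(1.3, 60, False, False):.0f} mL/min/1.73m2")
```

The two outputs (roughly 68 versus 56 mL/min/1.73m2 for the same blood result) show why removing the factor changes diagnosis: the unadjusted estimate falls below a chronic kidney disease threshold while the adjusted one does not.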

Artificial Intelligence

Artificial Intelligence (AI) solutions in the provision of healthcare have expanded significantly in the past five years. AI is the ability of a computer programme or a machine to mimic the problem-solving and decision-making capabilities of the human mind. Put simply, as a field of technology it combines computer science and robust datasets to enable problem-solving.

AI works by using datasets to construct an algorithm, written in a programming language and used to train the machine being developed. The datasets could come from scientific research and electronic health records, and could include patient characteristics, symptoms of specific diseases, diagnostic criteria, medication doses, x-rays, scans, or other diagnostic information. The algorithms become the AI’s background reference through which it can recognise and interpret incoming information.

Fundamentally, AI systems make decisions based on the data they have been trained on. If that data, or the system it is embedded in, is not representative across protected characteristics such as race and gender, it risks perpetuating or even cementing new forms of bias in society. One review of the challenges and opportunities in AI-based image classification methods for diagnosing skin cancer argued that most images in current skin lesion datasets come from lighter-skinned individuals rather than those with darker skin. To reduce social or ethnic bias in deep learning models built on skin lesion datasets, any AI solution should ensure racial diversity[9]. Failure to do so could result in AI solutions struggling to detect cancerous moles in darker skin. Therefore, the ongoing training of these algorithms needs to ensure a significant proportion of the patients’ data is from ethnic minority backgrounds.

The UK Biobank is a large-scale biomedical database that provides accredited researchers with access to medical and genetic data from 500,000 volunteer participants in the UK, gathered between 2006 and 2010. A study concluded that the UK Biobank is not representative of the sampling population; there is evidence of a “healthy volunteer” selection bias[10]. It has been argued that basing AI health technologies on unrepresentative datasets like the Biobank creates ‘health data poverty’.

For example, individuals whose data make up the UK Biobank have been shown to be much healthier than the general population, with under-representation based upon socioeconomic status and ethnicity. Researchers argue that even if data-driven interventions are safe and effective for some groups, there is a risk they could be ineffective or even unsafe for others, as the data used to develop and validate the interventions can be highly sensitive to fundamental characteristics including age, sex, gender, ethnicity, and environment. One such instance occurred with a deep learning algorithm for predicting acute kidney injury: only 6.4% of the training dataset came from female patients, and the resulting AI underperformed for women[11].
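The sketch below (entirely synthetic data and hypothetical features, not the cited study’s method) reproduces the pattern in miniature: a model trained on a dataset that is 93.6% one group and 6.4% another performs noticeably worse for the under-represented group when each group is evaluated separately.

```python
# Synthetic demonstration of under-representation bias: the minority group's
# feature-outcome relationship differs from the majority's, and the model,
# trained mostly on majority data, underperforms for the minority group.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n: int, shift: float):
    """Hypothetical patients: outcome depends on two features, with the
    relationship differing between groups (the 'shift'), as physiology can."""
    X = rng.normal(0, 1, (n, 2))
    y = ((X[:, 0] + shift * X[:, 1]) > 0).astype(int)
    return X, y

# Training mix mirrors the 6.4% imbalance described above.
Xa, ya = make_group(9360, shift=0.2)   # majority group
Xb, yb = make_group(640, shift=1.5)    # minority group
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Evaluate each group separately on fresh held-out data.
for name, shift in [("majority", 0.2), ("minority", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name} accuracy: {acc:.2f}")
```

Overall accuracy on a mixed test set would look acceptable here because the majority group dominates, which is why per-group evaluation matters.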

In October 2022 the Medicines and Healthcare products Regulatory Agency (MHRA) launched the Software and AI as a Medical Device Change Programme, a programme of work to ensure regulatory requirements for software and AI are clear and patients are protected[12]. However, regulators for data-driven technologies in England have no direct oversight of the quality of the data used to train algorithms, and it appears there is no regulator responsible for preventing bias in AI tools[13]. The NHS Artificial Intelligence Laboratory (AI Lab) was created to address this challenge by bringing together government, health and care providers, academics, and technology companies. The AI Lab’s role is to support the regulation of the AI ecosystem and to develop mechanisms that ensure approved AI technology can be used safely and ethically at scale. One of the lab’s objectives is providing a regulatory advice and approval service, the Multi Agency Advice Service (MASS). This collaboration between NICE, the MHRA, the CQC and the Health Research Authority (HRA) has created a single platform for advice and guidance, streamlining the regulatory and health technology assessment pathway; it was launched in August 2022 and is currently being tested. In 2022 the NICE Evidence Standards Framework for digital health technologies was updated to include evidence requirements for AI and data-driven technologies with adaptive algorithms[14].

The AI Lab’s ‘AI Ethics Initiative’ is a programme to ensure that AI solutions used in NHS and care settings do not exacerbate health inequalities. There are some actions to counter the risk of algorithmic bias[15], including:

  •  Recognising that inequities within the data generate algorithmic bias. Datasets should represent both the groups that access healthcare, receive fair treatment and achieve the best survival rates, and the protected groups that face structural inequalities, barriers to accessing healthcare and poorer clinical outcomes. If the datasets are not representative, there is a risk the algorithms will reinforce the structural inequalities in healthcare systems. Additionally, if training datasets are built from historical outcome data, they will reflect and perpetuate any bias in the real world[16].

  • The teams developing algorithms should understand the specificities of the health system context for which they are developing, so that the diversity of needs of different groups is considered. Teams should also be trained to recognise implicit bias within algorithm development and its influence on decision-making.

  • Where possible, data science teams should be as diverse as the populations that the AI algorithms they develop will affect. Individuals from diverse backgrounds may be more familiar with the challenges faced by those who are under-represented in datasets. Research into ‘biased programmers’ indicated that unrepresentative programmers may exhibit biases (consciously or otherwise) that are passed on to the algorithms they write[17]. However, it is worth recognising that diverse programmers alone will not eliminate implicit bias.

  • It is worth noting that the government has produced ‘Guidelines for AI procurement’ (2020) with 10 considerations, including ‘make decisions in a diverse multidisciplinary team’ to address the need for diversity to mitigate bias in the AI system. The guidelines also detail that there should be oversight mechanisms and governance to allow scrutiny of AI systems and maximise transparency in AI decision-making. Transparency, and being able to explain how an algorithm works, will help clinicians and commissioners to know which information the AI solution uses to make a decision, and to identify whether inequities in the existing healthcare system need addressing to reduce bias. NHS organisations could build in governance for reviewing training and testing algorithms on a regular basis; a minimal sketch of such a review follows this list.
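As a minimal sketch of such routine review (a hypothetical helper with made-up inputs, not an NHS tool), the snippet below recomputes a model’s accuracy per recorded protected group and flags any group falling markedly behind the best-performing one:

```python
# Hypothetical subgroup audit: on each review cycle, break model performance
# down by protected group and flag groups falling behind the best performer.
from collections import defaultdict

def subgroup_accuracy(y_true, y_pred, groups):
    """Accuracy per subgroup, e.g. per ethnicity or sex recorded
    alongside each prediction."""
    hits, totals = defaultdict(int), defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        totals[g] += 1
        hits[g] += int(t == p)
    return {g: hits[g] / totals[g] for g in totals}

def flag_disparities(scores, tolerance=0.05):
    """Groups whose score trails the best group by more than the tolerance."""
    best = max(scores.values())
    return [g for g, s in scores.items() if best - s > tolerance]

# Made-up review-cycle data: true labels, model outputs, recorded group.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 0, 0, 0, 1, 1, 0, 1, 1]
groups = ["A", "A", "A", "B", "A", "A", "B", "A", "B", "A"]

scores = subgroup_accuracy(y_true, y_pred, groups)
print(scores)                    # {'A': 0.857..., 'B': 0.333...}
print(flag_disparities(scores))  # ['B'] - group needing investigation
```

In practice an audit would cover richer metrics (sensitivity, specificity, calibration) and run on every review cycle, but even this simple per-group breakdown surfaces disparities that a single headline accuracy figure hides.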

The Ada Lovelace Institute has developed a user guide[18] (February 2022) for project teams seeking access to imaging data from the National Medical Imaging Platform (NMIP) on how to undertake an algorithmic impact assessment. This guide is part of a wider research partnership with the NHS AI Lab exploring algorithmic impact assessments (AIAs) in healthcare.

Precision Medicine

Precision medicine uses information about an individual’s own genes or proteins to prevent, diagnose or treat disease: for example, in cancer pathways, using specific information about a person’s tumour to help make a diagnosis, plan treatment, find out how well treatment is working, or make a prognosis. As outlined previously, the representativeness of the UK Biobank cohort was investigated by comparing demographic characteristics between non-responders and responders. 94.6% of participants were of white ethnicity (compared with 91.3% in the 2011 Census). UK Biobank participants were more likely to be older, to be female, and to live in less socioeconomically deprived areas than non-participants. Compared with the general population, participants were less likely to be obese, to smoke, and to drink alcohol on a daily basis, and had fewer self-reported health conditions. If the Biobank is not representative of all populations and disruptive innovations like precision medicine draw on UK Biobank research, the impact could worsen health inequalities, particularly those based upon ethnicity, gender, and socioeconomic status.

Digital exclusion

Digital exclusion refers to the barriers often faced by marginalised groups in accessing digital and technological solutions in healthcare. In the UK and internationally we have seen the rapid deployment and adoption of technology and disruptive innovations during the Covid-19 pandemic[19]. There are benefits for health and social care services in building on this period of change, with opportunities including:

·       More efficient and cost-effective care.

·       Better demand management and risk stratification.

·       Improved self-care and management of long-term conditions.

·       Faster multi-disciplinary clinical decision-making.

·       Reduced loneliness and isolation[20][21].

In line with UK regulatory requirements, it is essential that digital and technology solutions are assessed and evaluated for evidence of clinical benefit and patient safety. However, there is also a debate about whether they should be evaluated for their potential to benefit or worsen health inequalities[22]. The wider healthcare system needs to identify whether groups already experiencing inequalities could be excluded from accessing or benefiting from these innovations if they are not robustly assessed and any negative impact mitigated.

Technology access challenges

Digital exclusion in accessing healthcare can occur when people cannot easily access the internet and/or digital technology, or have limited digital skills or confidence in using technology. Personal use of technology is widespread: an Ofcom survey showed that, in 2020, 82% of UK adults owned a smartphone, 52% owned a tablet and 57% owned a laptop[23]. The average monthly mobile data use was 4.5 GB, an increase of 27% from the previous year[24]. Whilst digital technology has become integral to many aspects of our lives, including accessing healthcare, there are challenges with access and use, with many people remaining digitally excluded, potentially increasing the social and economic gap between those who are digitally connected and those who are not.

For some protected groups, digital exclusion is a particular challenge. Ofcom’s Access and Inclusion in 2018 report identified that only 67% of disabled people under 65 years old use an internet-enabled device, compared to 93% of non-disabled consumers in the same cohort[25]. ONS research in 2017 found that amongst respondents with disabilities the reasons for non-engagement were: not needing the internet (64%), a lack of skills (20%), and a physical or sensory disability (2%)[26].

Broadband - In 2021, 6% of all households in the UK, and 11% of lower socio-economic households, did not have home internet access. Older people are also less likely to have home internet access (18% of over-64s do not have access)[27]. Additionally, access to home broadband increases with income: only 51% of households earning between £6,000 and £10,000 have internet access, compared with 99% of households with an income over £40,000[28]. An annual report on poverty by the Joseph Rowntree Foundation found that, since the coronavirus pandemic, there has been an increased risk of poverty (particularly in-work poverty) among Black, Asian and minority ethnic households, whose members were 14% more likely to be made unemployed and 13% less likely to be furloughed during the pandemic[29]. It could therefore be argued that individuals on low incomes will prioritise paying for essential outgoings like food, heating and electricity over home broadband bills.

Although 94% of UK homes in 2021 had broadband and the average download speed was 80 Mbit/s, connections are unequal. Of residential broadband connections, 22% deliver less than 30 Mbit/s on average, with 8% falling below 10 Mbit/s[30]. A slow and inconsistent broadband connection can cause video call streams to be interrupted, leading to frozen screens. If a patient with a poor connection requires a virtual consultation with their healthcare professional through an online video platform, this could lead to miscommunication and other patient safety issues.

Software operating systems - Another access challenge is the need for regular software and operating system updates on smartphones, tablets, and laptops. Software updates are often not compatible with older smartphones (e.g., iPhone 6 and below) due to security risks and the limitations of older phone hardware. For example, the NHS Covid-19 app is not compatible with smartphones built before 2015[31].

 Digital skills

Lower digital literacy has been linked to population groups with lower socio-economic status and education levels, and to those in older age groups, putting them at greatest risk of exclusion[32]. The Essential Digital Skills Framework developed by Lloyds Bank assesses digital literacy against seven foundation tasks; adults who cannot do any of the seven tasks are considered digitally excluded.

The Lloyds Essential Digital Skills Report 2021 found that one-fifth (21%) of the population, around 11 million people, are digitally disadvantaged, lacking “Essential Digital Skills for Life”[33]. Equipping everyone with the digital skills required to engage with technology in everyday life can help to bridge this gap. There are specific programmes in the UK to address gaps in digital skills, such as NHS Widening Digital Participation. However, it could be argued that these types of training programme rely upon people being aware of the training and having the time and motivation to participate.

Trust and fear in technology

The Refugee Council[34] has highlighted other factors which could contribute towards digital exclusion for refugees and migrants, as there is often distrust amongst these groups towards ‘official’ organisations such as the NHS. This could be due to their experiences in their country of origin, as well as biases and systems of discrimination in the UK that cause mistrust (e.g. incorrect refusals when they try to register with a GP), leaving them unable to navigate NHS services (either online or offline); these challenges are often exacerbated by language barriers. In particular, asylum seekers are often digitally excluded due to:

  •  Lack of access to technology as often they do not have a mobile phone or sometimes their phones are confiscated at the border under immigration powers.

  • As they are unable to work, they rely upon asylum financial support and are unable to buy phones or afford calls / internet data.

  • Internet access is not always provided in dispersal accommodation and contingency accommodation (e.g. hotels), and if they do have Wi-Fi then often the connectivity is poor.

  • Concern about privacy and lack of trust in how organisations will use their personal information / data.

  • Mobile apps such as the NHS Test and Trace service can lead to scepticism or re-traumatisation for asylum seekers and migrants from countries notorious for surveillance of their citizens.  

Motivation to engage with digital platforms

Prior to the Covid-19 pandemic, a study found that although 50% of adults were willing to have a virtual consultation with their GP, 25% of people aged over 65 years and 40-45% of people from households earning less than £25,000 a year were not keen[35]. During the national lockdowns, people’s perceptions and experiences of virtual or remote consultations changed. For example, NHS Wales’s evaluation of virtual consultations found that patients using the Welsh virtual consultation platform were more than satisfied with it, finding it empowering and more personal in their care[36]. However, it is worth noting that only 3.8% of the respondents in that study were from ethnic minority backgrounds. An evaluation of remote consultation in mental health settings in South London found that, in one mental health trust, 97% of survey participants stated they would ‘definitely’ or ‘probably’ use the system again were they offered the option, despite issues with video and audio quality reported in the survey[37].

As outlined, there are differences in access and engagement with technology across some marginalised groups (e.g., refugees, economic migrants, disabled people, and older people). Any introduction of new and disruptive innovation within healthcare needs to reduce the risk of a ‘digital inverse care law’. The inverse care law was coined by Julian Tudor Hart, who observed that those most in need of medical attention perversely receive the least and lowest quality care, whereas those with the least need of health care tend to use health services more[38]. If those already experiencing health inequalities and worse health outcomes are also unable to access digital services, these inequalities could be exacerbated without efforts to mitigate digital exclusion. Such efforts might include treating digital inequality as a factor in equality impact assessments alongside service development and delivery[39].

The Nuffield Trust undertook a rapid review of digital primary care. Its evidence suggests that patients with the least need for care are the most likely to benefit from online and digital access to primary care. The emerging picture from UK and international evidence shows how shifting primary care online can create inequalities in access to health care by making it more difficult for some patients, often those who are less well and already economically disadvantaged, to get the care they need. The Nuffield Trust’s recommendation is to avoid implementing digital and remote primary care in a rigid way that offers less choice in how patients access it, as this creates challenges for the patients who need care the most. Flexibility is especially important for certain groups of people, including those with learning disabilities, dementia, autism, sensory and communication difficulties, and those experiencing homelessness[40].

Whilst there is a fear that digital health is increasing health inequalities, one should not forget that there are also digital technologies that reduce them. Examples include:

  •  Health data exchange platforms set up across primary, secondary, social, and voluntary care organisations to improve integrated care for all patients, often particularly benefiting people who require cross-sector care and who frequently experience inequalities.

  • Technology used to tackle health inequalities by supporting women in the workplace (for example, self-management apps for women’s health, such as menopause support) as well as the LGBT+ community.

  • Text and chatbot functionalities added to services to allow flexible out-of-hours access routes and to navigate around language barriers to improve uptake.

  • Virtual appointments rather than face-to-face ones, providing greater flexibility for those with caring responsibilities, restrictive working hours etc.

  • Moving away from paper-based forms to digitised forms (e.g., mental health act forms), enabling patients and their families to have digital access to information such as clinical decisions, care plans, treatment plans, and medication.

Research accurate as of September 2022. 


 References

[1] Mersha, T. B., & Beck, A. F. (2020). The social, economic, political, and genetic value of race and ethnicity in 2020. Human Genomics, 14(1). https://doi.org/10.1186/s40246-020-00284-2

[2] Wallis, C. (2021). Fixing Medical Devices That Are Biased against Race or Gender. Scientific American. Retrieved 12 May 2022, from https://www.scientificamerican.com/article/fixing-medical-devices-that-are-biased-against-race-or-gender/.

[3] Department of Health and Social Care. Equity in medical devices: independent review – call for evidence. GOV.UK. Retrieved from https://www.gov.uk/government/consultations/equity-in-medical-devices-independent-review-call-for-evidence/equity-in-medical-devices-independent-review-call-for-evidence

[4] NHS Race & Health Observatory. (2021). Pulse oximetry and racial bias: Recommendations for national healthcare, regulatory and research bodies. Retrieved from https://www.nhsrho.org/wp-content/uploads/2021/03/Pulse-oximetry-racial-bias-report.pdf

[5] Bent, B., Goldstein, B. A., Kibbe, W. A., & Dunn, J. P. (2020). Investigating sources of inaccuracy in wearable optical heart rate sensors. Npj Digital Medicine, 3(1). https://doi.org/10.1038/s41746-020-0226-6

 [6] Liverpool, L. (2021). Kidney test adjustment based on ethnicity cut from UK medical guidance | New Scientist. Newscientist.com. Retrieved 12 May 2022, from https://www.newscientist.com/article/2288008-kidney-test-adjustment-based-on-ethnicity-cut-from-uk-medical-guidance/.

[7] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8360513/

[8] Gama, R. M., Clery, A., Griffiths, K., Heraghty, N., Peters, A. M., Palmer, K., Kibble, H., Vincent, R. P., Sharpe, C. C., Cairns, H., & Bramham, K. (2021). Estimated glomerular filtration rate equations in people of self-reported black ethnicity in the United Kingdom: Inappropriate adjustment for ethnicity may lead to reduced access to care. PLOS ONE, 16(8), e0255869. https://doi.org/10.1371/journal.pone.0255869

[9] Goyal, M., Knackstedt, T., Yan, S., & Hassanpour, S. (2020). Artificial intelligence-based image classification methods for diagnosis of skin cancer: Challenges and opportunities. Computers in Biology and Medicine, 127, 104065. https://doi.org/10.1016/j.compbiomed.2020.104065

[10] Fry, A., Littlejohns, T. J., Sudlow, C., Doherty, N., Adamska, L., Sprosen, T., Collins, R., & Allen, N. E. (2017). Comparison of Sociodemographic and Health-Related Characteristics of UK Biobank Participants With Those of the General Population. American Journal of Epidemiology, 186(9), 1026–1034. https://doi.org/10.1093/aje/kwx246

[11] Ibrahim, H., Liu, X., Zariffa, N., Morris, A. D., & Denniston, A. K. (2021). Health data poverty: an assailable barrier to equitable digital health care. The Lancet Digital Health, 3(4), e260–e265. https://doi.org/10.1016/s2589-7500(20)30317-4

[12] Medicines and Healthcare products Regulatory Agency. Software and AI as a Medical Device Change Programme roadmap. GOV.UK. Retrieved from https://www.gov.uk/government/publications/software-and-ai-as-a-medical-device-change-programme/software-and-ai-as-a-medical-device-change-programme-roadmap

[13] Joshi, I., & Morley, J. (2019). Artificial Intelligence: How to get it right Putting policy into practice for safe data-driven innovation in health and care. NHSX. Retrieved from https://www.nhsx.nhs.uk/media/documents/NHSX_AI_report.pdf

[14] NICE. Evidence standards framework for digital health technologies. Retrieved from https://www.nice.org.uk/about/what-we-do/our-programmes/evidence-standards-framework-for-digital-health-technologies

[15] Panch, T., Mattie, H., & Atun, R. (2019). Artificial intelligence and algorithmic bias: implications for health systems. Journal of Global Health, 9(2). https://doi.org/10.7189/jogh.09.020318

[16] Cowgill, B., Dell’Acqua, F., Deng, S., Hsu, D., Verma, N., & Chaintreau, A. (2020). Biased Programmers? Or Biased Data? A Field Experiment in Operationalizing AI Ethics. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3615404

[17] Cowgill, B., Dell’Acqua, F., Deng, S., Hsu, D., Verma, N., & Chaintreau, A. (2020). Biased Programmers? Or Biased Data? A Field Experiment in Operationalizing AI Ethics. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3615404

[18] Algorithmic impact assessment in healthcare. Adalovelaceinstitute.org. (2021). Retrieved 12 May 2022, from https://www.adalovelaceinstitute.org/project/algorithmic-impact-assessment-healthcare/.

[19] The Health Foundation. Adoption and spread of innovation and technology in the NHS. Retrieved from https://www.health.org.uk/what-we-do/supporting-health-care-improvement/adoption-and-spread-of-innovation-and-technology-in-the-nhs

[20] The Health Foundation, Horton, T., Hardie, T., Mahadeva, S., & Warburton, W. (2021, March). Securing a positive health care technology legacy from COVID-19. https://www.health.org.uk/publications/long-reads/securing-a-positive-health-care-technology-legacy-from-covid-19

[21] The Strategy Unit. (2020, April). Improving Digital Health Inclusion: evidence scan. https://www.strategyunitwm.nhs.uk/sites/default/files/2021-04/Digital%20Inclusion%20evidence%20scan.pdf

[22] McCartney, M. (2018). Margaret McCartney: Health technology and the modern inverse care law. https://www.bmj.com/. Retrieved 11 May 2022, from https://www.bmj.com/bmj/section-pdf/981380?path=/bmj/362/8162/Comment.full.pdf

[23] Ofcom. (2021). Online Nation. Retrieved from https://www.ofcom.org.uk/__data/assets/pdf_file/0013/220414/online-nation-2021-report.pdf

[24] Ofcom. (2021). Communications Market Report 2021. Retrieved from https://www.ofcom.org.uk/__data/assets/pdf_file/0011/222401/communications-market-report-2021.pdf

[25] Ofcom. (2019). Access and Inclusion in 2018 - Consumers’ experiences in communications markets. Retrieved from https://www.ons.gov.uk/peoplepopulationandcommunity/householdcharacteristics/homeinternetandsocialmediausage/articles/exploringtheuksdigitaldivide/2019-03-04

[26] Office for National Statistics. (2019). Exploring the UK’s digital divide. Retrieved from https://www.ons.gov.uk/peoplepopulationandcommunity/householdcharacteristics/homeinternetandsocialmediausage/articles/exploringtheuksdigitaldivide/2019-03-04

[27] Ofcom. (2021). Online Nation. Retrieved from https://www.ofcom.org.uk/__data/assets/pdf_file/0013/220414/online-nation-2021-report.pdf

[28] Opinion: Coronavirus has intensified the UK’s digital divide. University of Cambridge. Retrieved 11 May 2022, from https://www.cam.ac.uk/stories/digitaldivide.

[29] Joseph Rowntree Foundation. (2021). UK Poverty 2020/21. Retrieved from https://www.jrf.org.uk/report/uk-poverty-2020-21

[30] Lee, P., & Raviprakash, S. (2021). Subscription Video-on-Demand: The one where the going gets tough. Deloitte. Retrieved from https://www2.deloitte.com/uk/en/pages/technology-media-and-telecommunications/articles/digital-consumer-trends-svod.html

[31] Which phones will the NHS COVID-19 app not work on?  · COVID-19 app support. Faq.covid19.nhs.uk. Retrieved 11 May 2022, from https://faq.covid19.nhs.uk/article/KA-01116/en-us.

[32] Tinder Foundation. (2016). Health & Digital: An evaluation of the Widening Digital Participation programme. Retrieved from https://www.goodthingsfoundation.org/insights/widening-digital-participation/

[33] Lloyds Bank. (2021). Essential Digital Skills Report 2021. Retrieved from https://www.lloydsbank.com/assets/media/pdfs/banking_with_us/whats-happening/211109-lloyds-essential-digital-skills-report-2021.pdf

[34] British Refugee Council. (2021). A note on barriers experienced by refugees and people seeking asylum when accessing health services. Retrieved from https://media.refugeecouncil.org.uk/wp-content/uploads/2021/10/29174557/A-note-on-barriers-experienced-by-refugees-and-people-seeking-asylum-when-accessing-health-services_March_2021.pdf

[35] Castle-Clarke, S. (2018). What will new technology mean for the NHS and its patients? The Kings Fund. Retrieved from https://www.kingsfund.org.uk/sites/default/files/2018-06/NHS_at_70_what_will_new_technology_mean_for_the_NHS_0.pdf

[36] Johns G, Khalil S, Ogonovsky M, Wright P, Williams J, Lees M, et al. (2020). The NHS Wales Video Consulting Service. TEC Cymru. Retrieved from https://digitalhealth.wales/tec-cymru/how-we-can-help/evidence/eval-reports

[37] Health Innovation Network. (2021). Remote Consultations in Mental Health: Learning from Evaluation Report. Retrieved from https://healthinnovationnetwork.com/report/evaluation-of-remote-consultations-in-mental-health-settings-in-south-london/?cn-reloaded=1

[38] Appleby, J., & Deeming, C. (2001). Inverse care law. www.kingsfund.org.uk/. Retrieved 11 May 2022, from https://www.kingsfund.org.uk/publications/articles/inverse-care-law.

[39] Davies, A. R., Honeyman, M., & Gann, B. (2021). Addressing the Digital Inverse Care Law in the Time of COVID-19: Potential for Digital Technology to Exacerbate or Mitigate Health Inequalities. Journal of Medical Internet Research, 23(4), e21726. https://doi.org/10.2196/21726

[40] Paddison, C., & McGill, I. (2022). Digital primary care: Improving access for all?. Nuffield Trust. Retrieved from https://www.nuffieldtrust.org.uk/files/2022-02/digital-access-to-general-practice-evidence-review.pdf  
