CILIPS Chartered Institute of Library and Information Professionals in Scotland

New Voices RGU Student Series 2024 – Tom Rogers

Category: Blog, New Voices, RGU Student Series 2024

The CILIPS SNPC 'New Voices' blog logo, with white and yellow text on a turquoise background

In the 2024 Student Series for the New Voices blog, CILIPS Students & New Professionals Community will be sharing the views of Robert Gordon University students from the MSc in Information and Library Studies.

With special thanks to Dr Konstantina Martzoukou, Teaching Excellence Fellow and Associate Professor, for organising these thought-provoking contributions.

Portrait photograph of author Tom Rogers.

Today’s blog post author is Tom Rogers. Tom is a senior library assistant for Oxford Health Libraries, as well as a student on the RGU MSc Information and Library Studies course.

 

Empowering the NHS Workforce and Patients with Information Literacy: The Role of Information Professionals and AI

“Information literacy is the ability to think critically and make balanced judgements about any information we find and use. It empowers us as citizens to develop informed views and to engage fully with society.” (CILIP 2018)

In the NHS, information professionals will serve as ‘a crucial bridge between technology and those who use it’ (Lacey Bryant et al. 2022, p. 385). For a workforce of medical staff and carers, digital skills do not seem like an immediate priority when compared to front-line work. However, within the next two decades, 90% of jobs will require some level of digital proficiency (Limonte 2019). It is important that information professionals continue to be ‘active learners’ (Andretta 2005, p. 108) so they can upskill staff in matters of Information Literacy (IL) (defined by CILIP 2018) and generative AI tools.

Information seeking and IL itself can be seen as “two sides of the same coin” (Limberg and Sundin, in Mai 2016, p. 67), which makes the idea of information literacy seem more digestible. It is not some abstract classroom concept; it is wrapped up in the daily occurrence of searching for and consuming information.

A subdivision of IL is health literacy, defined as ‘a person’s ability to understand and use information to make decisions about their health’ (NHS Digital Manual 2023). Low levels of digital and health literacy impact patient comprehension of health information and diminish service use. Around half the British population struggles to understand information about managing their own wellbeing (NHS Digital Manual 2023), meaning they cannot make full use of the health information available to them.

Healthcare staff are already trained by librarians on health literacy via eLearning modules. However, these modules could be expanded further to cover the use of AI tools, for example how ChatGPT can translate complex medical jargon to meet the health literacy levels of specific patients. The training could also demonstrate how ChatGPT instantaneously creates content in multiple languages, helping to eliminate common communication breakdowns (Mai 2016) within healthcare.
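For readers curious what such a training example might look like in practice, here is a minimal sketch in Python. The prompt wording, the `plain_language_prompt` helper, and the model name are purely illustrative assumptions for this post, not an NHS-endorsed workflow; any real use with patient data would need the data protection safeguards discussed below.

```python
# Illustrative sketch: asking a generative AI model to rewrite clinical
# jargon in plain language at a chosen reading level. The function name,
# prompt text and model choice are hypothetical examples.

def plain_language_prompt(clinical_text: str, reading_age: int = 11) -> list:
    """Build a chat prompt requesting a plain-language rewrite."""
    return [
        {
            "role": "system",
            "content": (
                "You rewrite clinical text in plain English suitable for a "
                f"reading age of about {reading_age}, keeping the meaning intact."
            ),
        },
        {"role": "user", "content": clinical_text},
    ]

messages = plain_language_prompt(
    "Patient presents with acute exacerbation of COPD; commence nebulised "
    "bronchodilators and titrate oxygen to target saturations of 88-92%."
)

# Sending the prompt would require an account and API key, e.g. with the
# official openai client (commented out here so the sketch runs offline):
#   from openai import OpenAI
#   client = OpenAI()
#   reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
#   print(reply.choices[0].message.content)
print(messages[0]["content"])
```

The point of a sketch like this in training is not the code itself but the idea it makes concrete: the reading level is an explicit, adjustable parameter, so staff can see that the output is shaped by choices a human makes and checks.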

Generative AI tools rely on algorithms, meaning there is a risk of algorithmic bias in healthcare, such as directing more attention or resources to majority rather than minority populations (Jones and Payton 2021). To combat this, librarians in the health information world already run critical appraisal training sessions, which assess the reliability of research articles. In part, critical appraisals highlight whether a piece of research is applicable to patients, but they mostly centre on whether the trial was biased in its sampling, methods or reporting, whether it is methodologically sound, and whether appropriate statistical tests were used, such that one can rely on the conclusions of the research.

Information professionals must build upon critical appraisal training to highlight AI’s inherent algorithmic biases, stressing the importance of critical thinking for staff when using it to relay information. Recent research has revealed that ChatGPT, when deciding how to treat patients for depression, is more ‘unbiased’ than GPs: it appears to be better at ‘following clinical guidance’ and ‘does not discern between men and women or economic status’ when deciding how to treat patients (Searles 2023). Nonetheless, health content generated by AI can confuse users (Brandl and Ellis 2023), emphasising the need for human cross-checking.

There is also a risk of AI ‘hallucinations’ – outputs that seem plausible but are either factually incorrect or unrelated to the given context (Marr 2023). Training must highlight the nature of these errors, as misleading health information has grave consequences for both staff and patients. Privacy is also an issue: currently, OpenAI offers no procedures for individuals to check whether their personal information is stored, or to request that it be deleted (Gal 2023). If ChatGPT is used to enter data about a specific patient, the NHS needs to make sure that its processes comply with data protection legislation.

In the future, information professionals will play a huge role in educating healthcare staff. It is important that relevant opportunities are provided so information professionals can stay abreast of advancements – Health Education England and CILIP are partnering with providers to make this happen. Yet as the world becomes more technological, it is important to remember that people are at the heart of digital transformation, just as they are at the heart of the NHS (Lacey Bryant et al. 2022).

Word count: 657

References:

Andretta, S. (2005) Information Literacy: A Practitioner's Guide. Witney: Elsevier Science & Technology.

Brandl, R. and Ellis, C. (2023) 'ChatGPT Survey: Can People Tell the Difference?'. ToolTester. Available at: Survey: ChatGPT and AI Content – Can people tell the difference? (tooltester.com) (Accessed: 29 November 2023).

NHS digital service manual (2023) 'Health literacy'. Available at: Health literacy – NHS digital service manual (service-manual.nhs.uk) (Accessed: 29 November 2023).

Gal, U. (2023) 'ChatGPT is a Data Privacy Nightmare: If You've Ever Posted Online, You Ought to Be Concerned'. The Conversation. Available at: https://theconversation.com/chatgpt-is-a-data-privacy-nightmare-if-youve-ever-posted-online-you-ought-to-be-concerned-199283 (Accessed: 30 November 2023).

Jones, D.S. and Payton, F.C. (2021) 'Racial bias in health care artificial intelligence', NIHCM. Available at: https://nihcm.org/publications/artificial-intelligences-racial-bias-in-health-care (Accessed: 29 November 2023).

Lacey Bryant, S. et al. (2022) ‘NHS Knowledge and library services in England in the Digital age’, Health Information & Libraries Journal, 39(4), pp. 385–391.

Limonte, K. (2019) AI in Healthcare: Re-skilling the workforce with Digital Skills, Microsoft Industry Blogs – United Kingdom. Available at: https://www.microsoft.com/en-gb/industry/blog/health/2018/12/23/ai-healthcare-re-skilling-workforce-digital-skills/ (Accessed: 29 November 2023).

Mai, J.-E., Case, D. O., & Given, L. M. (2016) Looking for Information: A Survey of Research on Information Seeking, Needs, and Behavior. Fourth edition. Emerald.

Marr, B. (2023) 'ChatGPT: What Are Hallucinations and Why Are They a Problem for AI Systems?'. Available at: ChatGPT: What Are Hallucinations And Why Are They A Problem For AI Systems | Bernard Marr (Accessed: 29 November 2023).

McDonald, G. (2018) 'What is information literacy'. CILIP. Available at: What is information literacy? – CILIP: the library and information association (Accessed: 29 November 2023).

Searles, M. (2023) 'ChatGPT: Better Than GPs in Treating Depression, Unbiased on Gender'. The Telegraph. Available at: ChatGPT is better than GPs at treating depression because it is not biased by class and gender (telegraph.co.uk) (Accessed: 29 November 2023).
