{"title":"Gaining Perspective on Artificial Intelligence and Gerontological Nursing","authors":"Sarah H. Kagan","doi":"10.1111/opn.70046","DOIUrl":null,"url":null,"abstract":"<p>In an era where concerns about social isolation, discord, and our planet are rife, the coverage that artificial intelligence (AI) is receiving within and beyond healthcare is unprecedented. Publicity about AI far outstrips most people's basic familiarity with the crises of loneliness, conflict, and planetary health. Indeed, many conversations about AI in care for older people and nursing research that I hear seem oblivious to principles of effective health technology evaluation. The inescapability of AI today drives me back to evaluating this technological domain, prompting contemplation of disadvantages as well as advantages, along with accessibility, cost, and unintended effects. As a gerontological nurse, I always frame analysis of any health technology within the multidimensional process of ageing and with consideration of the needs and preferences of older people. AI offers a notable case in point as we gerontological nurses contend with evaluating this technology that is taking the healthcare industry and the scientific world by a paradoxical electric storm.</p><p>Remember that AI represents a wide range of computerised functions and tools that are aimed at partially or completely replacing human actions. The word intelligence, however, is a misnomer. Machines are not intelligent. The allusion of intelligence comes with humans designing these tools in ways that mimic human function but can rarely even partially replace the higher-level aspects of our capacities. News coverage and social media posts constantly alert us to AI's manifold possibilities. Most excitement today about AI entails those tools that generate written or other content from data that already exist. As you might expect, this type of AI is called generative AI or GAI for short. Many of us are also aware of avid interest in integrating AI into robotics, especially for clinical applications that promote self-care among older people or extend nursing care for them. The AI domain is broad and diverse, garnering lots of enthusiasm but often spare critical analysis contextualised within our discipline.</p><p>Activities involving AI are increasingly common across gerontological nursing research, practice, and education. Within the domain of gerontechnology, for example, many gerontological nurses are investigating the effects of AI-driven interventions like wearable devices or chatbots and other apps on functional outcomes and quality of life metrics. In addition to studying AI in relation to older people, researchers across healthcare are now adopting AI in research data management and analysis. Computer-assisted qualitative data analysis software, for instance, now commonly features AI assistance, which promises to generate findings for investigators more quickly than they could do themselves. Clinical uses of AI are increasingly frequent, too. Many gerontological nurses and other clinicians are recommending smart devices and AI-powered apps to older people and their care partners. Lastly, AI is very visible in nursing education, influencing the preparation of future gerontological nurses with everything from AI-supported editing of their assignments to AI-enhanced clinical simulation. 
Is there a revolution happening before our eyes?</p><p>As I step back from coverage in the lay and professional media, several aspects of AI impress me in connection with ageing, older people, and gerontological nursing. Critically, intelligence in the term AI incorrectly implies the capacity for critical appraisal and evaluation. A machine or an algorithm cannot perform these actions. AI is a tool and, like all tools, it has strengths and limitations. Today, with enough money in our coffers and a bit of training at hand, we have a buffet of so-called smart AI tools to test out. From AI features in computer-assisted qualitative research data management and analysis platforms to smart apparel that stretches from head—smart spectacles—to toe—smart shoes, the options feel limitless. All these tools are touted as valuable to older people, nursing, and the study of ageing. But, as with all tools, successful use of AI relies on understanding relevant features and then precisely and accurately deploying those tools. Intelligence is human; artificiality is inherent in all AI tools.</p><p>As AI has come to dominate healthcare and health research discussions, a different story deserves our attention. Human beings and their relationships are under remarkable stress because of a multiplicity of forces. Rapidly changing lifeways, geographic migration, economic shifts, human conflict, and the planetary crisis each contribute to phenomena that are deeply concerning to us as gerontological nurses and as global citizens. For instance, AI is repeatedly being offered as a solution for loneliness felt by older people, isolation and overburden experienced by care partners of older people, and overwork that nurses and our entire health and social care workforces live with every day. Too few people are thinking about the limitations inherent in interactions between human beings and chatbots or other AI-powered objects. Those interactions are merely that—interaction between human being and machine. In a discipline anchored by relationships, recognising the difference between interaction and an emotionally meaningful connection seems to recede into the background as we marvel at the potential of AI applications.</p><p>Human beings, like most species, are fundamentally social. We cannot exist without social relationships. Many tools, like the automata of old (https://themadmuseum.co.uk/history-of-automata/) have engaged and entertained people for centuries. A machine mirroring the motions of a human interaction holds untold fascination for us. Computer technology and machine learning add a veneer that makes today's AI automata seem even more entrancing than in centuries past. Despite being entranced and entertained, we cannot forget that what we are seeking as human beings is relationships and not simply a sequence of interactions. At least for now, there are consistent and frequently predictable ‘tells’ of AI generated outputs including overuse of specific words as well as disconsonant content (Kobak et al. <span>2025</span>). These ‘tells’ alert us to what is lacking. What makes relationships delightful is the unpredictability of other beings' responses and the rewards for participants in coming to know each other. Human beings recognise this quality in other humans and in other species. 
Just ask the owner of a companion animal—be it a dog, a cat, a rabbit, or a snake—about their pet's personality.</p><p>Research on AI companionship has already begun to explore the possibility that long-term interaction with AI may alter brain function (Kosmyna et al. <span>2025</span>). Such research suggests further questions about how using AI might alter our brain health and capacity for relationships. As an illustration, few nurses are exploring cognitive impacts—a withering of imagination and capacity for abstract thought, for example—in relation to repeated AI use. As gerontological nurses, we hold a distinctive vantage point on cognition, emotion, and related capacities. So far, there is no clear evidence and gaps in our understanding persist, underscoring a need for our scepticism and reflection on AI's role in healthy living and brain health in later life.</p><p>While the remarkable usability of AI is oft touted, the environmental impacts are much debated. AI offers enormous potential as a tool for mitigating the climate crisis. Nonetheless, its growing use—particularly GAI—with current infrastructure represents a massive environmental threat with proportionately high use of water, electricity, rare elements and minerals, and resultant electronic waste (Bashir et al. <span>2024</span>). Claiming the right to use AI in research or projects to serve human need without a larger viewpoint on the environment simply avoids addressing the delicate balance of our relationship with the earth. Efforts to exempt our own use of resources and engagement with industries that contribute to the climate crisis only perpetuate damages while curtailing potential benefit. Our deliberative, responsible use of AI and our activism to push for mitigating planetary harm are essential to limiting damage to the planet and all lifeforms.</p><p>Importantly, we gerontological nurses need to think about the existential import of AI for phenomena within our domain. I began this essay by noting that we live in a time of loneliness, conflict, and planetary crisis. As nurses, we must take a broader view. For example, we can reflect on whether investigating AI-driven interventions for loneliness offers more sustainable solutions than working with and in communities to bring people together for better health and wellbeing. Whatever AI offers, it can only ever be an adjunct to what is innately human. Like other tools, AI may support our intelligence and relationships. It cannot replace them. People yearn for meaningful connections with other people, other beings like companion animals, and with nature. That interest in research that helps foster social connections and explores phenomena like animal-assisted therapy as well as investigating the health benefits of pet ownership and what is gained when spending time in nature pales in comparison with the buzz around AI should concern us all.</p><p>Imagining that AI companions or chatbots will significantly remedy older people's unmet needs for social contact ignores the importance of relationships and, in fact, represents shockingly ableist thinking. The machine tolerance for repetitive interactions and responses free of judgement is often heralded as valuable for older people and especially those living with cognitive impairment. The stereotype of the garrulous older person unable to resist repeating the same stories has emerged clearly in promoting the shiny allure of chatbots immune to the tedium of listening to such repetitiveness. 
Yet many younger people also actively seek out AI social companion tools, sharpening the point of the ageist stereotype in a way that should draw our activism and shape our research agenda.</p><p>I return now to the question that I posed earlier about AI constituting a revolution. AI represents a revolution only in terms of technology. For instance, the AI promises of savings in cost and time, ease, and higher volume or speed in targeted tasks are real only if its outputs are consistent and accurate. The use of smart devices can save time but only if data and alerts are accessible and comprehensible to all. Like all tools and interventions that we might consider using, AI promises benefits but may not warn of risks. Having more tools does not change the fundamentals of what it is to be human but the use of them does alter our productivity and may shape our functional capacities. This balance is not then a revolution for humankind.</p><p>In the same way that we would never use any intervention indiscriminately, we cannot think of AI as a panacea for what does not work in healthcare or as today's on-trend solution. Its application requires our meticulous appraisal and testing before widespread adoption. The list of considerations for use of AI is long; the list of topics involving AI that need research is far longer. Our way forward is to hold fast to our principles, to explore the potential of AI, and to place that exploration in the larger perspective of what it is to be human and to need human relationships across the arc of a lifetime. The time is right to expand our research agenda in AI, framing it in terms that allow us to better address the crises of social isolation, conflict, and planetary health.</p><p>With a shared perspective that AI represents but another set of tools—albeit a potentially remarkably powerful one—that we must use wisely, a great deal of work to effectively integrate AI into our research, education, and care lies ahead. Here at <i>IJOPN</i>, we look forward to reading your manuscripts reporting research that uses and examines AI. Meanwhile, please share your thoughts on AI in gerontological nursing and care for older people with us. <i>IJOPN</i> is on LinkedIn at https://uk.linkedin.com/in/international-journal-of-older-people-nursing-ijopn-10bb6674 and on Blue Sky at https://bsky.app/profile/intjnlopn.bsky.social. Just use our signature hashtag #GeroNurses as well as the hashtag #AIAndNursing when you tag us in your posts.</p><p>No artificial intelligence was used in preparing this manuscript.</p><p>The author declares no conflicts of interest.</p>","PeriodicalId":48651,"journal":{"name":"International Journal of Older People Nursing","volume":"20 5","pages":""},"PeriodicalIF":2.0000,"publicationDate":"2025-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/opn.70046","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Older People Nursing","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/opn.70046","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"GERIATRICS & GERONTOLOGY","Score":null,"Total":0}
Abstract
In an era where concerns about social isolation, discord, and our planet are rife, the coverage that artificial intelligence (AI) is receiving within and beyond healthcare is unprecedented. Publicity about AI far outstrips most people's basic familiarity with the crises of loneliness, conflict, and planetary health. Indeed, many conversations about AI in care for older people and nursing research that I hear seem oblivious to principles of effective health technology evaluation. The inescapability of AI today drives me back to evaluating this technological domain, prompting contemplation of disadvantages as well as advantages, along with accessibility, cost, and unintended effects. As a gerontological nurse, I always frame analysis of any health technology within the multidimensional process of ageing and with consideration of the needs and preferences of older people. AI offers a notable case in point as we gerontological nurses contend with evaluating this technology that is taking the healthcare industry and the scientific world by a paradoxical electric storm.
Remember that AI represents a wide range of computerised functions and tools that are aimed at partially or completely replacing human actions. The word intelligence, however, is a misnomer. Machines are not intelligent. The illusion of intelligence comes from humans designing these tools in ways that mimic human function but can rarely even partially replace the higher-level aspects of our capacities. News coverage and social media posts constantly alert us to AI's manifold possibilities. Most excitement today about AI centres on tools that generate written or other content from data that already exist. As you might expect, this type of AI is called generative AI, or GAI for short. Many of us are also aware of avid interest in integrating AI into robotics, especially for clinical applications that promote self-care among older people or extend nursing care for them. The AI domain is broad and diverse, garnering plenty of enthusiasm but often only sparse critical analysis contextualised within our discipline.
Activities involving AI are increasingly common across gerontological nursing research, practice, and education. Within the domain of gerontechnology, for example, many gerontological nurses are investigating the effects of AI-driven interventions like wearable devices or chatbots and other apps on functional outcomes and quality of life metrics. In addition to studying AI in relation to older people, researchers across healthcare are now adopting AI in research data management and analysis. Computer-assisted qualitative data analysis software, for instance, now commonly features AI assistance, which promises to generate findings for investigators more quickly than they could do themselves. Clinical uses of AI are increasingly frequent, too. Many gerontological nurses and other clinicians are recommending smart devices and AI-powered apps to older people and their care partners. Lastly, AI is very visible in nursing education, influencing the preparation of future gerontological nurses with everything from AI-supported editing of their assignments to AI-enhanced clinical simulation. Is there a revolution happening before our eyes?
As I step back from coverage in the lay and professional media, several aspects of AI impress me in connection with ageing, older people, and gerontological nursing. Critically, intelligence in the term AI incorrectly implies the capacity for critical appraisal and evaluation. A machine or an algorithm cannot perform these actions. AI is a tool and, like all tools, it has strengths and limitations. Today, with enough money in our coffers and a bit of training at hand, we have a buffet of so-called smart AI tools to test out. From AI features in computer-assisted qualitative research data management and analysis platforms to smart apparel that stretches from head—smart spectacles—to toe—smart shoes, the options feel limitless. All these tools are touted as valuable to older people, nursing, and the study of ageing. But, as with all tools, successful use of AI relies on understanding relevant features and then precisely and accurately deploying those tools. Intelligence is human; artificiality is inherent in all AI tools.
As AI has come to dominate healthcare and health research discussions, a different story deserves our attention. Human beings and their relationships are under remarkable stress because of a multiplicity of forces. Rapidly changing lifeways, geographic migration, economic shifts, human conflict, and the planetary crisis each contribute to phenomena that are deeply concerning to us as gerontological nurses and as global citizens. For instance, AI is repeatedly being offered as a solution for loneliness felt by older people, isolation and overburden experienced by care partners of older people, and overwork that nurses and our entire health and social care workforces live with every day. Too few people are thinking about the limitations inherent in interactions between human beings and chatbots or other AI-powered objects. Those interactions are merely that—interaction between human being and machine. In a discipline anchored by relationships, recognising the difference between interaction and an emotionally meaningful connection seems to recede into the background as we marvel at the potential of AI applications.
Human beings, like most species, are fundamentally social. We cannot exist without social relationships. Many tools, like the automata of old (https://themadmuseum.co.uk/history-of-automata/), have engaged and entertained people for centuries. A machine mirroring the motions of a human interaction holds untold fascination for us. Computer technology and machine learning add a veneer that makes today's AI automata seem even more entrancing than in centuries past. Despite being entranced and entertained, we cannot forget that what we are seeking as human beings is relationships and not simply a sequence of interactions. At least for now, there are consistent and frequently predictable ‘tells’ of AI-generated outputs, including overuse of specific words as well as disconsonant content (Kobak et al. 2025). These ‘tells’ alert us to what is lacking. What makes relationships delightful is the unpredictability of other beings' responses and the rewards for participants in coming to know each other. Human beings recognise this quality in other humans and in other species. Just ask the owner of a companion animal—be it a dog, a cat, a rabbit, or a snake—about their pet's personality.
Research on AI companionship has already begun to explore the possibility that long-term interaction with AI may alter brain function (Kosmyna et al. 2025). Such research suggests further questions about how using AI might alter our brain health and capacity for relationships. As an illustration, few nurses are exploring cognitive impacts—a withering of imagination and capacity for abstract thought, for example—in relation to repeated AI use. As gerontological nurses, we hold a distinctive vantage point on cognition, emotion, and related capacities. So far, there is no clear evidence and gaps in our understanding persist, underscoring a need for our scepticism and reflection on AI's role in healthy living and brain health in later life.
While the remarkable usability of AI is oft touted, the environmental impacts are much debated. AI offers enormous potential as a tool for mitigating the climate crisis. Nonetheless, its growing use—particularly GAI—with current infrastructure represents a massive environmental threat with proportionately high use of water, electricity, rare elements and minerals, and resultant electronic waste (Bashir et al. 2024). Claiming the right to use AI in research or projects to serve human need without a larger viewpoint on the environment simply avoids addressing the delicate balance of our relationship with the earth. Efforts to exempt our own use of resources and engagement with industries that contribute to the climate crisis only perpetuate damages while curtailing potential benefit. Our deliberative, responsible use of AI and our activism to push for mitigating planetary harm are essential to limiting damage to the planet and all lifeforms.
Importantly, we gerontological nurses need to think about the existential import of AI for phenomena within our domain. I began this essay by noting that we live in a time of loneliness, conflict, and planetary crisis. As nurses, we must take a broader view. For example, we can reflect on whether investigating AI-driven interventions for loneliness offers more sustainable solutions than working with and in communities to bring people together for better health and wellbeing. Whatever AI offers, it can only ever be an adjunct to what is innately human. Like other tools, AI may support our intelligence and relationships. It cannot replace them. People yearn for meaningful connections with other people, with other beings like companion animals, and with nature. That interest in research that fosters social connections and explores phenomena like animal-assisted therapy, the health benefits of pet ownership, and what is gained by spending time in nature pales in comparison with the buzz around AI should concern us all.
Imagining that AI companions or chatbots will significantly remedy older people's unmet needs for social contact ignores the importance of relationships and, in fact, represents shockingly ableist thinking. The machine tolerance for repetitive interactions and responses free of judgement is often heralded as valuable for older people and especially those living with cognitive impairment. The stereotype of the garrulous older person unable to resist repeating the same stories has emerged clearly in promoting the shiny allure of chatbots immune to the tedium of listening to such repetitiveness. Yet many younger people also actively seek out AI social companion tools, sharpening the point of the ageist stereotype in a way that should draw our activism and shape our research agenda.
I return now to the question that I posed earlier about AI constituting a revolution. AI represents a revolution only in terms of technology. For instance, AI's promises of savings in cost and time, ease, and higher volume or speed in targeted tasks are real only if its outputs are consistent and accurate. The use of smart devices can save time, but only if data and alerts are accessible and comprehensible to all. Like all tools and interventions that we might consider using, AI promises benefits but may not warn of risks. Having more tools does not change the fundamentals of what it is to be human, but the use of them does alter our productivity and may shape our functional capacities. This balance is not, then, a revolution for humankind.
In the same way that we would never use any intervention indiscriminately, we cannot think of AI as a panacea for what does not work in healthcare or as today's on-trend solution. Its application requires our meticulous appraisal and testing before widespread adoption. The list of considerations for use of AI is long; the list of topics involving AI that need research is far longer. Our way forward is to hold fast to our principles, to explore the potential of AI, and to place that exploration in the larger perspective of what it is to be human and to need human relationships across the arc of a lifetime. The time is right to expand our research agenda in AI, framing it in terms that allow us to better address the crises of social isolation, conflict, and planetary health.
With a shared perspective that AI represents but another set of tools—albeit a potentially remarkably powerful one—that we must use wisely, a great deal of work to effectively integrate AI into our research, education, and care lies ahead. Here at IJOPN, we look forward to reading your manuscripts reporting research that uses and examines AI. Meanwhile, please share your thoughts on AI in gerontological nursing and care for older people with us. IJOPN is on LinkedIn at https://uk.linkedin.com/in/international-journal-of-older-people-nursing-ijopn-10bb6674 and on Blue Sky at https://bsky.app/profile/intjnlopn.bsky.social. Just use our signature hashtag #GeroNurses as well as the hashtag #AIAndNursing when you tag us in your posts.
No artificial intelligence was used in preparing this manuscript.