Medford, OR – As artificial intelligence (AI) continues to make inroads into the healthcare industry, some states are beginning to take action to ensure patient safety and preserve the roles of healthcare professionals. In Oregon, state Rep. Travis Nelson, a nurse and Portland Democrat, is spearheading an effort to prevent AI systems from assuming the title of “nurse.” His proposed legislation, House Bill 2748, would ban any nonhuman entity, including AI, from using this title in the state.
The bill comes amid growing concern within the nursing community about AI technologies that threaten to replace or devalue human roles. The tech company Hippocratic AI recently announced an AI program designed to handle “low-risk, non-diagnostic” patient-facing tasks, including pre-assessments and basic care interactions, for $9 an hour, a fraction of what a human nurse earns.
Nelson expressed concern over the rising use of AI in healthcare settings, particularly when cheaper, automated systems stand in for human professionals. AI tools are increasingly used in healthcare to streamline operations, triage patients, and assist with diagnostics, but critics argue that these systems often lack the empathy, critical thinking, and nuanced judgment that nurses bring to their work.
Nelson’s bill specifically targets AI systems that present themselves as nurses, underscoring the need for regulatory safeguards in an industry that is rapidly embracing technology. The bill, which runs just one page, would make it illegal for any nonhuman entity to use the title “nurse,” reserving the designation for licensed healthcare professionals who have completed the required training and education.
Jennifer Mensik Kennedy, the Oregon-based president of the American Nurses Association, voiced her support for the bill, emphasizing the importance of maintaining public trust in healthcare. “A lot of this is public safety,” Mensik Kennedy said. “The public needs to know if I call myself a ‘nurse,’ what that entails. Someone off the street can’t call themselves a nurse because there is an assumption of education and licensure.”
The growing role of AI in healthcare is not without controversy. While proponents argue that AI can increase efficiency, reduce costs, and alleviate staffing shortages, many healthcare professionals and advocates are wary of its potential to harm care quality. Some fear that AI systems might undermine the judgment of human healthcare providers, replacing their decision-making power with potentially flawed or biased algorithms.
Nelson’s bill is among the first state-led attempts to regulate AI in healthcare, offering patients a measure of protection by ensuring that nonhuman systems cannot present themselves under a title that signals professional licensure. However, Nelson acknowledged that more comprehensive legislation may be needed to address the broader role of AI in healthcare, particularly around patient consent. He suggested that patients might eventually need the ability to opt out of being treated by AI systems in favor of human providers.
Pushback against AI in nursing has been especially strong since the Hippocratic AI announcement; the company’s software promises to take over some aspects of nursing work, such as phone-based patient interactions. This development, along with similar technologies, has heightened concerns about patient safety, job displacement, and the erosion of the human touch in nursing care.
The Oregon Nurses Association (ONA) also voiced concerns about AI replacing nurses, issuing a statement warning that AI could diminish the quality of patient care and stressing that tasks requiring empathy, nuanced decision-making, and critical thinking cannot be effectively carried out by machines. “You can augment the care the nurse gives with AI, but you cannot replace it,” Mensik Kennedy added, highlighting the irreplaceable value that human nurses bring to the healthcare environment.
Dr. William Hersh, a professor of medical informatics and clinical epidemiology at Oregon Health & Science University, weighed in on the broader implications of AI in healthcare. Hersh acknowledged that AI can assist with clinical decision-making, particularly in resource-limited settings, but stressed that human clinicians must remain involved to catch errors and ensure accurate diagnoses. “What if an AI system gets it wrong? Who gets sued?” Hersh asked, underlining the legal and ethical questions that remain unanswered as AI adoption spreads.
The bill is only beginning to move through the Oregon Legislature, but the growing debate over AI in healthcare highlights the need for clear regulations governing the use of technology in patient care. As Rep. Nelson noted, “By the time the session ends, we’ll have taken another big leap on the AI front,” pointing to the rapid pace at which AI is evolving and the importance of acting in time to safeguard the healthcare workforce and patient welfare.
As the session progresses, lawmakers will continue to evaluate the bill and its potential impact on both the healthcare workforce and the future of AI in medical settings. For now, Oregon’s House Bill 2748 represents a significant step toward addressing the challenges AI poses in healthcare and keeping human nurses at the heart of patient care.