Electronic Health Record Errors Are a Serious Problem
Many were shocked by the 1999 Institute of Medicine report, which estimated that as many as 98,000 American patients died each year due to medical error and a further million suffered injuries. The report, entitled “To Err Is Human,” suggested that electronic medical record keeping might in fact reduce the levels of error and thus prevent harm to patients.
In some respects, the adoption of electronic health record (EHR) systems has delivered on that promise. Paper records are messy and difficult to keep track of — and transferring them between institutions is a nightmare of complexity.
Digitizing electronic medical records (EMRs) — the data that EHRs manage — has certainly ironed out some of the inconsistencies and failures of communication that occurred prior to their introduction. (The terms EMR and EHR are often used interchangeably and here we will refer to all electronic health data as EHRs.)
But despite incremental improvements in safety performance over time, EHRs still pose significant patient safety threats.
The first electronic systems for tracking patient data were developed beginning in the 1960s. The Regenstrief Institute introduced a pilot system in 1972 and by the 1990s many larger medical institutions had computerized some aspects of their recordkeeping systems. The Health Insurance Portability and Accountability Act (HIPAA) was passed in 1996, further facilitating adoption of secure practices and underscoring the urgency of electronic recordkeeping.
Wide adoption of EHRs did not occur until the early 2000s in the United States, following the introduction of the Office of the National Coordinator for Health Information Technology in 2004. The HITECH Act of 2009 urged providers to adopt electronic record keeping by 2014, with incentives provided through Medicare and Medicaid. As of 2021, 78% of office-based providers and 96% of non-federal acute care hospitals had adopted some form of EHR.
Since then, most American medical patients have become accustomed to the ensuing rigmarole. An administrative assistant, a nurse, or a doctor stands in front of a machine, reading out computerized prompts and entering your responses. Even when patients are not actively participating, staffers are busily dumping your information into these systems: tapping in surgical results, medication recommendations, physical therapy orders, and thousands of other types of data.
But what happens when the wrong data is entered? Sometimes these clerical errors are inconsequential and will never be noticed. And sometimes they can result in life-threatening events. The introduction of EHRs, for all its benefits, has introduced a new set of risks. One researcher dubbed the errors resulting from their use “e-iatrogenesis,” a play on iatrogenesis, the term for harm caused by medical treatment.
These vulnerabilities align with frightening statistics on the rise in cyberattacks against healthcare institutions. These systems are practically oozing data, which is not lost on those who would exploit it.
Some 88% of healthcare institutions were hit with cyberattacks, according to a 2023 Ponemon Institute report. The healthcare sector is the top target of ransomware attacks, according to the FBI. In 2018, health insurer Anthem paid a $16 million fine after cyberattackers stole the data of nearly 79 million patients three years earlier — the largest such breach yet recorded. While much of this data is simply held until payment is made to the hackers, the fact that it is accessible at all is cause for alarm.
I found out firsthand how permeable these systems are. In an alarming incident earlier this year at a Northwestern Medicine outpatient center in Chicago, I discovered just how easy it is for one patient’s data to be entered into another’s record.
While thankfully this did not result in deleterious effects to me, the vulnerability it exposed suggests that electronic health records often lack basic security protocols, a gap that could lead to all manner of catastrophe. Indeed, the literature on the subject suggests that electronic health records are riddled with such vulnerabilities. And, as I found out, the widely touted HIPAA regulations that we all think protect our private health data are in fact tenuous and weakly enforced.
Here, I investigate the subject for InformationWeek. During my research, I talked to Darren Ghanayem, managing director of AArete, a healthcare consulting firm. Ghanayem brings experience from his tenure at such companies as WellCare and Anthem, Inc. (now Elevance). I also spoke with Deven McGraw, chief regulatory officer for consumer-health technology start-up Citizen Health. She previously served as deputy director of health information privacy at the Health and Human Services (HHS) Office for Civil Rights (OCR) and chief privacy officer (acting) of the Office of the National Coordinator for Health IT.
I have included further insights from two people who shared my experience — the introduction of erroneous information into their healthcare records or those of relatives.
Northwestern Medicine, Epic Systems, and the OCR, which deals with HIPAA complaints, declined to answer the majority of my questions.
Security Gaps in a Major Electronic Health Record System
On March 5, 2024, I checked in for my annual physical at Northwestern Medicine’s Old Irving Park location in Chicago. Northwestern Medicine is a major healthcare provider in the Chicagoland area: It operates 10 hospitals and some 200 outpatient facilities. The facility where I received care opened on November 1, 2023. When I walked in, it was nearly vacant — a typically sterile, institutional building filled with long stretches of padded furniture abutting large windows overlooking a scruffy stretch of Irving Park Road. I was quickly called in to the examination room, asked a standard battery of questions by a nurse and left to wait for the doctor.
When he entered, he probed me for additional details — more thoroughly than many primary care doctors that I had encountered, which I appreciated. We quickly worked our way through my medications, my ongoing issues, and finally arrived at my surgical history. At that point, one of his questions stopped me short.
“You had retinal detachment surgery,” I recall him asking. “How is that doing?”
Retinal detachment occurs when the retina, the nerve tissue containing the receptors that perceive light and allow our brains to form images, pulls away from the back of the eye. It has never happened to me. However, it did happen to my father, with whom I share a first and last name, in 2017. He underwent surgery to correct it at Wheaton Eye Clinic, in the Chicago suburbs, and recovered in reasonably short order.
Odd, I thought. I have never visited a Wheaton Eye Clinic location and have never had a serious vision problem that required surgery. The person who interviewed him, I would learn, had inadvertently entered his information into my record. I wondered how they might have had access to my EHR in the first place.
As it turns out, Northwestern Medicine shares a medical record system, provided by Epic Systems, with Wheaton Eye Clinic. As Rayan Venkatesh, a compliance officer for Northwestern Memorial Healthcare, explained to me by email, Northwestern Medicine and Wheaton Eye Clinic had entered into an Epic Community Connect Services Agreement, which allowed the clinic to utilize Northwestern’s recordkeeping system.
Deven McGraw, CRO, Citizen Health
“Licensing their systems to other organizations provides incentive for these little external places to refer their more serious patients into Northwestern because it’s all part of the same system,” McGraw suggests. “Measures were passed to enable people to provide access to health information technology for these clinics without violating anti-kickback or fraud and abuse laws. That’s a significant benefit to that little eye clinic to not have to pay for their own system. They’re paying for Northwestern’s system as opposed to paying Epic to build a system for them.”
According to its website, Epic Systems software hosts some 305 million medical records.
Because I had a pre-existing record with Northwestern, the nurse or administrative staffer who did my father’s intake selected my record and simply typed his information into it. There were no programmatic protocols requiring a birthdate, Social Security number, or driver’s license number in order to access my record. A name simply popped up and the staffer selected it. I asked why these safeguards were not in place.
“The applicable safeguards that you mention here are largely administrative (e.g., policies, procedures, training) re: verifying identity,” Venkatesh said. So, the medical record system itself does not provide any barriers to access. It is incumbent upon the staff member to verify that the patient is who they say they are.
“Epic publishes best practices for correct patient identification. These include using the most specific identifiers available such as phone number, e-mail address, mailing address, and date of birth,” an Epic spokesperson told me in a prepared statement. “Each organization chooses their own patient identification requirements.”
“With respect to record access across Epic customers, each organization owns its data and controls how it is used and shared,” the statement continued, suggesting that the responsibility lay solely with the practices observed by Wheaton Eye Clinic and Northwestern Medicine.
This is particularly alarming given the rise in breaches of small medical practices — at least according to one 2022 survey, almost a quarter had experienced a breach and nearly half attributed the breach to human error. The same report found that more than 40% of respondents spent no more than two hours on security training and half had no incident response plan in place.
“Anyone in this industry would have been subjected to mandated annual HIPAA training,” Ghanayem says. “It reinforces some of the basic principles about authorization, about what private healthcare information is, how you need to validate who you’re speaking with and what the ramifications for violations are.”
In this case, whatever lessons were imparted to that employee did not stick.
The Weak Enforcement of HIPAA
In the wake of this discovery, I filed two complaints with the Health and Human Services OCR.
I first registered a complaint against Wheaton Eye Clinic, as their staffer was the one who made the error. The OCR declined to pursue the issue, claiming that they would offer what they referred to as “technical assistance” to the clinic. I then followed up with a complaint against Northwestern Medicine. The OCR again concluded that “technical assistance” was the best course of action.
While HIPAA is widely perceived as being a set of privacy regulations, those regulations are in fact not the main gist of the legislation. It is worth emphasizing that the ‘P’ in HIPAA stands for portability, not privacy. The act, while it included provisions for protection of private information, was largely intended to smooth the way for the transfer of insurance when people changed jobs and to facilitate the transmission of their healthcare information.
According to data from April 2024, the OCR has received 358,975 HIPAA complaints since the implementation of the Privacy Rule in 2003. Only 1,188 resulted in compliance reviews. And a mere 145 cases ended with the issuance of monetary fines — though the total fines collected are a seemingly impressive $142,663,772.
In rare cases, substantial fines have been assessed in single cases. A decade ago, New York-Presbyterian Hospital and Columbia University had to pay a combined $4.8 million to HHS after leaked medical records were found online.
As eyebrow-raising as that total might be, the OCR depends on money from these fines in order to operate, and budgetary constraints appear to be a major factor limiting its ability to enforce and penalize HIPAA violations. The agency has been criticized in the past for its failure to fully investigate violations of the Privacy Rule.
“A small portion of these cases are actually fully pursued. There are judgment calls that get made,” McGraw explains. “Has the organization been the subject of multiple complaints in different areas? Did they previously have some complaints filed? Did they receive a corrective action letter and then didn’t fix the behavior that led to the problem?”
“If you are found to be in violation, you’re going to be audited more often,” Ghanayem concurs. “They’re going to watch you more closely. OCR has to prioritize where they go.”
Intriguingly, HIPAA violations are not, in most cases, a basis for legal action by affected patients. A patient who believes they have been harmed by a HIPAA violation must resort to tort law to bring a private action against the responsible entity. While privacy provisions were originally intended to be written into the law itself, congressional inaction left HHS to write the regulations on its own, as authorized by the act.
“It dawned on Congress at the last minute before they were getting ready to pass the bill. So, they literally put a line in the statute that says we’re going to give ourselves two years to pass privacy legislation. And if we don’t pass any privacy legislation, we’re gonna punt it to HHS and have them develop regulations,” McGraw says. And that is exactly what happened.
Thus, HIPAA’s privacy stipulations cannot be the basis for a lawsuit when harm occurs — they are regulations, not statutes.
“Every now and then you do see a state lawsuit around medical record privacy,” McGraw says. “They file it as a state law action — a violation of privacy claim. They’ll look to the HIPAA provisions as establishing the standard of care.” Even these types of claims have rarely been successful.
If a HIPAA violation involves more than 500 individuals, healthcare organizations are required to notify affected patients and the media within 60 days that their information has been compromised. In cases like that of my father and myself, the regulations are ambiguous. While the vulnerability likely affected thousands of patients — Northwestern declined to disclose how many patients it serves and how many organizations it licenses its recordkeeping system to — the incident itself only concerned two people.
The Prevalence of Wrong Patient Errors
In my own case, no adverse events occurred as a result of the lack of security in Northwestern’s EHR system. But my first thought was that it very easily could have if the situation were even slightly different. Many people share the same name — and they may not even be related. How many of them have had their information entered into others’ records or vice versa?
Unsurprisingly, this type of event is not unprecedented. Up to 20% of patients may not be correctly matched to their records, according to one survey — and 20% of CIOs in the same survey said they believed patients had been harmed as a result. Patient safety advocates are acutely aware of this issue — in some cases, because they have encountered it themselves.
“When my father passed, I asked for a copy of his records from his primary care doctor. Included in my dad’s records were the colonoscopy results for another patient. Not only did I get her detailed results, but also her name, address, and other contacts,” Marian Hollingsworth, co-founder of The Patient Safety League, tells me. Hollingsworth is a member of the Patient Safety Action Network (PSAN), to which I belong as well.
“That was a surprise,” she confides. I was similarly dismayed when it happened to me, but the literature on the subject suggests we are not alone.
Over an eight-year period, 31 of 182 lab errors in Veterans Health Administration institutions were due to inputting data into the incorrect chart, according to a 2010 study. A 2014 study indicated that, even according to self-reports, emergency room physicians entered information into the wrong charts 1.3% of the time.
That may seem like a small number, but 97% of the respondents reported having entered data into the wrong chart in the preceding three months. The authors suggest that a highly visible watermark depicting the patient’s room number might cut down on these errors.
A 2016 study found that of nearly 12,000 patients put under anesthesia, 57 had their information put into the wrong charts. In a 2020 study, 6.5% of patients found that their records had been mixed up with those of other patients. And some of these had serious implications — one patient who was undergoing an organ transplant found that their records had been swapped with someone who had the same name. Another found that someone else’s lab results indicated that they should change medication. And a third discovered that they had been inaccurately characterized as a patient with a history of binge drinking and gastric cancer.
The Emergency Care Research Institute compiled a litany of wrong patient errors in a 2016 paper as well.
Though the consequences of these errors are not clear in much of this research, other instances suggest that in some cases they may be grave. A 2011 study of a four-year period at an Australian hospital found that of 487 misidentification incidents, 25.7% involved the administration of medication to the incorrect patient and 15.2% saw a procedure performed on the wrong patient. Another 2011 study found that of 101 adverse events during surgeries in the VHA, 30 resulted from operating on the wrong patient.
Other Vulnerabilities of EHRs
EHR errors occur for a variety of reasons, from poor safeguarding measures that fail to prevent the person entering the data from accessing the incorrect record to simple interface difficulties.
For example, “adjacency errors” lead doctors and other professionals to select the wrong patient, medication, or procedure simply because dropdown menu items sit in close proximity to one another. Even inconsistency can pose a major problem — changes in the location of the button that closes out a record, for example.
Doctors often copy and paste information rather than enter it anew, meaning that it may not be updated as needed or may not include all relevant data. According to one study, a majority of notes contained at least 20% copied information. The fields used to upload documents may also lack verification, sometimes resulting in the wrong documents being added to a patient’s record.
Other usability issues related to navigation, workflow, and cognitive overload can lead to the introduction of errors too.
These types of documentation errors make up 72% of EHR-related liabilities, according to a 2020 report on a decade’s worth of malpractice claims. Another study found that 61% of diagnosis-related claims were due to EHR errors of multiple types.
EHRs vary widely in quality — and in age. Plenty of organizations are limping along with systems that are decades old.
Darren Ghanayem, Managing Director, AArete
“A modern new electronic medical record system is going to have security standards,” Ghanayem says. “They’re going to have specific HIPAA-compliant logins and credentialing and role management. But that’s not the world that we live in. In the real world there are systems that were bought 20 years ago, 10 years ago, or five years ago.” Our EHRs thus drift through a landscape of widely variable protections. Sometimes our data is entered correctly and zipped up tight. And sometimes it just isn’t. It depends on which systems it passes through and who is operating them.
Confusion created by EMR systems has led to all manner of alarming discrepancies and actual adverse events even when patients are correctly identified. A Swedish study discovered that 84% of medication records in a group of primary healthcare providers were not up to date — on a single day. Another study found that over half of patient records had medication discrepancies and that 39% of those had the potential to cause harm upon hospital admission. An earlier study found that 60% of inpatients had medications missing from their charts. And a Pew report lists a variety of pediatric medication errors caused by EHR systems — missed or inaccurate doses and allergies among them.
One woman’s cervical cancer treatment was delayed by two years because the EHR system used by her provider failed to notify her of abnormal pap smear results — and her doctors did not notice those results on subsequent visits. In another case, doctors confused orders of potassium chloride infusions for a patient who had a potassium deficiency, leading to an overdose.
And that is just a sampling of what happens with EHRs internally. Records have been stolen from devices left unattended — including one case in which some 365,000 records were obtained from disks sitting in a staffer’s vehicle. In another case, an employee who had been terminated from a Colorado hospital did not have their access to a scheduling calendar revoked and was thus able to access healthcare information of 557 patients. The event resulted in a fine of more than $100,000.
The exposure of healthcare records, in even minor ways, leaves patients highly vulnerable. “I never reached out to this woman [whose records were entered into my father’s], but I had all her contact information. I could have gone to her house and handed her the copy of the results I had found in my dad’s records,” Hollingsworth says.
Barbara Grau, another PSAN member, described her difficulties getting inaccurate information in her EHR corrected. “My health record prior to March 2019 suddenly disappeared, went black. I couldn’t retrieve it, so I contacted a tech support person responsible for EHRs at the medical facility to get help with the problem,” she explains. “He informed me of the mix-up and stated he would flag my account to make people aware that the mix-up occurred and what they need to do to prevent it from happening again. He also restored my access to all of my EHR.”
Yet the problem persisted. Further doctors’ visits revealed that the record had not actually been corrected. Information about her health history remained inaccurate. She filed HIPAA complaints and contacted an attorney. Neither recourse was effective: the OCR declined to pursue the complaints, and the attorney was not able to help her resolve the issue. According to Grau, the problems persist to this day.
Even the wealthy and famous have encountered breaches of their medical privacy — three staffers are accused of having accessed the records of Kate Middleton, the Princess of Wales, despite having no involvement in her treatment for cancer, for example.
According to data compiled by The HIPAA Journal, an estimated 140 million healthcare records were exposed in breaches in the US alone in 2023 — underscoring the vulnerabilities revealed by my own experience. If a careless employee could pull up my record simply by typing in my name during the routine entry of another patient’s information, what could a motivated attacker accomplish?
Clearinghouses of medical data, which transmit it to insurance companies, pose a particular exposure threat, Ghanayem warns. The data is most vulnerable while it is in transit — and remarkably easy to intercept along the way.
“A clearinghouse has to reformat [the data] into a consumable form. Then another system grabs it to actually tear that data apart and put it back together,” Ghanayem says. The data thus bounces between the clearinghouse, the provider, the payer, and the patient. “You’ve got all of these different touch points where that data can be exposed and can be vulnerable.”
Data aggregators pose a further risk. These organizations may collect deidentified data to perform analyses on population-level health issues for both healthcare organizations and insurance companies. “Are they following the same security standards that we follow in the health care transaction world?” Ghanayem asks. “I don’t know.”
Improving EHR Safety
Both the design of EHR platforms and their use are in need of substantial improvement. Building in safeguards — such as image verification and the mandatory use of fields for identifying information such as birthdates or SSNs — may substantially reduce patient misidentification. And more intuitive, less distracting interfaces may cut down on the entering of inaccurate information.
Clear distinctions between important information fields must be made to cut down on adjacency errors. Concise patient summaries at the beginning of each record and usable search features may increase usability and decrease frustration that leads to the introduction of errors. And refining when alerts are issued can decrease alert fatigue, which may lead providers to simply ignore alerts even when they are valid.
Further, reporting systems must be in place to flag errors when they occur and to forward those reports both to staffers and to the systems’ vendors.
Some have proposed unique patient identifiers (UPIs) as a potential corrective mechanism. “Congress actually authorized the creation of a unique ID for every patient,” McGraw says. But HHS has never implemented them: since 1999, a provision attached to a funding bill has barred the agency from doing so, amid disagreement over how the identifiers would be assigned and monitored. Opponents argue that UPIs could themselves infringe on privacy. The prohibition remains in place despite wide support among medical organizations, which cite patient safety issues and the high costs associated with duplicate and inaccurate records.
As legislators dither and the private sector struggles to maintain the privacy of EHRs on its own, patients pay the price. More recent research has suggested that the number of deaths due to medical error may be as high as 440,000 — significantly higher than the Institute of Medicine’s 1999 estimate. EHRs almost certainly factor into many of those errors.
The privacy of health data has long been sacred in medicine — it is referenced in various translations and revisions of the Hippocratic oath, attributed to a Greek doctor from the fourth or fifth century BC. In the digital age, these ancient principles are more important than ever.