Thursday, July 28, 2011

Preserving EHRs: Time to Worry?

Health Data Management Magazine, 06/01/2011


With meaningful use taking up all the top slots on the national EHR to-do list, record retention and preservation don't even make the first page: data storage is so cheap, the popular thinking goes, that we'll just keep everything and worry about it later. But Milton Corn, M.D., deputy director for research and education at the National Library of Medicine, thinks we should worry about it now.

He's been worrying about it since shortly after the American Recovery and Reinvestment Act allocated billions of dollars for EHRs and he first began to consider the torrents of electronic medical data that will result.

What should be kept? For how long? What storage methods should be used, and will they be vulnerable to technological obsolescence?

How can we ensure that the trove of information locked in the records can be analyzed by researchers without compromising patient privacy?

"I think it's a rich issue and the discussion has just gotten started," Corn says. "I would like it if every hospital and physician's office started giving some thought to what they're going to do."

To that end, he organized a workshop in April, held at the NLM and co-sponsored by the National Institute for Standards and Technology, the Department of Veterans Affairs, and the National Archives and Records Administration.

It attracted more than 90 attendees and identified some basic issues that all providers will have to deal with eventually. (See sidebar, below.)

"Our data will change formats and media many times," says Mark Frisse, M.D., professor of biomedical informatics at Vanderbilt University, who spoke at the workshop. "The question is, what's the cost of ownership and what is its real value? Do we need data on every American, or is it better to have really intensive data on 500 people or 1,000? Archivists must make these decisions in the here and now."

Legally, medical record retention requirements haven't changed with the advent of EHRs, and few EHR users have had their systems long enough for the records to have aged beyond statutory limits.

Idiosyncratic

Each state has its own idiosyncratic requirements, often mirroring its statute of limitations for filing malpractice claims. Tennessee requires records to be retained for 10 years after the last patient contact, Virginia for six.

North Carolina has a retention requirement of 11 years for hospitals but none for physician offices. Colorado requires pediatric records to be retained for 10 years after the patient reaches the age of majority.

The Cost Factor

In any case, state retention requirements are a minimum, and many providers, especially research institutions, are loath to throw anything away when the price of electronic storage is so low. John Halamka, M.D., CIO of Beth Israel Deaconess Medical Center in Boston, calls the cost "insignificant."

BIDMC uses a tiered storage system so that infrequently accessed information can be stored at a lower cost.

Halamka showed workshop attendees a detailed analysis of his data retention costs, which range from 34 cents to 89 cents per gigabyte per year, depending on whether it's replicated and how quickly accessible the information is.

The average per-patient-per-year cost ranges from 5 cents to $1.89, depending on the type of data stored and how quickly it needs to be accessed.
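A rough back-of-the-envelope calculation shows how per-gigabyte rates like these translate into per-patient costs. The sketch below uses the rates Halamka quoted; the per-patient data volumes are hypothetical assumptions chosen only for illustration, not BIDMC figures.

```python
# Back-of-the-envelope storage-cost model. Per-GB rates are the figures
# Halamka cited; the per-patient data volumes are hypothetical assumptions.

TIER_RATES = {             # dollars per gigabyte per year
    "archive": 0.34,       # single copy, slower retrieval
    "fast_replica": 0.89,  # replicated, immediately accessible
}

def annual_cost_per_patient(gb_per_patient, tier):
    """Yearly storage cost for one patient's data in a given tier."""
    return gb_per_patient * TIER_RATES[tier]

# Hypothetical patient profiles (data volumes are assumptions).
profiles = {
    "text-only chart, ~0.15 GB, archive tier": (0.15, "archive"),
    "chart plus imaging, ~2 GB, fast tier": (2.0, "fast_replica"),
}
for label, (gb, tier) in profiles.items():
    print(f"{label}: ${annual_cost_per_patient(gb, tier):.2f} per year")
# Prints roughly $0.05 and $1.78 -- in line with the 5-cent to $1.89 range above.
```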

"We save everything forever," says Mary Ann Leach, CIO at Children's Hospital of Denver, which has had a full EHR for about four years and has scanned its paper records so that all information since 2000 is now electronic. "Once we get meaningful use and health information exchange and certification standards put to bed, data retention is next," she adds.

Not only does a pediatric hospital have to save all of its records longer than usual, but many of its patients have chronic conditions and medical records that, were they on paper, would be several inches thick. Add to that a lively research community that never knows exactly what it's going to want, and it's a recipe for massive data accumulation.

Easy to Navigate


"The challenge is to maintain it so that it's easy to navigate for the physician," Leach says. "How do we synthesize this data so they know at a glance what's happening?"

Leach says she doesn't think storage media, in themselves, will pose a barrier, even if they have to be swapped periodically due to changing technology. Children's uses a variety of storage media already.

The issue is accountability for the data, protecting it from breaches and inadvertent releases, and managing required releases. "The more records we have, the more we have to manage," she says.

The "keep everything" principle won't be embraced in all hospitals, says Todd Richardson, CIO at Deaconess Health System, Evansville, Ind., which is currently doing a big-picture analysis of its data preservation needs.

Different Perspectives

"The people in the CFO's office ask why we need to keep it, and the people at the nurse's station want to have everything and don't see the cost of data storage," he says. His priorities are maximizing the efficiency of storage, minimizing the risk of duplication, and making sure everything can be retrieved as quickly as it needs to be.

"You have to make business decisions without affecting patient care," he says. "At some point information gets stale-what good is a 15-year-old EKG? -but it's not easy to pick what you're going to delete, so we haven't made a concerted effort to purge anything."

An Onslaught

Many medical innovations, from 64-slice CT scans to inexpensive gene sequencing, create massive new accumulations of data. Lynn Vogel, CIO of MD Anderson Cancer Center in Houston, is on the front lines of the data onslaught, and presented some of its conundrums at the NLM workshop.

His institution wants to sequence the genomes of 30,000 patients a year, to help determine which types of cancer treatments will be most effective for them. A complete genomic sequence for one person can take up 30 terabytes of storage space.

While the information that's of immediate and known use can be boiled down to a couple of gigabytes, Vogel says genomic researchers have their eye on all that raw data, which may contain answers that will be lost if only the condensed version is preserved.
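The arithmetic behind that trade-off is stark. A quick sketch, using only the figures cited above:

```python
# Scale of the stated goal, using only the figures cited above:
# 30,000 genomes per year, ~30 TB of raw data each, condensed to roughly
# a couple of gigabytes of immediately useful results per patient.

PATIENTS_PER_YEAR = 30_000
RAW_TB_PER_GENOME = 30         # raw sequencing output
CONDENSED_GB_PER_GENOME = 2    # "couple of gigabytes" of distilled results

raw_pb_per_year = PATIENTS_PER_YEAR * RAW_TB_PER_GENOME / 1_000              # TB -> PB
condensed_tb_per_year = PATIENTS_PER_YEAR * CONDENSED_GB_PER_GENOME / 1_000  # GB -> TB

print(f"Raw data retained:       ~{raw_pb_per_year:,.0f} PB per year")
print(f"Condensed data retained: ~{condensed_tb_per_year:,.0f} TB per year")
# ~900 PB/year raw vs. ~60 TB/year condensed -- a factor of roughly 15,000.
```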

"Personally I'd like to keep it, because three or four years from now, scientists might come up with a new way to look at the data," Vogel says. "We understand a lot about breast cancer that we didn't understand two years ago."

It's possible, he adds, that gene sequencing will become so cheap that rather than storing the data, providers will choose just to redo the sequencing as needed.

In addition to the growing store of genomic data, Anderson has a billion images, which Vogel thinks is the largest accumulation in the country. The institution uses a "private cloud," combining three petabytes of storage with an 8,000-processor supercomputer, to provide speedy data access for clinicians and researchers.

A Quaint Idea?

With so many users and agendas for Anderson's clinical data, the whole idea of a "patient record" seems almost quaint, Vogel says.

Different combinations of data are useful to different people: while the joint surgeon wants to see a few days' worth of data relevant to a patient's upcoming knee surgery, the patient's primary care physician may want to look at blood tests going back several years to see how well controlled his diabetes is.

And a researcher may want to analyze the same series of test results from a thousand patients at once, to see which treatments worked best for a given condition.

Vogel worries that even the EHRs currently being installed, which are designed to look like electronic versions of the familiar paper medical record forms, cling too closely to the manila folder construct and won't be flexible enough to cope with varying demands.

"Health I.T. is pretty much stuck in the year 1990 in terms of architecture and vision," he says. "I worry that we have built systems that aren't amenable to change."

Value vs. Privacy

Be that as it may, the more data the better, says Vanderbilt's Mark Frisse. His workshop talk drew parallels between a medical record and the fossil record.

One bone is moderately interesting; an entire skeleton is a major paleontological discovery, and a sample of dinosaur DNA could lead to Jurassic Park.

Likewise, a single patient's medical information is of value mainly to that patient, and that value decreases over time, plummeting to zero at death. But its value to the research community could grow exponentially, even long after the patient's death, if it's combined with the information of millions of other patients to offer a more complete picture of a disease process or the effectiveness of a treatment.

The ability to compile such massive resources hinges on patients being comfortable with the use of their data.

Vanderbilt maintains 85,000 "anonymized" sets of medical data that researchers can use.

Not only have names been removed, but the records have been examined for any nuggets of data that could be used to identify the patient, and those have been changed.

If a patient lives in a sparsely populated ZIP code, the code has been changed. If only a couple of patients received a given treatment on a given day, their treatment dates have been changed (though the time relationships between treatments have been preserved).
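A minimal sketch of the date-shifting idea: every date in a patient's record is moved by the same random offset, so individual dates are obscured but the intervals between treatments survive. The function name and offset range below are illustrative assumptions, not Vanderbilt's actual method.

```python
# Minimal sketch of interval-preserving date shifting for de-identification.
# Function name and offset range are illustrative assumptions, not
# Vanderbilt's actual implementation.

import random
from datetime import date, timedelta

def shift_patient_dates(treatment_dates, max_shift_days=180):
    """Shift all of one patient's dates by the same random offset.

    Individual dates are obscured, but the spacing between treatments
    (the clinically meaningful time relationships) is preserved.
    """
    offset = timedelta(days=random.randint(-max_shift_days, max_shift_days))
    return [d + offset for d in treatment_dates]

original = [date(2010, 3, 1), date(2010, 3, 15), date(2010, 6, 1)]
shifted = shift_patient_dates(original)

intervals = lambda dates: [b - a for a, b in zip(dates, dates[1:])]
assert intervals(original) == intervals(shifted)  # intervals survive the shift
```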

Norms Needed

Frisse says society needs to develop some norms about preservation and use of medical data, so that every patient has a "living will" for his or her information.

Some may want their data destroyed on schedule, while others will be "information altruists" who allow use of their data immediately and indefinitely. Some may want theirs preserved for family use but not for research.

Next of kin should be allowed to make decisions about their deceased relatives' information.

However, all that effort will be for naught if EHRs don't do a good job of collecting, analyzing and retrieving information. "Our primary focus should be on making health information available for patient care later today," he says.

EHR Preservation Issues

The recent preservation workshop at the National Library of Medicine wasn't designed to come up with definitive answers about how to preserve a growing mass of electronic health records, but it did an excellent job of framing the questions. The 90-plus attendees each gave their ideas on two key questions. Here's a summary of their responses.

Question 1: What is the single most important thing that needs to be done to enable preservation and reuse of an EHR?

* An industry- and government-wide definition of what an EHR is and how long it should be preserved

* Agreement on the objectives of preserving the EHR

* Determination of whether all data in the EHR should be preserved and what representations of the data should be preserved

* Establishment of universal standards for preservation of EHR data

* Determination of system requirements and functionality necessary to preserve the designated data

* Identification of what metadata needs to be preserved with the records

* Documentation of the preservation process, testing to confirm that preserved data can be retrieved, and implementation of access controls for archived data

Question 2: What is the single greatest obstacle to preservation and reuse of an EHR?

* The need for a comprehensive and consistent set of standards

* Lack of foresight

* Resistance to change

* Insufficient dialogue among different stakeholders

* Uncertainty concerning what should be preserved and how

* Failure to articulate policies that show how preservation works and are consistent with other requirements such as privacy and security

* Rapid changes in technology

* Inadequate approach to managing technological obsolescence

* Uncertainty over what will work in the long run

* The impact on security mechanisms


Source: NIST
Elizabeth Gardner
