Biomedicine

A Digital Health-Care Revolution

Twenty billion dollars might finally turn the U.S. health-care system digital.

The more wired the hospital, the better off its patients: there are fewer deaths and complications, and lower bills. That’s the conclusion of a large study of Texas hospitals released earlier this week. Unfortunately, only a small percentage of hospitals and doctors’ offices in the United States are wired, and the country lags far behind other developed nations in implementing such systems. However, legislators and health-technology specialists hope to change that with a $20 billion cash influx, part of the U.S. government’s proposed stimulus bill.

Dubbed the Health Information Technology for Economic and Clinical Health (HITECH) Act, the plan would encourage doctors and hospitals to adopt electronic record-keeping and ordering systems by providing $18 billion in incentives through Medicare and Medicaid reimbursements. Starting in 2011, physicians who show that they are “meaningfully” using health IT would be eligible for $40,000 to $65,000, and hospitals would be eligible for several million dollars. The incentives would be phased out over time, with penalties in place by 2016.

The bill allocates $2 billion over the next two years for planning and training, including ensuring that new programs adhere to specific interoperability standards. That will be crucial in making certain that data can be transferred between different medical centers and physicians, and that doctors are schooled in how to incorporate electronic record keeping and other technologies into their practices. It would also strengthen privacy and security laws to protect the growing amount of personal medical information that will become electronic.

Currently, fewer than a quarter of physicians in the United States use electronic health records (EHRs). The stimulus spending should help overcome two of the major barriers to adoption: lack of funding and misaligned incentives, says John Halamka, chief information officer and dean for technology at Harvard Medical School. Today, doctors must invest time and money to implement EHR systems, but it’s the insurers and payers who ultimately benefit, thanks to a reduction in unnecessary tests and medications.

The $20 billion boost will be a huge leap for an industry that has seen little government spending. According to a 2006 study, the United States spends 43 cents per capita on health-care IT, compared with the $193 per capita spent in the United Kingdom. The entire health-care IT industry had an estimated budget of $26 billion in 2008, says Halamka. He reckons that the bill could create 50,000 new IT jobs. “We’re not talking about MDs or PhDs,” says Halamka. “I think we can take tech professionals and train them in health care within the next two years.”

While the United States faces different challenges from other nations that have gone digital (the United Kingdom, Canada, and Scandinavia all have nationalized health care), experts say that it still has lessons to learn from those programs. The United Kingdom spent $13 billion on a highly criticized health-care IT transformation. One problem with that program was a lack of physician participation in the early stages. “They didn’t engage local physicians or governments,” says Halamka. So states with little existing health-care IT infrastructure may need extra help in setting up their programs, he says.

Physician education will also be paramount, says David Bates, chief of the Division of General Medicine at the Brigham and Women’s Hospital, in Boston. “It’s important to make sure that providers are using the system, and that we have approaches for finding people having difficulties,” he says. Bates notes, however, that the United Kingdom is doing something right: only three general practitioners there are not currently using electronic medical records.

A study published this week in the Archives of Internal Medicine suggests that broad adoption of IT systems may provide significant health benefits for patients. Researchers at the Johns Hopkins University School of Medicine, in Baltimore, rated clinical information technologies at 41 hospitals in Texas and compared those results with discharge information for more than 160,000 patients. Technologies assessed included electronic note taking, treatment records, test results, drug orders, and decision-support systems that offer information on treatment options and drug interactions. The researchers found that hospitals that rated highly on automated note taking had a 15 percent decrease in the odds that a patient would die while hospitalized. Hospitals with highly rated decision-support systems also had 20 percent lower complication rates. Researchers found that electronic systems reduced costs by about $100 to $500 per admission.

The findings confirm that commercially available health-care IT programs can improve patient care in a diverse range of settings. Most previous studies have focused on just one or two institutions, mostly academic medical centers, which have often developed their own systems. “I think this study really helps justify the stimulus package,” says Bates.

The House of Representatives’ Committee on Ways and Means approved the health-care IT portion of the plan last week, and the House and Senate are now finalizing their versions of the broader stimulus bills. Congress hopes to bring a bill before the president by Presidents’ Day: February 16.
