IT may not be a panacea yet, but most healthcare experts agree with IHA's Williams that technology is critical to solving the nation's healthcare woes, which are reflected primarily in skyrocketing prices and clinical care that is inconsistent at best.
In the past few months, while promoting a ten-year plan to modernize the U.S. healthcare system, Secretary of Health and Human Services Tommy Thompson has repeatedly complained that the nation's medical information system is stuck in the early 1900s, in the age of manila folders. Thompson argues that electronic record systems would improve patient care, reduce dangerous medical mistakes and, in the process, cut the nation's $1.6 trillion-a-year healthcare bill by at least 10 percent.
The lack of computerization is at least a symptom, if not a cause, of the dismal shape of the U.S. healthcare system.
While most industries spent the 1980s and 1990s installing computers and adopting leaner, customer-targeted operational systems, the healthcare industry lagged woefully behind.
In 2004, healthcare companies spent about 3 percent of revenue on IT, compared with about 5.5 percent for financial services firms, according to META Group Inc. That gap was even more pronounced a decade ago. By 2002, only about 13 percent of hospitals and 28 percent of physicians' offices had electronic records systems, according to a recent HHS report.
Among healthcare companies, medical plan insurers have been the most enthusiastic users of IT, in part because they are as much financial services firms as medical companies.
Chiefly, they have built electronic claims networks to automate the reimbursement and payment process with the goal of squeezing labor and other administrative costs out of their payment systems and directing these savings straight to their bottom line. And they've done a fairly good job of that, as evidenced by the recent spate of improved insurance company profits and margins.
But in the grand scheme of the nation's massive healthcare system, these claims networks are the equivalent of stand-alone devices, lacking permanent links between insurers and healthcare providers that could deliver real-time information on a patient's clinical history and plan coverage at the doctor's office or the hospital bedside. As a result, they've done little to improve the level of treatment patients receive.
In fact, while hospitals, doctors and insurers have dawdled, medical care in the U.S. has suffered. In most industries, improvements initiated by any player—a real-time, point-of-sale database system at a retailer, for instance, or a supplier's factory automation effort—cascade throughout the supply chain, lowering costs, advancing efficiency and quality, and ultimately offering customers better products and service.
But in healthcare, the relationships among companies are more adversarial than collaborative. As a result, an initiative by one player tends to bloat the total system rather than streamline it, and to further complicate patient care rather than simplify it.
For instance, although hospitals have generally avoided implementing computerized clinical records, they have freely invested in advanced diagnostic technology, such as million-dollar-plus magnetic resonance imaging machines. These purchases are justified because MRIs are a doctor-magnet—physicians prefer to work in hospitals with the most up-to-date equipment—and the machines pay for themselves since hospitals can charge a high price for these procedures.
In 2001 alone, new clinical technologies accounted for more than 20 percent of the increase in healthcare costs, according to PricewaterhouseCoopers. Some of these higher costs are, of course, acceptable, because these procedures can save lives. But a large segment of the increase is the result of physicians recommending high-tech options when they aren't clinically warranted.
Source: CIO Insight, 2004-11-01