Although computers now play a substantial role in health care, from patient check-in through the delivery of care itself, this has not always been the case. Computers first appeared in health care settings in the 1960s, part of a broader diffusion of information technology into the workplace during that period.
Computers were initially used in hospitals, primarily to organize information related to administrative and fiscal functions. At the same time, they were also applied to the clinical delivery of care: it was found that computers could not only store information but also help clinicians interpret clinical results and keep track of patients' medications and disease course. The use of computers in the clinical space spread quickly as organizations moved to create entire systems that could be customized across different settings: for example, the HELP system was created at LDS Hospital in Utah, Massachusetts General Hospital created the COSTAR system, and Duke University developed the TMR system.
Overall, the introduction of computers into health care settings, particularly hospitals, in the 1960s set the stage for the rapid evolution of information technology in both the administration of health care organizations and the delivery of treatment. With the basic IT infrastructure in place by the 1980s, health care organizations could focus on more sophisticated interfaces and applications, including the first electronic individual health records and patient tracking and diagnostic systems.