
Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol



Health information exchange (HIE) is the electronic sharing of data and information between clinical care and public health entities. Previous research has shown that using HIE to electronically report laboratory results to public health can improve surveillance practice, yet there has been little utilization of HIE for improving provider-based disease reporting. This article describes a study protocol that uses mixed methods to evaluate an intervention to electronically pre-populate provider-based notifiable disease case reporting forms with clinical, laboratory and patient data available through an operational HIE. The evaluation seeks to: (1) identify barriers and facilitators to implementation, adoption and utilization of the intervention; (2) measure impacts on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE.


The intervention will be implemented over a staggered schedule in one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care. Evaluation will be conducted utilizing a concurrent design mixed methods framework in which qualitative methods are embedded within the quantitative methods. Quantitative data will include reporting rates, timeliness, and burden, as well as report completeness and accuracy, analyzed using interrupted time-series and other pre-post comparisons. Qualitative data regarding pre-post provider perceptions of report completeness, accuracy, and timeliness, reporting burden, data quality, benefits, utility, adoption, utilization and impact on reporting workflow will be collected using semi-structured interviews and open-ended survey items. Data will be triangulated to find convergence or agreement by cross-validating results to produce a contextualized portrayal of the facilitators and barriers to implementation and use of the intervention.


By applying mixed research methods and measuring context, facilitators and barriers, and individual, organizational and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness and captures a greater portion of communicable disease burden in the community.



Health information exchange (HIE) is the capacity to electronically transfer and share health information among health care-related stakeholders and organizations such as clinics, laboratories, payers, hospitals, pharmacies and public health[1]. HIEs promise to reduce health care costs[2], improve patient safety[3], and provide access to more timely surveillance data for public health organizations[4]; however, the success of HIE implementations and their system improvements can depend on the context in which data flow and clinical care workflow are integrated. For example, an investigation of one HIE’s impact on emergency department (ED) charges reported a reduction of approximately $26 per encounter (p = 0.03) at one hospital with no effect on charges at a second independent hospital in the same HIE[5]. Similarly, in a comparison of HIE utilization among municipalities, Maenpaa et al. (2012) reported a difference in system usage between specialized and primary care providers depending on the numbers of ED visits, laboratory tests, radiology exams and appointments[6]. An investigation of HIE usage in EDs and ambulatory clinics also found that system utilization varied by site, patient population and characteristics, specialization of services, and site policies governing use and administrative access[7]. Studies such as these suggest that outcomes resulting from HIE interactions, access and interventions can be context-sensitive.

An unexplored area is the reporting of communicable and infectious diseases, such as pertussis, tuberculosis, salmonella, chlamydia and hepatitis C, among others, to public health through an HIE. Most states utilize a dual reporting structure: mandatory case reporting of a disease by providers and mandatory reporting of test results by laboratories to public health authorities. Conventional provider-initiated paper-based reports, transmitted through fax and mail, have been shown to be incomplete, error-prone and untimely. Report completeness ranges from 9 to 99 percent[8]. Highly prevalent diseases like sexually transmitted infections are reported approximately 79 percent of the time, and many diseases like pertussis and Lyme disease are reported less than 50 percent of the time[8]. Timeliness of reporting ranges from one day to three weeks after diagnosis, depending on the disease[9]. In addition, provider reports often lack demographic details that public health workers need, requiring them to perform follow-up calls to obtain this additional information[10].

In comparison to conventional paper-based reporting, electronic laboratory reporting (ELR) has been successful in delivering more timely laboratory test results to public health[11] and increasing the proportion of notifiable disease cases that are reported to public health[11, 12]. However, implementation of ELR can strain or exceed local investigative capacity by significantly increasing the volume of reported cases to public health agencies[13, 14] and, like provider reports, ELR can lack clinical and treatment details, such as complete patient demographics, vital signs, pregnancy status or prescribed drugs, needed to fully characterize or prioritize a case for public health purposes[15, 16].

Integration of ELR into electronic health record (EHR) systems and HIE networks—i.e., an ELR-EHR-HIE infrastructure—has been shown to connect clinical and public health stakeholders without interrupting existing workflows or adding burden to clinical providers[17]. Given that many states have (or will have in the near future) an infrastructure supporting ELR and HIE[18] and EHRs contain clinical and treatment data often missing from ELR, there is potential to use an integrated ELR-EHR-HIE infrastructure to improve provider-based notifiable disease reporting beyond existing improvements to lab-based reporting.

The “Improving Population Health through Enhanced Targeted Regional Decision Support” study aims to leverage an available ELR-EHR-HIE infrastructure to electronically pre-populate provider-submitted notifiable disease report forms with available clinical, lab and patient data. This intervention has the potential to streamline provider-based reporting workflows, lower barriers to reporting, increase data completeness, improve reporting timeliness and capture a greater portion of communicable disease burden in the community.

We hypothesize that clinics implementing the intervention will effectively incorporate the pre-populated reporting form into their workflow and that providers will report high satisfaction with the intervention. Further, compared to clinics in which the intervention is not implemented, we expect that barriers to reporting will be reduced, timeliness of reporting and completeness of report data fields will improve, and the need for providers to supply supplemental or corrected data to public health will be reduced. The study protocol described here will evaluate, using mixed methods, the implementation of the pre-populated form intervention in the context of an operational HIE.


The evaluation-specific objectives are to assess the implementation of the reporting form intervention at the clinic level to: (1) identify barriers and facilitators to implementation, adoption and utilization of the pre-populated reporting form; (2) measure impacts of the tool on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE.


The intervention will be implemented within one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care (INPC). The INPC is anchored by the Regenstrief Medical Record System which collects data from a variety of sources, including hospitals, clinics, pharmacies, and laboratories[19]. Lab results are routinely delivered to ordering physicians using the Regenstrief DOCS4DOCS® software and analyzed as ELR transactions by the Notifiable Condition Detector (NCD) which identifies mandated reportable diseases and notifies local and state public health departments of these laboratory results[20].


The study will include outpatient primary care practices that are part of the INPC, operating in urban and rural settings and representing multiple health systems and clinics. Clinics that do not use the DOCS4DOCS® software will be excluded.


Two technical interventions will be deployed over a staggered schedule at participating clinics: 1) “standard” pre-populated forms and 2) “enhanced” pre-populated forms. The “standard” forms intervention will use EHR (patient demographics and clinic information) and ELR (notifiable disease test results) data available in the HIE to pre-populate and deliver an electronic version of the official state notifiable disease reporting form to the provider. Providers will be able to review the pre-populated form, add any additional information, and fax completed forms to their local health department. The “enhanced” forms intervention will pre-populate an alternative reporting form with an expanded set of data available in the HIE. For example, the “enhanced” form will not only include test results data for a case of hepatitis B but also corollary results on the patient’s liver enzymes; this is information the health department typically requests from the provider in a follow-up phone call when investigating the reported case of hepatitis. Providers will still be able to review the pre-populated “enhanced” form, add any additional information, and fax completed forms to their local health department. Since deployment is staggered, at any point in time the non-intervention sites can act as natural controls for the intervention sites without the selection bias that is generally present in non-randomized experiments. Therefore, the study protocol is theoretically equivalent in its ability to generate causal evidence to a traditional randomized controlled experiment.

Research questions

Mixed methods studies require that research questions be linked to and drive the data collection and analysis methods, as well as inform the study design, sample size, sampling, instruments developed and administered, and data analysis techniques[21]. Our primary research questions are:

  1. What individual, organizational and data quality factors may act as barriers or facilitators to the successful adoption and utilization of pre-populated reporting forms and enhanced data transaction processing to public health; and

  2. What is the relationship of these barriers and facilitators to fostering improvements in provider-based population health reporting workflows, lowering barriers to reporting and case follow-up, increasing data completeness, and enabling greater capture of communicable disease burden in the community?

Data collection

Using a concurrent mixed methods design, data collection will be conducted during the three project phases: baseline or pre-implementation; post-implementation of the standard form; and post-implementation of the enhanced form. In each phase, qualitative and quantitative data are collected in tandem as coordinated but independent studies. This design will allow us to triangulate the quantitative results from surveys, time-series, and data quality measures with qualitative interview and open-ended survey results to understand experiences with public health reporting before and after each form implementation. Table 1 summarizes the categories of data collected.

Table 1 Summary of study constructs, data collection, analysis approaches and outcomes measurements by method (qualitative, quantitative)

Quantitative data collection

The following data will be collected at baseline (retrospective to 12 months prior to introduction of the intervention) and at 6-, 9-, and 12-months after implementation of both form interventions: reporting rates (the number of reports for individual diseases and in aggregate submitted to public health daily); report data completeness (completeness of fields) and accuracy (errors); reporting timeliness (length of time between the laboratory test date and receipt of the report by public health); treatment timeliness (length of time between the laboratory test date and treatment); and reporting burden, defined as communication volume (number of phone calls or fax communications between public health and clinics/providers or laboratories) and duration (total number of minutes) measured at the level of the individual phone call.
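As a concrete illustration, the timeliness and completeness measures above could be computed along the lines of this minimal Python sketch; the record layout, field names and values are hypothetical, not the study's actual data dictionary:

```python
from datetime import date

# Hypothetical notifiable disease report records; field names are illustrative only.
reports = [
    {"disease": "pertussis", "test_date": date(2013, 3, 1),
     "report_date": date(2013, 3, 5),
     "patient_phone": "317-555-0100", "patient_race": None},
    {"disease": "pertussis", "test_date": date(2013, 3, 2),
     "report_date": date(2013, 3, 4),
     "patient_phone": None, "patient_race": "white"},
]

def timeliness_days(report):
    """Reporting timeliness: days from laboratory test to report receipt."""
    return (report["report_date"] - report["test_date"]).days

def completeness(report, fields):
    """Report completeness: fraction of the listed fields that are populated."""
    filled = sum(1 for f in fields if report.get(f) is not None)
    return filled / len(fields)

fields = ["patient_phone", "patient_race"]
mean_timeliness = sum(timeliness_days(r) for r in reports) / len(reports)
mean_completeness = sum(completeness(r, fields) for r in reports) / len(reports)
print(mean_timeliness, mean_completeness)  # 3.0 0.5
```

In practice these metrics would be computed per disease and per reporting period, so that pre- and post-intervention distributions can be compared.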

Qualitative data collection

Semi-structured interviews conducted with representative clinical and public health workers will collect qualitative data regarding provider and public health perceptions of completeness, accuracy, timeliness and burden associated with notifiable disease reporting prior to and after each form intervention. In addition, perceptions regarding data quality, benefits, utility, adoption, utilization and impact on reporting workflow will be collected at baseline and at 12 months after implementation of each form intervention. Interviews will be digitally audio-recorded and transcribed. Public health practitioner input regarding enhancements and supplemental data preferences for the enhanced report form will be captured by convening focus groups to establish consensus on desirable data elements.

Data analysis

Data analysis methods will be more thoroughly reported in the methods sections of future publications presenting the findings of specific analyses.

Data analysis (quantitative)

Quantitative analysis will provide measurable evidence of the impacts of the intervention and enable us to establish likely cause and effect. Longitudinal effects of the interventions will be evaluated in aggregate and across covariates on reporting rates, reporting timeliness and treatment timeliness using an interrupted time-series design to test for the intervention effect and the time trend post-intervention. This approach has been used in other decision support and time-series evaluations[22] and has been recommended for multiple baseline time-series designs that involve two or more sites that are repeatedly assessed while an intervention is introduced into one site at a time[23]. Since the effects of the interventions may vary across clinics, stratified regression analyses will also be performed. Following comparative assessment of reporting completeness, accuracy and communication burden, these data will be added to the regression model. Assessment of data completeness, timeliness, and accuracy will involve pre-post comparison of data quality metrics[15].
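To make the segmented regression concrete, the sketch below fits an interrupted time-series model to simulated weekly report counts using only NumPy; the counts, effect sizes and variable names are invented for illustration and are not study data:

```python
import numpy as np

# Simulated weekly notifiable-disease report counts: 52 weeks before and
# 52 weeks after the intervention. All values are illustrative only.
rng = np.random.default_rng(0)
weeks = np.arange(104)
post = (weeks >= 52).astype(float)              # 1 once the intervention starts
time_since = np.where(weeks >= 52, weeks - 52, 0.0)
y = 20 + 0.05 * weeks + 8 * post + 0.1 * time_since + rng.normal(0, 2, size=104)

# Segmented (interrupted time-series) regression:
#   y = b0 + b1*week + b2*post + b3*time_since_intervention
# b2 estimates the immediate level change at the intervention;
# b3 estimates the change in trend after the intervention.
X = np.column_stack([np.ones_like(weeks, dtype=float), weeks, post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, level_change, slope_change = coef
print(f"level change ~ {level_change:.1f}, slope change ~ {slope_change:.2f}")
```

A full analysis would additionally model autocorrelation and seasonality and, per the protocol, stratify by clinic; this sketch shows only the core design matrix for the intervention effect and the post-intervention trend.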

Data analysis (qualitative)

Qualitative analysis will provide in-depth context regarding facilitators and barriers to implementation, adoption, benefits and use of the intervention; identify and describe their impacts on HIE individual and organizational processes; and examine the broad range of interconnected processes or causes at play regarding data quality. Analyses will be conducted using qualitative software by experienced coders using the constant comparative method of analysis[24] and utilizing standard approaches to ensure credibility, consistency and robustness of the findings. Transcribed interviews and open-ended survey items will undergo a series of well-established steps to identify emerging themes and trends and, ultimately, build a model that describes the intervention phenomenon in conceptual form[25]. The process will begin with a coding scheme developed by combining concepts derived a priori from the conceptual frameworks driving the study with concepts derived inductively as the analysis proceeds. Content will be grouped into nodes, a codebook will be built, and codes or code combinations will be summarized and stratified by contextual factors such as demographics, respondent role, etc. These summaries will be entered into appropriate data displays that specify interactions between the intervention, its context and its effects as preparation for triangulation.

Triangulation of quantitative and qualitative data

Data will be triangulated to find convergence or agreement by cross-validating results. The first step will be exploratory, to determine the most appropriate “transformation” of the data: conversion of quantitative data into narrative data (“qualitized”) or conversion of qualitative data into numerical codes (“quantitized”) that can be represented statistically. There are several approaches to quantifying qualitative data, including enumerating the frequency of themes within a sample, the percentage of themes associated with a given category of respondent, or the percentage of people selecting specific themes. In these approaches the quantified data can be statistically compared to quantitative data collected concurrently but separately. Another strategy enumerates whether or not qualitative responses included certain codes: rather than counting how many times each code appeared for a participant, the presence or absence of each code for each participant is recorded as a dichotomous variable (0 or 1). We will determine the most appropriate approach after reviewing the descriptive analysis. Depending on which approach is used, this mixed data will be correlated (quantitative data correlated with qualitized data or vice versa)[26]. Guided by the nature of the data collected, quantitative and qualitative data may be collated to create new or consolidated variables or data sets in order to further compare and integrate data.
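For instance, the presence/absence approach to quantitizing could be sketched as follows in Python; the participant identifiers and code names are hypothetical:

```python
# Hypothetical coded interview data: for each participant, the set of codes
# applied to their transcript. Identifiers and code names are illustrative.
coded = {
    "provider_01": {"workflow_fit", "time_savings"},
    "provider_02": {"workflow_fit", "data_gaps"},
    "phn_01": {"data_gaps"},
}

# Stable, sorted vocabulary of all codes observed across participants.
codes = sorted({c for cs in coded.values() for c in cs})

# "Quantitize" by presence/absence: one dichotomous (0/1) variable per code,
# per participant, ready to correlate with concurrently collected survey data.
matrix = {
    pid: [1 if c in cs else 0 for c in codes]
    for pid, cs in coded.items()
}
print(codes)                  # ['data_gaps', 'time_savings', 'workflow_fit']
print(matrix["provider_01"])  # [0, 1, 1]
```

The resulting 0/1 matrix can then be stratified by respondent role or site, or correlated against the quantitative measures, as described above.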

Once transformed, we will calculate simple correlations, stratify codes by provider type, demographics, geographic location or organization attributes or look at similarity among respondents. This process will allow us to identify areas in which findings agree (convergence), contradict (discrepant or dissonant) or deepen understanding on the same issue (complementarity)[27]. We anticipate that the results will allow us to identify, analyze and explain social, behavioral and environmental similarities and differences between HIE settings, provider types and perceptions across pre- and post-implementation time units that will describe identified barriers and facilitators to the intervention.

Limitations and biases

There are several limitations to our proposed work. First, some clinical sites may be more open to recruitment and enrollment in the study and thus introduce a bias in our sample. For example, some sites may serve patient populations that are more likely to require notifiable disease reporting (for example, a women’s clinic with a high chlamydia reporting history) and thus be more incentivized to participate. It is also possible, given the use of an interrupted time-series design, that confounding may occur due to covariates that change over time. The INPC is a growing HIE, so policy, governance, legal mandates or information technology changes could occur during the course of this study that impact data collection or introduce additional confounding. This HIE includes an academic affiliation and medical training program, so providers may rotate through training sites at different stages of the intervention, which may introduce confounding. Introduction and adoption of the intervention may be more rapid in some settings (a small ambulatory clinic) than others (a large hospital), as some workplaces may accommodate or adjust to the intervention more easily while others require administrative protocols to support it. The staggered implementation of the intervention may also complicate quality control of data collection. In addition, because data collection covers only one baseline year and a maximum of two years post-intervention, depending on site, this short time frame may limit the ability of the analysis to account for seasonal trends in reporting. Our findings may also be limited by the context in which this study is conducted. The INPC is one of the most advanced HIEs in the country; therefore, our results may have limited generalizability. However, we believe that by emphasizing context and clearly documenting the characteristics of the sites in which the intervention is implemented, our findings could inform implementation in other HIE settings.
Given current mandates to build systems and infrastructures for ELR and HIE in communities where they are absent, to expand efforts in communities where ELR is already occurring, and the requirement for eligible hospitals to routinely transmit reportable laboratory results to a public health agency, our findings may provide insights and inform roadmaps for more nascent HIEs to move toward better notifiable disease surveillance.

Ethical considerations

The project received approval from the Institutional Review Board of Indiana University, with a concurrent Institutional Review Board deferral from the University of Washington to Indiana University. Informed consent will be obtained from all participants and confidentiality will be ensured. All data will be stored according to the rules of the research ethics committee. Because this study does not meet ICMJE criteria for a clinical trial, it has not been registered in a publicly accessible registry.


Context is recognized as a critical element for understanding the effects of intervention components individually and in combination[28]; appropriately generalizing findings across multiple sites[29]; and accelerating the process of translating research into practice[30]. Context includes the characteristics of an organization and its environment, e.g., size, organizational structure, personnel, training, governance, financial factors, leadership, legal mandates, and communication flows, which can influence the implementation and effectiveness of an intervention. However, the use of conventional quantitative research methods alone often limits detailed understanding of the phenomena under study, ignores the context of the problem or intervention, minimizes the impact of the changing nature of the study subject and/or its context, and excludes outliers in preference to generating implications from the mean[31]. Sensitivity to context and “context-heterogeneity”—i.e., how differences in context change, shape and are shaped by the phenomena under study—is one of the values of qualitative methods.

Mixed or multi-methods research, which integrates quantitative and qualitative data collection and analysis in a single study or a program of inquiry, provides methodologies for measuring the effects of an intervention within complex, growing and evolving contexts while also exploring the context of the intervention itself[32, 33]. In the HIE setting, employing mixed methods could contribute to better understanding of the contextual issues that influence the potential quality, safety and efficiency outcomes associated with HIE implementations[34].

Combining quantitative and qualitative methods in our evaluation of a complex HIE intervention has two potential benefits: complementarity and multiplicativeness, i.e., the methods may illuminate different aspects of the intervention impacts and the combination of the two methods may provide greater insight than either method individually[35]. By measuring context, facilitators and barriers, and individual, organizational and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness and captures a greater portion of communicable disease burden in the community. We believe that our approach demonstrates a multi-faceted and integrative research “attitude” towards the problems we are concerned with, the insights we are seeking, the comprehensive ways in which we desire to generate knowledge, and the creative and scientific curiosity required to rigorously tackle an interdisciplinary investigation.


HIEs are rapidly expanding in the U.S., fueled by national policies that are investing in the integration of ELR, EHR, and other information systems. Yet only one-third of public health departments are engaged in local efforts to electronically exchange data between clinical care and public health[36]. Our study not only presents the opportunity to study the evolving public health infrastructure with integration of ELR-EHR-HIE technologies but also the use of mixed methods to better understand the context in which HIE and HIE-based interventions are implemented. To our knowledge, this approach is unique as we were unable to identify previous articles that detail a systematic and rigorous application of mixed research methods to the evaluation of HIE. By testing the feasibility of using mixed methods to study a complex informatics intervention in a complex health care setting, we believe our approach extends prior work in informatics that calls for incremental evaluation of maturing HIE models[34, 37]. Similar studies of HIE and public health informatics interventions should consider leveraging the combination of quantitative and qualitative methods to more fully explore, observe, and document the impact of interventions on population outcomes as well as information system adoption and use.



Abbreviations

EHR: Electronic health record

ELR: Electronic laboratory reporting

ED: Emergency department

HIE: Health information exchange

INPC: Indiana Network for Patient Care


  1. 1.

    AHRQ: Health information exchange. 2009,,

    Google Scholar 

  2. 2.

    Walker J, Pan E, Johnston D, Adler-Milstein J, Bates DW, Middleton B: The value of health care information exchange and interoperability. Health Aff (Millwood). 2005, Suppl Web Exclusives: W5-10-W5-18.

    Google Scholar 

  3. 3.

    Kaelber DC, Bates DW: Health information exchange and patient safety. J Biomed Inform. 2007, 40 (6 Suppl): S40-S45.

    Article  PubMed  Google Scholar 

  4. 4.

    Smith PF, Hadler JL, Stanbury M, Rolfs RT, Hopkins RS: “Blueprint version 2.0”: updating public health surveillance for the 21st century. J Public Health Manag Pract. 2013, 19: 231-239. 10.1097/PHH.0b013e318262906e.

    Article  PubMed  Google Scholar 

  5. 5.

    Overhage JM, Dexter PR, Perkins SM, Cordell WH, McGoff J, McGrath R, McDonald CJ: A randomized, controlled trial of clinical information shared from another institution. Ann Emerg Med. 2002, 39: 14-23. 10.1067/mem.2002.120794.

    Article  PubMed  Google Scholar 

  6. 6.

    Maenpaa T, Asikainen P, Gissler M, Siponen K, Maass M, Saranto K, Suominen T: The utilization rate of the regional health information exchange: how it impacts on health care delivery outcomes. J Public Health Manag Pract. 2012, 18: 215-223. 10.1097/PHH.0b013e318226c9b9.

    Article  PubMed  Google Scholar 

  7. 7.

    Johnson KB, Unertl KM, Chen Q, Lorenzi NM, Nian H, Bailey J, Frisse M: Health information exchange usage in emergency departments and clinics: the who, what, and why. J Am Med Inform Assoc. 2011, 18: 690-697. 10.1136/amiajnl-2011-000308.

    Article  PubMed  PubMed Central  Google Scholar 

  8. 8.

    Doyle TJ, Glynn MK, Groseclose SL: Completeness of notifiable infectious disease reporting in the United States: an analytical literature review. Am J Epidemiol. 2002, 155: 866-874. 10.1093/aje/155.9.866.

    Article  PubMed  Google Scholar 

  9. 9.

    Jajosky RA, Groseclose SL: Evaluation of reporting timeliness of public health surveillance systems for infectious diseases. BMC Public Health. 2004, 4: 29-10.1186/1471-2458-4-29.

    Article  PubMed  PubMed Central  Google Scholar 

  10. 10.

    Sickbert-Bennett EE, Weber DJ, Poole C, MacDonald PD, Maillard JM: Completeness of communicable disease reporting, North Carolina, USA, 1995–1997 and 2000–2006. Emerg Infect Dis. 2011, 17: 23-29. 10.3201/eid1701.100660.

    Article  PubMed  PubMed Central  Google Scholar 

  11. 11.

    Overhage JM, Grannis S, McDonald CJ: A comparison of the completeness and timeliness of automated electronic laboratory reporting and spontaneous reporting of notifiable conditions. Am J Public Health. 2008, 98: 344-350. 10.2105/AJPH.2006.092700.

    Article  PubMed  PubMed Central  Google Scholar 

  12. 12.

    Nguyen TQ, Thorpe L, Makki HA, Mostashari F: Benefits and barriers to electronic laboratory results reporting for notifiable diseases: the New York City department of health and mental hygiene experience. Am J Public Health. 2007, 97 (Suppl 1): S142-S145.

    Article  PubMed  PubMed Central  Google Scholar 

  13. 13.

    Kite-Powell A, Hamilton JJ, Hopkins RS, DePasquale JM: Potential effects of electronic laboratory reporting on improving timeliness of infectious disease notification—Florida, 2002–2006. MMWR Morb Mortal Wkly Rep. 2008, 57: 1325-1328.

    Google Scholar 

  14. 14.

    McHugh LA, Semple S, Sorhage FE, Tan CG, Langer AJ: Effect of electronic laboratory reporting on the burden of lyme disease surveillance—New Jersey, 2001–2006. MMWR Morb Mortal Wkly Rep. 2008, 57: 42-45.

    Google Scholar 

  15. 15.

    Dixon BE, McGowan JJ, Grannis SJ: Electronic laboratory data quality and the value of a health information exchange to support public health reporting processes. AMIA Annu Symp Proc. 2011, 2011: 322-330.

    PubMed  PubMed Central  Google Scholar 

  16. 16.

    Lazarus R, Klompas M, Campion FX, McNabb SJ, Hou X, Daniel J, Haney G, DeMaria A, Lenert L, Platt R: Electronic support for public health: validated case finding and reporting for notifiable diseases using electronic medical data. J Am Med Inform Assoc. 2009, 16: 18-24. 10.1197/jamia.M2848.

    Article  PubMed  PubMed Central  Google Scholar 

  17. 17.

    Klompas M, McVetta J, Lazarus R, Eggleston E, Haney G, Kruskal BA, Yih WK, Daly P, Oppedisano P, Beagan B, Lee M, Kirby C, Heisey-Grove D, DeMaria A, Platt R: Integrating clinical practice and public health surveillance using electronic medical record systems. Am J Public Health. 2012, 102 (Suppl 3): S325-S332.

    Article  PubMed  PubMed Central  Google Scholar 

  18. 18.

    Grannis SJ, Stevens K, Merriwether R: Leveraging health information exchange to support public health situational awareness: the Indiana experience. Online J Public Health Inform. 2010, 2: 3213-

    Google Scholar 

  19. 19.

    McDonald CJ, Overhage JM, Barnes M, Schadow G, Blevins L, Dexter PR, Mamlin B, INPC Management Committee: The Indiana network for patient care: a working local health information infrastructure. An example of a working infrastructure collaboration that links data from five health systems and hundreds of millions of entries. Health Aff (Millwood). 2005, 24: 1214-1220. 10.1377/hlthaff.24.5.1214.

    Article  Google Scholar 

  20. 20.

    Fidahussein M, Friedlin J, Grannis S: Practical challenges in the secondary use of real-world data: the notifiable condition detector. AMIA Annu Symp Proc. 2011, 2011: 402-408.

    PubMed  PubMed Central  Google Scholar 

  21. Onwuegbuzie AJ, Leech NL: Linking research questions to mixed methods data analysis procedures. Qual Report. 2006, 11: 474-498.

  22. Goldberg HI, Neighbor WE, Cheadle AD, Ramsey SD, Diehr P, Gore E: A controlled time-series trial of clinical reminders: using computerized firm systems to make quality improvement research a routine part of mainstream practice. Health Serv Res. 2000, 34: 1519-1534.

  23. Biglan A, Ary D, Wagenaar AC: The value of interrupted time-series experiments for community intervention research. Prev Sci. 2000, 1: 31-49. 10.1023/A:1010024016308.

  24. Strauss A, Corbin J: Basics of qualitative research. 1998, London: Sage.

  25. Reeder B, Revere D, Hills RA, Baseman JG, Lober WB: Public health practice within a health information exchange: information needs and barriers to disease surveillance. Online J Public Health Inform. 2012, 4: 4277.

  26. Tashakkori A, Teddlie C: Handbook on mixed methods in the behavioral and social sciences. 2003, Thousand Oaks, CA: Sage.

  27. O’Cathain A, Murphy E, Nicholl J: Three techniques for integrating data in mixed methods studies. BMJ. 2010, 341: 1147-1150.

  28. Bonell C, Fletcher A, Morton M, Lorenc T, Moore L: Realist randomised controlled trials: a new approach to evaluating complex public health interventions. Soc Sci Med. 2012, 75: 2299-2306. 10.1016/j.socscimed.2012.08.032.

  29. Ovretveit JC, Shekelle PG, Dy SM, McDonald KM, Hempel S, Pronovost P, Rubenstein L, Taylor SL, Foy R, Wachter RM: How does context affect interventions to improve patient safety? An assessment of evidence from studies of five patient safety practices and proposals for research. BMJ Qual Saf. 2011, 20: 604-610. 10.1136/bmjqs.2010.047035.

  30. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C: National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012, 102: 1274-1281. 10.2105/AJPH.2012.300755.

  31. Ovretveit J: The contribution of new social science research to patient safety. Soc Sci Med. 2009, 69: 1780-1783. 10.1016/j.socscimed.2009.09.053.

  32. Johnson RB, Onwuegbuzie AJ: Mixed methods research: a research paradigm whose time has come. Educ Res. 2004, 33: 14-26.

  33. Zhang W, Creswell J: The use of “mixing” procedure of mixed methods in health services research. Med Care. 2013, in press.

  34. Ash JS, Guappone KP: Qualitative evaluation of health information exchange efforts. J Biomed Inform. 2007, 40 (6 Suppl): S33-S39.

  35. Protheroe J, Bower P, Chew-Graham C: The use of mixed methodology in evaluating complex interventions: identifying patient factors that moderate the effects of a decision aid. Fam Pract. 2007, 24: 594-600. 10.1093/fampra/cmm066.

  36. Hessler BJ, Soper P, Bondy J, Hanes P, Davidson A: Assessing the relationship between health information exchanges and public health agencies. J Public Health Manag Pract. 2009, 15: 416-424. 10.1097/01.PHH.0000359636.63529.74.

  37. Shapiro JS: Evaluating public health uses of health information exchange. J Biomed Inform. 2007, 40 (6 Suppl): S46-S49.

Pre-publication history

  1. The pre-publication history for this paper can be accessed here:


Acknowledgements

The “Improving Population Health through Enhanced Targeted Regional Decision Support” research project is a collaboration between Indiana University, Regenstrief Institute and the University of Washington. The authors wish to acknowledge the project team: Janet Arno, Roland Gamache, Joseph Gibson, Rebecca Hills, Uzay Kirbiyik, Patrick Lai, Melissa McMaster, and Jennifer Williams. This project was supported by grant number R01HS020909 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of AHRQ or the Department of Veterans Affairs.

Author information



Corresponding author

Correspondence to Debra Revere.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

SG, DR and BD conceived of the study. All authors were involved in study design. DR and BD wrote the draft manuscript. All authors reviewed and participated in revisions of this manuscript. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article

Dixon, B.E., Grannis, S.J. & Revere, D. Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol. BMC Med Inform Decis Mak 13, 121 (2013).



Keywords

  • Evaluation
  • Health information exchange
  • Mixed methods
  • Public health reporting