- Research article
- Open Access
- Open Peer Review
Group differences in physician responses to handheld presentation of clinical evidence: a verbal protocol analysis
© Lottridge et al; licensee BioMed Central Ltd. 2007
- Received: 13 December 2006
- Accepted: 26 July 2007
- Published: 26 July 2007
To identify individual differences in physicians' needs for the presentation of evidence resources and preferences for mobile devices.
Within-groups analysis of responses to semi-structured interviews. Interviews consisted of using prototypes in response to task-based scenarios. The prototypes were implemented on two different form factors: a tablet style PC and a pocketPC. Participants were from three user groups: general internists, family physicians and medicine residents, and from two different settings: urban and semi-urban. Verbal protocol analysis, which consists of coding utterances, was conducted on the transcripts of the testing sessions. Statistical relationships were investigated between staff physicians' and residents' background variables, self-reported experiences with the interfaces, and verbal code frequencies.
Forty-seven physicians were recruited from general internal medicine, family practice clinics and a residency training program. The mean age of participants was 42.6 years. Physician specialty had a greater effect on device and information-presentation preferences than gender, age, setting or previous technical experience. Family physicians preferred the screen size of the tablet computer and were less concerned about its portability. Residents liked the screen size of the tablet, but preferred the portability of the pocketPC. Internists liked the portability of the pocketPC, but saw less advantage to the large screen of the tablet computer (F[2,44] = 4.94, p = .012).
Different types of physicians have different needs and preferences for evidence-based resources and handheld devices. This study shows how user testing can be incorporated into the process of design to inform group-based customization.
- Mobile Device
- Family Physician
- General Internist
- User Group
- Screen Size
Physicians are common users of mobile computers in the health care environment. Given this trend, it is useful to obtain information about users' needs and preferences regarding these devices and the relevant clinical practice tools available for use on them. Fundamental work in human-computer interaction has found differences on the order of twenty to one in users' speed and accuracy in common computing tasks, that users' individual differences can predict these performance differences, and that interfaces can be modified to account for them. In the medical domain, failure to meet user needs has critical consequences, including unused systems [3–6], wasted time, inadequate care and physician errors. This study examines group differences in responses to evidence-based resources on a tablet and pocketPC to make inferences about physicians' use of evidence resources and preferences for mobile devices.
Inconsistent access to and application of relevant evidence is a significant cause of adverse events: research evidence, generated at an exponential rate, is not readily available to clinicians, and when it is available, it is infrequently applied in clinical practice, leading to care gaps [9–14]. Moreover, clinicians are limited by their inability to afford more than a few seconds per patient to find and assimilate relevant evidence [15–17].
Providing access to high-quality evidence resources at the point of care is one way to meet these challenges. Sackett and Straus evaluated the impact of evidence at the point of care and found that use of an 'evidence cart' increased the extent to which evidence was sought and incorporated into patient care decisions. Clinicians were found to use evidence resources if they were easily accessible. Practicing evidence based medicine (EBM) as little as once per month was related to better quality of care [19, 20]. Using developments in information technology that have occurred since the Sackett and Straus study, this project aims to provide easily accessible evidence resources at the point of care using mobile computers.
Our objective was to develop a wireless medical information system that would bring the latest evidence to frontline physicians via handheld devices. The present study examines user needs to inform system design. Given that these are complex interventions aimed at improving the quality of care, a rigorous, iterative process of design, development and evaluation must occur prior to the actual clinical trial. Complex interventions comprise multiple components, including behaviors and methods of organizing and implementing these behaviors. The UK Medical Research Council has suggested a framework for the development and evaluation of such complex interventions that includes exploring relevant theory and models. During the initial phase, relevant theory is explored to optimize the choice of intervention and to predict major confounders. In the next phase, the components of the intervention are developed and their relationship to potential outcomes explored. For complex interventions involving health informatics technologies, we believe an extensive and methodologically rigorous process of design and development must occur with inclusion of the targeted users.
eHealth initiatives that are developed without including the end-user may lead to implementation failure [3, 4, 6]. A system that does not meet the needs of its users may cause them to, at a minimum, waste time and provide lower quality care, or even make errors. The assessment of user needs is a unique challenge because of widely varying users, systems and settings. Several design methodologies that assess user-role and contextual needs have been introduced for medical interfaces [22, 23].
Many surveys have been developed that identify user preferences for various mobile software and devices [1, 24–37]. A notable user-group difference was that residents were found to have more expectations regarding mobile devices than faculty. Further, the same study used work-role constructs to explain differences in frequency of accessing clinical data, patterns of email, pager and computer use. In contrast, another survey study found no difference in usage preferences between physicians from different sub-specialties and medical students. A third study used focus groups to elicit preferences about mobile computers. They found that physicians could be categorized into non-users, niche users, routine users and power users based on patterns of preferences. Groups differed in their computer use, what the usage replaces (i.e., no, some or all paper) and their characteristics (respectively: skeptical, busy, open, technophiles).
Cognitive engineering principles can complement surveys to assess and meet user needs. The 'think-aloud' method elicits user knowledge that is useful for development. Incorporating such methods into the design cycle can improve systems. For example, one such method was used to create a new medical record system for pediatric oncologists and was found to lower cognitive load and increase satisfaction.
Inclusion of the targeted end-users is a goal of this project: the Evidence at the Point of Care project (EPoCare). EPoCare comprises human factors engineers, computer scientists and practicing physicians working together to iteratively design, develop and evaluate clinical practice tools for mobile devices. The multi-disciplinary team used insights from an investigation into the use of evidence during clinical rounds and an assessment of clinician needs for evidence at the point of care [40, 41] to adapt paper and online versions of the evidence resources for mobile devices. During Phase I of the project, an HTML-based prototype was built with search interface screens and evidence resources. Group differences in needs and preferences were observed: family physicians tended to prefer short key messages, in contrast to general internists and internal medicine residents, who wanted more detailed information. Based on the Phase I findings, we decided to examine group differences more closely in Phase II to ensure that users' needs for these practice tools were met. A tablet and pocketPC were identified as suitable platforms for study because of comparable capabilities of concern to physicians (e.g., wireless capabilities, processing speed, battery life) and because of the fundamental differences that enabled us to examine the portability versus screen size tradeoff. Finding differences between groups of individuals would suggest opportunities and necessities for personalizing the presentation of clinical evidence according to the individual using that evidence. In summary, this research investigates the differences that impact physicians' needs for the content and presentation of clinical evidence on mobile devices, and the display formats and device form-factors that meet distinct groups' needs.
This section first outlines the study, then describes the participants involved in the study, the session flow, the prototypes, the measures and the analysis that was carried out.
The methodology of a large-scale usability study was adopted to assess the differential customization requirements of identifiable subgroups of users. Physicians from 3 user groups were identified in Phase I: general internists, family physicians, and internal medicine residents. General internists and family physicians were randomly selected from a sample of physicians who completed a survey on the use of mobile computers (n = 275 and n = 275, respectively). Staff physicians from university and non-university-affiliated settings were selected from Toronto, a large urban centre, and Sault Ste. Marie, a small urban centre. Internal medicine residents were recruited from the 120 residents in the General Internal Medicine Training Program at the University of Toronto. These physician groups were selected because they provide the bulk of care to patients in Ontario and are representative of the user groups for the proposed system.
After consent was obtained, 47 participants completed a 70-minute usability testing session that required them to complete a set of representative tasks using various evidence-based resources to answer relevant clinical questions. Participants were given two clinical scenarios to review that were relevant to their clinical practice. The scenarios were developed in consultation with a practicing family physician and a general internist. The family physician was asked to tape record his clinical questions during several clinics, while his workflow was observed by a human factors expert. The questions that arose during this physician's clinic were used to generate scenarios for the tasks. All unique patient identifiers were removed to preserve anonymity. Thus, one representative scenario for a specialized hospital setting was given to residents and general internists, one representative scenario for primary care clinic setting was given to general practitioners (shown below), and lastly one scenario appropriate for both the specialty hospital and primary care clinic environment was given to all participants. The first two scenarios were implemented on the pocketPC and the third on the tablet. Scenarios were designed to be equally difficult and representative. For example:
You see a 7-year-old child with asthma in your office. She is on fluticasone and salbutamol currently and was recently discharged from hospital following her 4th admission for asthma exacerbation. During the most recent admission, the dose of fluticasone was increased. Her mother is concerned about the impact of the additional dose of steroids on her daughter's growth. Together you formulate the question: In a child with asthma, do increased doses of inhaled corticosteroids lead to a decrease in growth?
Following a demonstration from the session facilitator, participants were asked to 'think aloud' as they used the prototype to search or browse for the answer to queries such as the one given above. High-quality evidence resources were provided for use: Clinical Evidence (CE) and Evidence-based Acute Medicine (EBOC). By high-quality evidence we mean evidence that has been appraised for validity and importance using methodologically explicit and rigorous techniques. The content of each resource was provided to the research team in the form of XML files, which were formatted for the prototypes. Participants had access to both resources on both devices. Participants could search both resources, but could only browse one resource at a time.
True/False Items assessing 'Attitude towards computers'. Scores for items 1, 3, 5 are reversed, and scores for all items added.
I use computers only because they are necessary for work
I think that on-line shopping is a good idea
I don't want to know more about computers than I have to
Computers have a positive impact on my quality of life
I find dealing with computers to be frustrating
I am confident in my ability to master new skills with computers
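The scale scoring described above (reverse items 1, 3 and 5, then sum all items) can be sketched as follows; the 0/1 numeric coding of False/True is an assumption, since the paper does not state the values assigned to the responses:

```python
def attitude_score(responses):
    """Score the six 'Attitude towards computers' True/False items.

    responses: list of six booleans, in item order 1-6. Items 1, 3
    and 5 are negatively worded, so their scores are reversed before
    summing; higher totals indicate a more positive attitude.
    The 0/1 coding of False/True is an assumption for illustration.
    """
    score = 0
    for item_number, answer in enumerate(responses, start=1):
        value = 1 if answer else 0
        if item_number in (1, 3, 5):  # reverse-scored items
            value = 1 - value
        score += value
    return score
```

For example, a maximally positive respondent answers False to the negatively worded items and True to the rest, yielding the top score of 6.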
Likert scale items from the usability-testing session
1. The categories of the questions were useful.
2. The category that I should use for my question was clear.
3. I clearly understood what needed to be entered in each of the fields.
4. The description in the help files was useful.
5. This information would help me in the management of the patient in the scenario.
6. I prefer seeing tables displayed in the text rather than having to tap on a link to see them.
7. It's easy to understand the table.
8. I prefer the large window format rather than the small window with sub-windows.
9. This is the right amount of information on this drug.
10. Online prescribing would be useful in my practice.
11. The preset dosages are useful.
12. This 'Limited Use Drug' (LUD) form would be useful in my practice.
13. I prefer the screen size of the tablet rather than that of the pocketPC.
14. I prefer the portability of the pocketPC rather than that of the tablet.
Coding Categories: Descriptions of Levels A through F
A. Specifies the main category of the code: usability, content or use/needs.
B. Identifies the portion of the prototype about which the point is being made. An attribute to specify the location was optional. (E.g. Drug Database, Cascading Window).
C. Identifies the element on the screen. An attribute to specify the element was optional. (E.g. Format, Font).
D. Identifies the main point in the subject's comment. (E.g. Usefulness).
E. Identifies the valence of the comment. (E.g. positive, negative, or neutral).
F. Identifies whether the comment was spontaneous or prompted. Any additional information was placed here.
Examples of coded statements in session transcripts
Example 1. The participant is describing a preference to have clickable drug names within the evidence-based resource that link to additional drug information.
"you should be able to click on that, and it comes up with all the information, the dosing here, the, you know, side effects, and all that stuff, (...) [then] I would feel confident prescribing that drug...even though I have never prescribed it before (...)."
USE/NEED; CE; TEXT; USEFULNESS; NEUTRAL; SPONTANEOUS; "Drug names should link to more drug information."
Example 2. The participant is answering a prompt from the investigator to explain why she finds the search input field categories useful.
"why were the categories useful...consistent with evidence based medicine articles."
USABILITY; SEARCH; CATEGORIES; USEFULNESS; POSITIVE; PROMPTED
Example 3. The participant is commenting on the Summary section in CE.
"That's an awful lot of gibberish in the summary. Just a little hard. I tend to think in point form sometimes. I like the point forms the BMJ has taken on as to what these articles mean."
CONTENT; CE; LEVEL (of detail); NEGATIVE; SPONTANEOUS; "Wants summary in point form similar to BMJ".
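The six coding levels (A through F) and the transcript examples above can be represented as a simple record per coded event; the field names below are illustrative, since the paper describes the coding levels but not a machine-readable schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodedUtterance:
    """One coded event from a session transcript (levels A-F).

    Field names are illustrative assumptions, not the authors' schema.
    """
    category: str            # A: usability, content, or use/needs
    prototype_portion: str   # B: part of the prototype (e.g. CE, Search)
    screen_element: str      # C: element on screen (e.g. Text, Categories)
    main_point: str          # D: main point of the comment
    valence: str             # E: positive, negative, or neutral
    elicitation: str         # F: spontaneous or prompted
    note: Optional[str] = None  # F: any additional information

# Example 1 above, expressed as a record:
example1 = CodedUtterance(
    category="USE/NEED", prototype_portion="CE", screen_element="TEXT",
    main_point="USEFULNESS", valence="NEUTRAL",
    elicitation="SPONTANEOUS",
    note="Drug names should link to more drug information.")
```

Structuring each coded utterance this way makes the frequency counts and group comparisons reported in the Results straightforward to tabulate.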
Summary of Respondent Characteristics in the Sample (table: age groups up to '60 and over', 'Attitude Towards Computers' scores (Table 1), and use of electronic medical databases).
Responses to the baseline questionnaire indicated a significant negative correlation between age and search engine use, with use declining with increasing age (Pearson r = -.4, p < .01; frequencies of usage were based on a monthly average: never, less than once, 1–5, 6–10, 11–15, more than 15). Use of electronic databases also decreased with age (F[4,42] = 2.72, p = .042).
A total of 2367 events were extracted from the transcripts. The inter-rater reliability between the coders was good (κ = .73).
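Inter-rater reliability here is Cohen's kappa, which corrects observed agreement for the agreement expected by chance from each coder's label frequencies. A minimal sketch (the labels below are illustrative, not the study's coding data):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e is the chance agreement implied
    by each rater's marginal label frequencies.
    """
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items with identical labels.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from the two raters' marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)
```

Values above roughly .6 are conventionally read as substantial agreement, consistent with the study's characterization of κ = .73 as good.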
Participants' verbalizations included responses to questions, prompts, and spontaneous thoughts. The ratio of spontaneous to prompted comments by participants was 55:45. Comments that focused on use or needs yielded more spontaneous thoughts (80:20; comparison across coding categories: Chi Square value > 18, df = 2, p < .001). Some comments in the use/needs category included: "Statistics would be helpful because some issues are very individual and can't be answered by evidence" and "It would be useful if drugs were in a table, then we can have direct comparisons in specific areas".
The majority of comments about CE were concerned with the usability of the presentation of the content. A high proportion of the usability comments were positive: approximately 3:1, whereas the positive and negative comments were more equally split with respect to comments about content (comparison across coding categories: Chi Square value > 18, df = 2, p < .001). Participants commented that the depth and detail of the information were commendable, and that a greater variety of topics should be covered in future editions. Positive usability comments focused on navigability, scrolling and formatting issues such as colours and spacing. CE had a more positive response with a 30:70 ratio of negative to positive comments, whereas EBOC had an even ratio (comparison across coding categories: Chi Square value = 5.84, df = 1, p = .017). With respect to the content of CE, residents made fewer negative comments and more positive comments than family physicians and general internists (comparison across user groups: Chi Square value > 18, df = 2, p < .001). The verbal data indicated that family physicians tended to prefer the prominent bottom line presented at the beginning of each section within EBOC, while the other groups preferred the initial appearance of the evidence as presented in CE: "You get key messages and if you want to know more about it, then you click on it, and then if you want to know even more about that, you click on, so you go into more and more detail as you want...". The groups differed in their comments on the amount of detailed information: family physicians and residents had a high ratio of positive:negative comments (4:1) whereas the general internists' ratio was more even (comparison across user groups: Chi Square value = 12.06, df = 2, p = .002). Overall, residents had more positive comments and fewer negative comments than family physicians and internists (comparison across user groups: Chi Square value = 8.41, df = 2, p = .015).
When comparing usability comments for the browsing versus the searching function, browsing resulted in a higher positive to negative ratio than the search interface. There was an approximately 3:1 ratio of positive:negative comments for browsing, and a 1:1.5 ratio for searching (comparison of browse vs. search codes: Chi Square value = 12.02, df = 1, p < .001). While browsing, family physicians were more likely to take indirect routes (defined as extra pages visited outside of the direct path) in finding the answer to the clinical question. Residents also took more indirect routes than the general internists (comparison across user groups for categories 'direct', 'indirect', and 'not found'; Chi Square value = 11.76, df = 4, p = .019).
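Ratio comparisons such as the browse-versus-search split above rest on Pearson chi-square tests of the underlying comment counts. A minimal sketch of the 2×2 case (the example counts in the test are made up, since the paper reports only ratios and test statistics):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table: [[a, b], [c, d]] of observed counts, e.g. rows for
    browse/search and columns for positive/negative comments.
    Expected counts are derived from the row and column marginals.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, observed_row in enumerate(table):
        for j, observed in enumerate(observed_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat
```

The resulting statistic is compared against the chi-square distribution with df = 1 (for a 2×2 table) to obtain the p-values the paper reports.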
Number of Coded Device-related Comments for each User-Group
Summary of main user group differences
pocketPC vs. tablet
Family physicians preferred the bottom-line format of EBOC and wanted more focused answers from CE.
They liked the high level mode of the drill down format.
They wanted to use the device with a larger screen.
Residents' needs were met with CE as they responded with more positive comments.
The detailed mode for drill down was fine for residents.
Both residents and general internists liked the small-screen form factor.
General internists were both positive and critical of CE.
They were also critical of the amount of drill-down detail provided (they wanted more).
The overall high frequency of positive usability comments about the Phase II prototype was encouraging. The high proportion of positive usability comments, versus the even proportion of content comments, demonstrates that although participants had mixed reviews about the evidence resources, participants found the prototype to be highly usable.
With regard to evidence resources, CE was better received than EBOC. Residents tended to have the most positive views, followed by family physicians and general internists. Family physicians seemed to prefer the bottom-line, or guideline-focused, format of EBOC, whereas the other groups tended to prefer the evidence presented in CE. General internists had more negative comments regarding the level of detail than residents and family physicians. It is possible that general internists within university-affiliated settings want more detail, while family physicians, who in our study were predominantly from non-university-affiliated settings, want the clinical bottom line. Since significant variability exists for these aspects of content, their format should be personalized and customized. In other words, the format of the evidence should be initially personalized to reflect the user's group needs. The user's work role would then determine the amount of evidence shown, the amount of detail provided about the evidence, and the placement of bottom-line guidelines. Additionally, users should be able to customize the interface by setting personal preferences. For example, a user may choose to show all, some or no tables of results, or to show only certain columns of tests that he or she is familiar with. Appropriate formatting settings will ensure that the user is not overloaded with information extraneous to his or her interests. Extraneous information can distract users from information necessary for their clinical decision-making.
Browsing through the content was generally preferred over searching. This finding is likely due to the difficulty of inputting text on handheld devices through a touch-screen keyboard. Family physicians demonstrated greater difficulty in navigating to the correct medical answer than the other groups. This effect may be due to a lesser familiarity with the evidence resources, as family physicians tended to use medical databases (e.g., Medline, Harrison's, MD Consult) less often than the other groups.
The main result of this study was the user groups' differing device preferences. Family physicians were more positive towards the tablet, while residents and internists preferred the pocketPC. These findings are echoed in the exit questionnaire. Family physicians preferred the screen size of the tablet and seemed less worried about its larger size. Family physicians were more concerned about usefulness and less about usability. For example, they were interested in how the device would fit into their office setting and whether it could be used to print material for patients. Family physicians' preference for the tablet reflects that they tend to stay in their office, or move between adjacent rooms, when seeing patients. Residents liked the screen size of the tablet, but tended to prefer the portability of the pocketPC. The usability of the device and the types of tools available played a significant role in their preference for the pocketPC. Internists also preferred the portability of the pocketPC and saw less advantage in having the large screen of the tablet than family physicians and residents. Internists' choice likely stems from a greater need for portability due to the mobile nature of their work. In summary, portability seems to be less of a factor for family physicians, which suggests that a larger screen can be used to meet their needs. Portability is more of a factor for the internists, who valued increased mobility. The residents, however, wanted both the screen size of the tablet and the portability of the pocketPC. The younger residents seem to have higher expectations for technology and look forward to new devices on the market that are lighter and have more screen coverage on a smaller body. It is worthwhile to note that the difference between devices is pronounced even though the tablet used in this study is smaller and lighter than those currently available on the market.
Physician type was a more sensitive predictor of differences in user needs, in terms of evidence resource format and device form-factor, than other potential predictors such as age and setting. However, members of physician groups tend to share some task and practice characteristics: variables such as age, time spent practicing, and search engine use correlated with physician type in a cluster analysis. The results of this study should therefore be interpreted cautiously, since other factors vary with physician type: for example, residents in this sample tended to be younger than the family physicians and general internists. At the same time, these group differences reflect current population demographics and medical practice.
To date there are few studies that examine physicians' use of evidence-based resources on mobile devices in situ. A small field study provided eleven residents with handhelds equipped with clinical evidence, an EBM calculator, a drug database and notes for a one-month period. Residents reported that they liked the device and the information provided; however, they wanted more resources and found the wireless network unreliable. Findings from the present study confirm the pocketPC form-factor as appropriate for this user group and could serve further to customize the EBM resources to increase the likelihood of adoption. A recent study deployed smart phones to link 31 physicians to online medical resources for information retrieval during clinical and academic activities in a community hospital for a seven-month period. It found mixed reports regarding whether interns and residents located the target information and regarding the impact of the information, though participants reported high satisfaction. There were also usability concerns about the small screen and keyboard, which correspond with findings in this paper. User testing such as that described in this study can serve to locate areas where the information presentation can be modified to better meet users' expectations and needs.
A Cognitive Engineering approach to studying physician group needs is a valuable complementary method to surveys. Since surveys are self-reports, often removed from the situation under question, they may differ from actual clinical behavior [1, 53]. Moreover, many reported surveys were not designed specifically for the purpose of detecting between-group differences. The current study is one of the few that have carried out more in-depth task-based usability research employing multiple Cognitive Engineering assessment techniques. In addition, this study may be one of the first in this domain to quantify qualitative statements obtained from think-aloud protocols in order to obtain a more reliable measure of participants' preferences. Finally, the study provides a previously unreported description of user differences for mobile computers and evidence resources.
One of the limitations of this study is its sample size. A sample of 47 physicians is too small to confirm subgroup differences in the entire population, thus its conclusions serve to generate hypotheses for future research. Further, these devices were tested in a controlled laboratory study; investigation of usage in a clinical setting is essential to inform design prior to deployment. The information gained has been used to modify the prototypes according to individual clinicians' preferences to be further tested in subsequent clinical trials. The positive usability feedback suggests that the prototype has evolved to meet users' information needs. However, testing with a different sample of physicians who did not volunteer for this study is needed to confirm this finding. This study is a good example of how iterative usability testing can be used to drive interface design .
In accordance with clinicians' requests for additional tools, current directions of the EPoCare project include the design of electronic prescribing and electronic health records for mobile devices to provide an integrated suite with the evidence resources. During this second phase of the EPoCare project, we examined hypotheses arising from the first phase of testing: how group differences interact with the usability of evidence resources on mobile devices. Applying the framework from the UK Medical Research Council to the current study, we see that different form-factors may impact physicians' productivity and satisfaction. A future clinical trial will focus on these variables' relationship to quality of care. General internists, residents and family physicians should be included in any relevant clinical trials, as they will likely experience different outcomes. Meanwhile, editors of evidence-based resources should consider personalizing the resources for different user groups in order to increase uptake and adoption. This work should also be extended to other user groups, including nurses and patients. Finally, creators of eHealth tools for physicians and publishers of evidence resources should be aware that one size does not fit all. Targeted end users must be included in the design, development and testing of all of these innovations, and their impact on clinical outcomes must be assessed.
Previous research underlines the criticality of meeting user needs in medical informatics systems [6–8, 22, 23, 39]. The present findings demonstrate that handheld presentation of clinical evidence should be personalized according to the requirements and preferences of different types of physicians. Regarding evidence resources, users demonstrate different needs for the amount of evidence shown and the level of detail provided. For example, only the conclusions from the strongest study designs should be shown to family physicians, versus the methods, statistical results, conclusions, and references from all studies for other groups. Family physicians prefer bottom-line guideline information more than the other groups. Regarding form-factor, family physicians prefer larger screens and are less concerned about mobility, while internists are most concerned about mobility. Residents present the most challenging design problem in their wish for both large screen size and high mobility. The strongest group differences were observed for physician type, with factors such as age, gender, and previous experience with the Internet and medical databases having relatively little effect on how the physicians responded to the prototype implementations. The information obtained in the current study demonstrates the value of adopting a rigorous framework of iterative development and evaluation concerning the use of mobile computers to improve clinical care.
Thanks to Adrian Alexander, Jennifer Guo, Anna Malandrino, Natalia Modjeska, Scott Orr, Harumi Takeshita, Eric Tursman, Peter Wong, and Robert Wu for their contributions to this project. Thanks to other members of the EPoCare team for their participation in this research: Geoff Anderson, Dave Davis, Dianne Davis, Mike Evans, Patricia Rodriguez Gianolli, Vivek Goel, Ilan Lenga, Natasha Martin, Greg McArthur, Mike Milton, Richa Mittel, John Mylopoulos, David Newton, Kashif Pirzada, Walter Rosser, and Lawrence Spero. Thanks to Holly Witteman for advice and statistical expertise. Thanks to Bell University Labs for their generous funding.
Details of funding: Funding was provided by Health Canada and Bell University Labs. Access to Clinical Evidence was provided by BMJ Publishing, and access to EBOC was provided by its editors, Chris Ball and Bob Phillips. Sharon Straus is funded by a Career Scientist Award from the Ontario Ministry of Health and Long-term Care, as a Principal Investigator in the Knowledge Translation Program, University of Toronto, and as a Tier 2 Canada Research Chair in Knowledge Translation and Quality of Care.
Statement of independence of researchers from funders: None of the researchers received personal support from the funding agencies involved, and the funders had no control over the study design, the analysis, or the preparation of the manuscript.
Details of ethical approval: Ethics approval was received from the University Health Network (UHN) review board.
- Criswell DF, Parchman ML: Handheld Computer Use in U.S. Family Practice Residency Programs. J Am Med Inform Assoc. 2002, 9: 80-86. 10.1197/jamia.M1234.
- Egan DE: Individual differences in human-computer interaction. Handbook of Human-Computer Interaction. Edited by: Helander M. 1988, Amsterdam: Elsevier Science Publishers, 543-568.
- Ash J: Organizational factors that influence technology diffusion in academic health sciences centres. J Am Med Inform Assoc. 1997, 4: 102-111.
- Southon FC, Sauer C, Dampney C: Information technology in complex health services: organizational impediments to successful technology transfer and diffusion. J Am Med Inform Assoc. 1997, 4: 112-124.
- Rogers DW, Jobes HM, Hinshaw JR, Lanzafame RJ: Ergonomics of medical lasers: operator's viewpoint. J Clin Laser Med Surg. 1992, 10 (3): 199-206.
- Lu YC, Xiao Y, Sears A, Jacko JA: A review and a framework of handheld computer adoption in healthcare. Int J Med Inform. 2005, 74: 409-422. 10.1016/j.ijmedinf.2005.03.001.
- Holzman TG, Griffith A, Hunter WG, Allen T, Simpson RJ: Computer-assisted trauma care prototype. Medinfo. 1995, 8 (Pt 2): 1685.
- Kushniruk AW, Triola M, Stein B, Borycki E, Kannry J: The Relationship of Usability to Medical Error: An Evaluation of Errors Associated with Usability Problems in the Use of a Handheld Application for Prescribing Medications. Medinfo. 2004, 11 (Pt 2): 1073-1076.
- Aronow WS: Under-utilisation of lipid-lowering drugs in older persons with prior myocardial infarction at a serum low-density lipoprotein cholesterol greater than 125 mg/dL. Am J Cardiol. 1998, 82: 668-669. 10.1016/S0002-9149(98)00401-9.
- Flaker GC, McGowan DJ, Boechler M, Fortune G, Gage B: Underutilization of antithrombotic therapy in elderly rural patients with atrial fibrillation. Am Heart J. 1999, 137: 307-312. 10.1053/hj.1999.v137.91403.
- Krumholz HM, Radford MJ, Wang Y, Chen J: National use and effectiveness of beta-blockers for the treatment of elderly patients after acute myocardial infarction. National Cooperative Cardiovascular Project. JAMA. 1998, 280: 623-629. 10.1001/jama.280.7.623.
- Mendelson G, Aronow WS: Underutilization of warfarin in older persons with chronic nonvalvular atrial fibrillation at high risk for developing stroke. J Am Geriatr Soc. 1998, 46 (11): 1423-1424.
- Stafford RS, Singer DE: Recent national patterns of warfarin use in atrial fibrillation. Circulation. 1998, 97: 1231-1232.
- Wong JH, Findlay JM, Suarez-Almazor ME: Regional performance of carotid endarterectomy: appropriateness, outcomes and risk factors for complications. Stroke. 1997, 28: 891-898.
- Haynes RB: Where's the meat in clinical journals (editorial)? Ann Intern Med. 1993, 119 (suppl): A22.
- Putman W, Twohig PL, Burge FI, Jackson LA, Cox JL: A qualitative study of evidence in primary care: what the practitioners are saying. CMAJ. 2002, 166 (12): 1525-1530.
- Sackett DL, Straus SE: Finding and Applying Evidence During Clinical Rounds: The "Evidence Cart". JAMA. 1998, 280: 1336-1338. 10.1001/jama.280.15.1336.
- Gosling AS, Westbrook JI: Allied health professionals' use of online evidence: a survey of 790 staff working in the Australian public hospital system. Int J Med Inform. 2004, 73: 391-401.
- Gosling AS, Westbrook JI, Braithwaite J: Clinical Team Functioning and IT Innovation: A Study of the Diffusion of a Point-of-Care Online Evidence System. J Am Med Inform Assoc. 2003, 10: 244-251. 10.1197/jamia.M1285.
- Westbrook JI, Gosling AS, Coiera E: Do Clinicians Use Online Evidence to Support Patient Care? A Study of 55,000 Clinicians. J Am Med Inform Assoc. 2004, 11: 113-120. 10.1197/jamia.M1385.
- Medical Research Council: A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health. 2000.
- Portoni L, Combi C, Pinciroli F: User-oriented views in health care information systems. IEEE Trans Biomed Eng. 2002, 49 (12): 1387-1398. 10.1109/TBME.2002.805455.
- Boralv E, Goransson B, Olsson E, Sandblad B: Usability and efficiency. The HELIOS approach to development of user interfaces. Comput Methods Programs Biomed. 1994, 45 Suppl: S47-S64.
- Ammenwerth E, Buchauer A, Bludau B, Haux R: Mobile information and communication tools in the hospital. Int J Med Inform. 2000, 57: 21-40. 10.1016/S1386-5056(99)00056-8.
- Manning B, Gadd CS: Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry. Proc AMIA Symp. 2001, 428-432.
- Beasley BW: Utility of palmtop computers in a residency program: a pilot study. South Med J. 2002, 95 (2): 207-211.
- Gillingham W, Holt A, Gillies J: Hand-held computers in healthcare: what software programs are available? N Z Med J. 2002, 115 (1162): U185.
- Rothschild JM, Lee TH, Bae T, Bates DW: Clinician Use of a Palmtop Drug Reference Guide. J Am Med Inform Assoc. 2002, 9: 223-229. 10.1197/jamia.M1001.
- Lu YC, Lee JK, Xiao Y, Sears A, Jacko JA, Charters K: Why don't physicians use their personal digital assistants? Proc AMIA Annu Symp. 2003, 404-405.
- McLeod TG, Ebbert JO, Lymp JF: Survey assessment of personal digital assistant use among trainees and attending physicians. J Am Med Inform Assoc. 2003, 10 (6): 605-607. 10.1197/jamia.M1313.
- Barrett JR, Strayer SM, Schubart JR: Assessing medical residents' usage and perceived needs for personal digital assistants. Int J Med Inform. 2004, 73 (1): 25-34. 10.1016/j.ijmedinf.2003.12.005.
- Brilla R, Wartenberg KE: Introducing new technology: handheld computers and drug databases. A comparison between two residency programs. J Med Syst. 2004, 28 (1): 57-61. 10.1023/B:JOMS.0000021520.50986.48.
- Busch JM, Barbaras L, Wei J, Nishino M, Yam CS, Hatabu H: A mobile solution: PDA-based platform for radiology information management. Am J Roentgenol. 2004, 183 (1): 237-242.
- Carroll AE, Christakis DA: Pediatricians' use of and attitudes about personal digital assistants. Pediatrics. 2004, 113 (2): 238-242. 10.1542/peds.113.2.238.
- Choi J, Chun J, Lee K, Lee S, Shin D, Hyun S, Kim D, Kim D: MobileNurse: hand-held information system for point of nursing care. Comput Methods Programs Biomed. 2004, 74: 245-254. 10.1016/j.cmpb.2003.07.002.
- Joy S, Benrubi G: Personal digital assistant use in Florida obstetrics and gynecology residency programs. South Med J. 2004, 97 (5): 430-433. 10.1097/00007611-200405000-00002.
- McAlearney AS, Schweikhart SB, Medow MA: Doctors' experience with handheld computers in clinical practice: qualitative study. BMJ. 2004, 328 (7449): 1162. 10.1136/bmj.328.7449.1162.
- Ericsson K, Simon HA: Protocol Analysis: Verbal Reports as Data. 1984, Cambridge, MA: MIT Press.
- Jaspers MW, Steen T, van den Bos C, Geenen M: The think aloud method: a guide to user interface design. Int J Med Inform. 2004, 73 (11-12): 781-795. 10.1016/j.ijmedinf.2004.08.003.
- Sackett DL, Straus SE: Bringing evidence to the point of care. JAMA. 1999, 281: 1171-1172. 10.1001/jama.281.13.1171.
- Straus SE, Evans M, Davis DA, Goel V: Bringing evidence to the point of care: needs of clinicians. JGIM. 2002, 17: 213. 10.1046/j.1525-1497.2002.20102.x.
- Takeshita H, Davis D, Straus SE: Clinical Evidence at the Point of Care in Acute Medicine: A Handheld Usability Case Study. Proc HFES Ann Mtg. 2002, 1409-1413.
- Wiggins RH: Personal Digital Assistants. J Digit Imaging. 2004, 17 (1): 5-17. 10.1007/s10278-003-1665-8.
- Lottridge DL, Chignell MC, Straus SE: Requirements Analysis for Customization using Large Sample User Testing: A Case Study of Mobile Computing in Healthcare. Int J Human Comp Studies, accepted.
- Straus SE, Evans M, Goel V, Davis D: Bringing evidence to the point of care: what do clinicians want? [Abstract]. BioMed Central Meeting Abstracts: 4th International Cochrane Colloquium. 2001, 1: op047.
- Kushniruk AW, Patel VL, Cimino JJ: Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces. Proc AMIA Annu Symp. 1997, 218-222.
- Clinical Evidence [monograph on CD-ROM], issue 6. 2001, London: BMJ Publishing Group Ltd.
- Straus SE, Hsu SI, Ball CM, Phillips RS: Evidence-Based On-Call: Acute Medicine. 2001, Edinburgh: Churchill Livingstone.
- Straus SE, Richardson WS, Glasziou P, Haynes RB: Evidence-Based Medicine: How to Practice and Teach EBM. 2005, Edinburgh: Churchill Livingstone, third edition.
- DeYoung C, Spence I: Profiling information technology users: En route to dynamic personalization. Comput Human Behav. 2004, 20: 55-65. 10.1016/S0747-5632(03)00045-1.
- National Physician Survey 2005. 2005, College of Family Physicians of Canada (CFPC), [http://www.nationalphysiciansurvey.ca/nps/reports/publications-e.asp#2005]
- Lottridge DL, Chignell MC, Straus SE: Physicians' Responses to Handheld Presentation of Clinical Evidence: Analysis of Group Differences. Proc HFES Ann Mtg. 2004, 1783-1787.
- Ridderikhoff J, van Herk B: Who is afraid of the system? Doctors' attitude towards diagnostic systems. Int J Med Inform. 1999, 53: 91-100. 10.1016/S1386-5056(98)00145-2.
- Lottridge DM, Chignell M, Danicic-Mizdrak R, Straus SE: The When, Where and Why of Mobile Access to Medical Evidence: A Socio-Technical Analysis of a Field Trial. Proc Int'l Conf ITHC. 2004, #61: 5.
- León SA, Fontelo P, Green L, Ackerman M, Liu F: Evidence-based medicine among internal medicine residents in a community hospital program using smart phones. BMC Med Inform Decis Mak. 2007, 7: 5. 10.1186/1472-6947-7-5.
- Kushniruk AW, Patel VL: Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform. 2004, 37: 56-76. 10.1016/j.jbi.2004.01.003.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6947/7/22/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.