
How do medical doctors use a web-based oncology protocol system? A comparison of Australian doctors at different levels of medical training using logfile analysis and an online survey

Abstract

Background

Electronic decision support is commonplace in medical practice. However, its adoption at the point-of-care is dependent on a range of organisational, patient and clinician-related factors. In particular, level of clinical experience is an important driver of electronic decision support uptake. Our objective was to examine the way in which Australian doctors at different stages of medical training use a web-based oncology system (eviQ).
Methods

We used logfiles to examine the characteristics of eviQ registrants (2009–2012) and patterns of eviQ use in 2012, according to level of medical training. We also used a web-based survey to evaluate the way doctors at different levels of medical training use the online system and to elicit perceptions of the system’s utility in oncology care.
Results

Our study cohort comprised 2,549 eviQ registrants who were hospital-based medical doctors across all levels of training. 65% of the cohort used eviQ in 2012, with 25% of interns/residents, 61% of advanced oncology trainees and 47% of speciality-qualified oncologists accessing eviQ in the last 3 months of 2012. The cohort accounted for 445,492 webhits in 2012. On average, advanced trainees used eviQ up to five-times more than other doctors (42.6 webhits/month compared to 22.8 for specialty-qualified doctors and 7.4 webhits/month for interns/residents). Of the 52 survey respondents, 89% accessed eviQ’s chemotherapy protocols on a daily or weekly basis in the month prior to the survey. 79% of respondents used eviQ at least weekly to initiate therapy and to support monitoring (29%), altering (35%) or ceasing therapy (19%). Consistent with the logfile analysis, advanced oncology trainees report more frequent eviQ use than doctors at other stages of medical training.

Conclusions

The majority of the Australian oncology workforce are registered on eviQ. The frequency of use directly mirrors the clinical role of doctors and attitudes about the utility of eviQ in decision-making. Evaluations of this kind generate important data for system developers and medical educators to drive improvements in electronic decision support to better meet the needs of clinicians. This end-user focus will optimise the uptake of systems which will translate into improvements in processes of care and patient outcomes.


Background

Evidence-based practice is the cornerstone of modern medicine. Rapid advances in medicine and information technology have provided the necessary impetus for the development and deployment of electronic decision support systems [1–3]. These systems synthesise large bodies of evidence, a task beyond the capacity of any individual clinician. Electronic decision support systems have grown in popularity, play an important role in providing up-to-date resources for point-of-care use, and have been demonstrated to improve processes of medical care and patient outcomes [4–7].

Despite the significant benefits of electronic decision support, its adoption has been highly variable. Well-documented barriers to access are organisational, provider and patient-related [6, 8–10]. Importantly, there are concerns on the part of clinicians, particularly doctors, that reliance on such systems may lead to deskilling in decision-making. Moreover, many experienced clinicians report that their practices already align with best evidence and that electronic decision support threatens professional autonomy. Conversely, these systems are more likely to be used when clinicians perceive they enhance decision-making and improve practice. As the medical profession becomes more technologically savvy and the culture continues to embrace the necessity of having the most up-to-date evidence at clinicians’ fingertips, some of these traditional barriers may carry less importance. These changes are also likely to be more apparent in medical specialties where new technologies and treatments are changing rapidly and there is a high risk of adverse patient outcomes.

Oncology practice is particularly demanding due to the complex nature of care and the challenges of achieving the delicate balance between maximising treatment effects and minimising toxicity [11, 12]. As such, medical oncology, like many other medical specialties, has taken advantage of web-based technology by developing online guideline and protocol systems to support clinicians in their day-to-day practice [13]. However, the rapid proliferation of these online systems globally has not been accompanied by comprehensive evaluation of their use and impact in cancer treatment settings.

We have developed a multi-faceted research program evaluating an Australian web-based oncology system, eviQ treatments online [13–18]. We have demonstrated high rates of eviQ adoption by all cancer care health professionals but the nature and extent of use is highly dependent on clinicians’ specific roles in cancer care [14]. Our interview-based study of oncology practitioners found that junior cancer clinicians accessed eviQ more frequently than their senior counterparts. This finding is driven mainly by levels of familiarity with treatment practices; many senior doctors felt their experience negated the necessity to refer to the protocol system while junior doctors relied heavily on the program to guide decision-making [15, 16]. Importantly, junior doctors were more inclined to embrace information technology than their senior colleagues and felt that eviQ gave them a greater sense of autonomy in their day-to-day practice.

Our aim is to expand and strengthen our previous research by examining how medical doctors at different stages of training use eviQ. Specifically, we will examine:

  1. Characteristics of medical doctors registered on eviQ;

  2. Patterns and frequency of eviQ use (logfile analysis) according to level of medical training and years of oncology experience; and

  3. Doctors’ perceptions about how eviQ is used in clinical practice (online survey) and how responses of clinicians in training compare with those of specialty-qualified doctors.

General methods

Study setting

In Australia, cancer services are funded primarily by Australia’s universal health care funding arrangements. Medical and radiation oncology therapies are mostly delivered in the ambulatory care (outpatient) setting at metropolitan hospitals (university-affiliated, tertiary referral centres covering geographic areas of around 75 square kilometres), regional centres (with catchments up to 1,200 square kilometres) and rural hospitals (with catchments up to 3,400 square kilometres). Australia has over 700 public hospitals with a total of approximately 55,000 beds [19].

eviQ cancer treatments online

eviQ is a web-based oncology protocol system managed by the Cancer Institute New South Wales (NSW), a government-funded agency established to improve cancer control. eviQ primarily targets health professionals involved in implementing cancer care by providing detailed and extensive instructions on how to deliver evidence-based treatments safely and appropriately. Treatment information encompasses adolescent and young adult care, cancer genetics, haematology, haematopoietic progenitor cell transplants, medical oncology, nursing, primary health, palliative care and radiation oncology. The site comprises over 1,300 protocols, developed by a consensus process involving specialist physicians, nurses, pharmacists and allied health practitioners from across Australia. Each protocol undergoes a comprehensive review every 1 to 2 years. While the primary eviQ target audience is health professionals, the site also publishes information tailored specifically to cancer patients and their carers.

Until October 2009, eviQ was known as the Cancer Institute Standard Cancer Treatments Program (CI-SCaT); the system underwent a major rebuild and rebranding to better meet the specific needs of health professionals and patients. The previous platform (CI-SCaT) was taken offline on 31 March 2010 to give users sufficient opportunity to transition and register on the eviQ website.

Medical training in Australia

In Australia, medical training generally occurs according to the following pathway: completion of a medical degree (4 to 6 years’ duration); at least 2 years of general medical training as an ‘intern/resident’; at least 2 years of training as a ‘registrar’; and at least 3 years of training as an ‘advanced trainee’ (most of this time is focused on the doctor’s chosen medical specialty, such as medical oncology or haematology). Doctors are specialty-qualified at the completion of their advanced training, at which time they become ‘fellows’ and ‘staff specialists’.

Doctors can be exposed to oncology practice from the time of internship, when they may rotate through the specialty as part of their general medical training. Advanced trainees choosing an oncology specialty will practise almost exclusively in this field; will most often be the key interface between patients and the treating medical team; and will also supervise more junior medical staff. Staff specialists are ultimately responsible for the medical team and treatment decisions.

Ethical approval

This study was approved by the Population and Health Services Research Ethics Committee (HREC/10/CIPHS/70). We sought a waiver of consent to undertake the logfile analysis. We obtained informed consent from cancer clinicians for the completion of the online survey.

Phase 1: eviQ logfile analysis


Study design

We conducted a retrospective study of Australian medical doctors registered on eviQ between October 2009 (eviQ launch date) and December 2012. We examined registrant details and patterns of use from web-logfiles generated from the eviQ platform in 2012. Specifically, we compared eviQ use in doctors at various stages of medical training, from intern/resident through to the most senior doctors, staff specialists.

Data sources and analysis

The eviQ secretariat provided the research team with access to data from the eviQ platform. Registrant demographic and logfile data were obtained on-site at the Cancer Institute NSW in unit record format (stripped of personal identifiers such as usernames). The eviQ platform can generate data on the characteristics of all registered users, including registrant type (individual clinician or unit registration), role (e.g., medical, nursing, pharmacy), health sector (public, private, or both), main area of work (e.g., medical oncology, haematology), geographical location of practice, years of oncology experience, and source of referral to the eviQ website. This information is reported by users upon registration, and website registrants are prompted to update it on an annual basis.

The eviQ platform also generates logfiles that record webhits, defined as one click anywhere on the eviQ website. Logfiles can be stratified by any of the aforementioned variables (e.g., clinician type, years of oncology experience) and by the time at which the webhits occur (e.g., time of day, month, year). However, the current eviQ logfile reports are aggregated and cannot be used to examine clinician-level use. Further, our inability to examine variability between clinicians precluded formal statistical analyses.

Study cohort (Figure 1)

Our cohort included Australian health professionals registered as individual clinicians and identifying themselves as medical doctors upon eviQ registration (Figure 1); approximately 8% of all registrations are medical units [14]. Medical unit registrations were excluded from our current analysis as they are likely to represent the web-activity of a group of health professionals with varying roles and years of clinical experience. We also excluded medical students, doctors whose level of training could not be determined, and doctors unlikely to be practicing in the hospital setting (those identifying as general practitioners or primary care physicians).

Figure 1

Flowchart of study cohort derivation.

Data analysis and reporting

We used Microsoft Excel for logfile analyses and report on the following for our study cohort:

Registrant characteristics

We describe the registrant characteristics of our study cohort, including level of training (interns/residents, registrars, advanced trainees, fellows or staff specialists), main area of work, year of registration, years of oncology experience, health sector, location (NSW or another Australian state), and eviQ referral source. We also report on recency of eviQ use in order to distinguish between frequent and infrequent eviQ users (login in the last 3 months, 6 months or 12 months of 2012).
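As an illustration, recency of use can be bucketed from a last-login date as sketched below. This is not the eviQ platform's own report logic; the function name, the date field and the day-count cut-offs are assumptions chosen to mirror the categories above.

```python
from datetime import date

def recency_bucket(last_login: date, period_end: date = date(2012, 12, 31)) -> str:
    """Classify a registrant by recency of last login relative to the end
    of the 2012 reporting period (last 3/6/12 months, or no use in 2012)."""
    days = (period_end - last_login).days
    if days < 0:
        raise ValueError("login falls after the reporting period")
    if days <= 91:       # roughly the last 3 months of 2012
        return "last 3 months"
    if days <= 182:      # roughly the last 6 months
        return "last 6 months"
    if days <= 365:      # roughly the last 12 months
        return "last 12 months"
    return "no use in 2012"

# A hypothetical registrant who last logged in on 20 November 2012
print(recency_bucket(date(2012, 11, 20)))  # last 3 months
```

The approximate month boundaries (91/182/365 days) are a simplification; calendar-month cut-offs would shift a handful of borderline registrants between buckets.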

We present these data for all medical doctors in our cohort and by level of training: interns/residents, registrars, advanced trainees, fellows, or staff specialists.

Patterns of eviQ use

We describe the patterns of eviQ use for our study cohort using logfiles generated in 2012. Importantly, we examined patterns of use from the launch of eviQ in October 2009 through to December 2012 and noted no trends in use over time, with the exception of reductions over the Australian holiday period (December–January). As such, the current manuscript focuses solely on logfiles from the most recent year, 2012.

We report webhits by month (January-December 2012) using two metrics: raw webhits and rates (defined as webhits/100 registered medical doctors). The denominator for our rate analysis was the number of medical doctors who accessed the website at least once in 2012; approximately 35% of registrants did not use eviQ in 2012.
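The rate metric can be computed from aggregated counts as follows; the figures in the example are hypothetical, not eviQ data, and the function name is ours.

```python
def rate_per_100(total_webhits: float, active_doctors: int) -> float:
    """Rate of use: webhits per 100 medical doctors who accessed the
    website at least once in the period (the denominator defined above)."""
    if active_doctors <= 0:
        raise ValueError("denominator must be a positive count of doctors")
    return 100 * total_webhits / active_doctors

# Hypothetical month: 10,000 webhits generated by 500 active doctors
print(rate_per_100(10_000, 500))  # 2000.0 webhits per 100 active doctors
```

Using doctors who accessed the site at least once (rather than all registrants) as the denominator avoids diluting the rate with the ~35% of registrants who were inactive in 2012.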

We stratified our analyses by the following:

  • Level of training: interns/residents, registrars, advanced trainees, fellows, and staff specialists.

  • Years of oncology experience: <2, 2–5, 5–10, and >10 years (categories pre-defined by the eviQ website).

We also analysed raw webhits and rates of use by time of day and day of the week. Our previous logfile analyses of all Australian health professionals demonstrated that 87% of eviQ activity occurred during standard clinic hours (08:00–18:00, Monday through Friday) [14]. We examined whether similar patterns are observed in medical doctors overall and by level of training.
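A minimal sketch of the clinic-hours classification used above (08:00–18:00, Monday through Friday); the timestamps are hypothetical and the 18:00 boundary is treated as exclusive, an assumption on our part.

```python
from datetime import datetime

def in_clinic_hours(ts: datetime) -> bool:
    """True if a webhit falls within standard clinic hours:
    08:00-18:00, Monday (weekday 0) through Friday (weekday 4)."""
    return ts.weekday() < 5 and 8 <= ts.hour < 18

def clinic_hours_share(timestamps: list[datetime]) -> float:
    """Proportion of webhits occurring during standard clinic hours."""
    hits = sum(in_clinic_hours(t) for t in timestamps)
    return hits / len(timestamps)

# Hypothetical webhits: two during clinic hours, one on a Saturday
hits = [
    datetime(2012, 3, 5, 9, 30),   # Monday 09:30 -> in hours
    datetime(2012, 3, 7, 17, 59),  # Wednesday 17:59 -> in hours
    datetime(2012, 3, 10, 11, 0),  # Saturday -> out of hours
]
print(clinic_hours_share(hits))
```

Applied to a full year of webhit timestamps, this yields the in-hours proportion reported in the results (87.7% for our cohort).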

Results

Registrant characteristics (Figure 1, Table 1)

As of December 2012, there were 20,526 Australian health professionals registered on eviQ, 3,150 (15.3%) of whom were medical doctors. After excluding general practitioners, students, and registrants not specifying their level of training, our cohort consisted of 2,549 medical doctors (Figure 1).

Table 1 Characteristics of study cohort

Doctors were at various stages of oncology training: interns/residents (21.5%), registrars (26.2%), advanced trainees (14.0%), fellows (5.3%), and staff specialists (32.9%) (Table 1). Our cohort worked primarily in medical oncology (34.5%), followed by haematology (17.3%), general medicine (17.1%), and radiation oncology (13.2%); most worked in the public sector (85.8%). Years of oncology experience generally matched level of training; 88.5% of interns had <2 years of oncology experience and 80% of staff specialists had 5–10 or >10 years of oncology experience.

Overall, 65.1% of our cohort had used eviQ at least once during 2012. A greater proportion of advanced trainees (61.2%) and staff specialists (46.9%) used eviQ in the last 3 months of 2012 compared with interns/residents (25.1%), registrars (36.3%), and fellows (38.5%).

The majority of doctors were referred to eviQ by a peer or colleague (73.1% overall); this varied by level of training, ranging from 54.7% (staff specialists) to 89.8% (interns/residents).

Patterns of use

Webhits by level of training (Figure 2)

Our cohort accounted for 445,492 total webhits in 2012. Staff specialists and advanced trainees had higher numbers of raw webhits compared with doctors at other levels of training; this pattern was consistent across 2012 (Figure 2).

Figure 2

Webhits by level of training during 2012. A) Raw webhits; B) Rates of use: hits per 100 medical doctors that accessed eviQ during 2012.

On average, medical doctors had 22.4 hits/month, but rates of use varied by level of training, with advanced trainees using eviQ at up to five times the rate of other medical doctors. Specifically, advanced trainees had an average of 42.6 hits/month whereas other groups had fewer: interns/residents (7.4 hits/month), registrars (20.0 hits/month), fellows (18.4 hits/month) and staff specialists (22.8 hits/month). This pattern was consistent across 2012. Further, based on our previous research demonstrating that 9 pages are viewed during a typical user session [14], we can approximate that, on average, advanced trainees use eviQ at least weekly whereas other doctors appear to use the site on a fortnightly or monthly basis.
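The back-of-envelope conversion from webhits to visit frequency works as follows, assuming (per our earlier finding) roughly 9 pages viewed per session:

```python
PAGES_PER_SESSION = 9  # typical pages viewed per session, from prior logfile work [14]

def sessions_per_month(hits_per_month: float) -> float:
    """Approximate site visits per month from average monthly webhits."""
    return hits_per_month / PAGES_PER_SESSION

# Average webhits/month by level of training (figures from the logfile analysis)
for group, hits in [("advanced trainees", 42.6),
                    ("staff specialists", 22.8),
                    ("registrars", 20.0),
                    ("fellows", 18.4),
                    ("interns/residents", 7.4)]:
    print(f"{group}: ~{sessions_per_month(hits):.1f} visits/month")
```

This gives roughly 4.7 visits/month for advanced trainees (about weekly) against 0.8–2.5 visits/month for the other groups (fortnightly to monthly), which is the basis of the approximation above.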

Webhits by years of oncology experience (Figure 3)

Raw webhits by years of oncology experience were proportional to the size of the clinician groups; clinicians with <2 years of experience accounted for the greatest number of webhits, followed by clinicians with 2–5 years, >10 and 5–10 years of experience; patterns were consistent during the 12-month study period (Figure 3). In contrast to our analysis by level of training, our rate analysis demonstrated less variation in eviQ use by years of oncology experience; mean hits per clinician/month ranged from 17.0 (>10 years’ experience) to 25.9 (2–5 years’ experience). This most likely reflects the fact that professional role (rather than years of oncology experience) determines the nature of doctors’ patient load and thus the utility of eviQ in day-to-day practice. For example, advanced trainees specialising in oncology are likely to see substantially more cancer patients than interns or registrars who rotate across more diverse areas of medicine.

Figure 3

Webhits by years of oncology experience in 2012. A) Raw webhits; B) Rates of use: hits per 100 medical doctors that accessed eviQ during 2012.

Webhits by time of day (Figure 4)

Our cohort used eviQ primarily during standard clinic hours, with 87.7% of all webhits occurring between 08:00–18:00 Monday to Friday; this pattern of use was similar across all levels of training (Figure 4).

Figure 4

Webhits by time of day in medical doctors according to level of training in 2012. A) Raw webhits; B) Rates of use: hits per 100 medical doctors that accessed eviQ during 2012.

Phase 2: Online survey of medical doctors


Study design

Our objective was to compare how eviQ is used by clinicians in training and by senior doctors. We developed a 14-item self-report survey to assess doctors’ use of eviQ in their clinical practice. A link to the survey, accompanied by an invitation to complete it, was placed on the eviQ home page for a 7-week period (September–October 2012). The survey was estimated to take no more than 15 minutes to complete and was hosted on a server external to eviQ (Google Docs). Upon completion, respondents were invited to send their email address to the researchers to enter a draw to win an Apple iPad, which was delivered to the winner in November 2012.

Survey respondents

Our target population was Australian medical doctors. As with phase 1, we excluded survey respondents who were general practitioners, students, other health professionals, and those practising outside of Australia.

Survey development and content

The survey was developed based on our previous survey work [15, 16] and in consultation with the eviQ secretariat, and was pilot tested with oncologists and medical students. As such, we established the instrument’s validity but not its reliability. The survey consisted of 14 questions: 13 forced-choice and one free-text. We collected demographic details and level of training (questions 1–8) and frequency of use of eviQ tools in the past month (e.g., chemotherapy protocols; questions 9–11). We also assessed frequency of eviQ use for clinical tasks (e.g., guiding chemotherapy administration; question 12) and the extent to which respondents agreed with 4 statements about eviQ (e.g., “is integral to my clinical practice”; question 13). Finally, we asked respondents to list any other computer-based systems or websites they use to support their clinical practice. See Additional file 1 for a complete version of the survey.

Analysis and reporting

Survey data were analysed using SPSS and Microsoft Excel. We report demographic details and survey responses overall and by level of training.

Results

Characteristics of survey respondents (Table 2)

Of 67 survey respondents, 52 were Australian medical doctors (Table 2). Respondents were at various stages of their training, including interns/residents (n = 9, 17.3%), registrars (n = 13, 25.0%), advanced trainees (n = 15, 28.8%), fellows (n = 8, 15.4%) and staff specialists (n = 7, 13.5%). Due to small sample sizes, we describe survey results for interns/residents/registrars as one group (n = 22) and fellows/staff specialists as one group (n = 15). Similar to the characteristics of all medical doctors registered with eviQ (phase 1), most respondents worked in medical oncology (57.7%) and the public sector (94.2%). The majority of respondents had used eviQ on a daily basis in the last month (53.8%); the remainder used eviQ weekly (34.6%) or fortnightly (9.6%).

Table 2 Characteristics of survey respondents meeting study eligibility criteria (N = 52)

Survey responses (Table 3)

Due to the low survey response rate and small numbers of respondents according to each level of training, we present outcomes for all respondents with some general findings by level of training.

Table 3 Survey responses of medical doctors meeting study eligibility criteria (N = 52)
eviQ tools

The majority of respondents had used eviQ tools in the past month, with the following tools used on a daily or weekly basis: chemotherapy protocols (88.8%), assessment tools (69.2%), drug calculator (63.4%), and clinical procedures (53.8%) (Table 3). Cancer genetics, patient information, supportive therapy, and discussion boards were the least frequently used tools (6–40% of respondents used these tools on a daily/weekly basis).

Advanced trainees reported using eviQ tools more frequently than the other medical groups. On average, over 70% of advanced trainees (11/15 respondents) used eviQ tools on a daily/weekly basis compared with less than 30% of fellows/staff specialists (4/15 respondents) and just over half of interns/registrars (13/22 respondents). These results contrast with phase 1, in which interns/residents had the lowest rates of use. However, survey respondents had visited the site during the survey period and were most probably engaged in an oncology rotation at the time. In contrast, the rates of use reported in phase 1 were based on the logfiles of any doctor who had accessed the site in 2012. Interns and residents rotate through different specialties approximately every 10 weeks during their training, and it is highly unlikely they would access the site once a rotation is complete.

eviQ for clinical tasks

Consistent with its primary purpose, eviQ was most commonly used to initiate therapy (78.9% used eviQ daily or weekly for this task). However, as previously reported [15, 16], eviQ was used across the full spectrum of cancer care, including monitoring (28.8%), altering (34.6%), and ceasing therapy (19.2%) on a daily/weekly basis.

eviQ was used frequently (daily/weekly) for a number of specific tasks, including comparing treatment options (57.7%), prescribing medications (50%), calculating drug doses (65.4%), guiding chemotherapy administration (73.1%), and accessing evidence-based practice information (69.3%). eviQ was not routinely used to guide radiotherapy administration, access treatment cost information, or as a teaching resource (19.2%, 25.0%, and 30.8% reported using eviQ for these tasks on a daily/weekly basis, respectively).

Overall, advanced trainees reported using eviQ more frequently for clinical tasks than interns/registrars and staff specialists/fellows; approximately 70% of advanced trainees used eviQ daily/weekly for clinical tasks, compared with 50% of interns/registrars and 25% of staff specialists/fellows.

Clinicians’ attitudes towards eviQ

The majority of respondents viewed eviQ favourably. It was endorsed as an up-to-date resource (94.2%), integral to clinical practice (78.9%), and a resource that allows clinicians to function autonomously (80.8%). Further, 67.3% of respondents stated that it was their primary source of oncology information. No respondent 'strongly disagreed' with any statement.

Unlike the frequency of eviQ use for various clinical tasks, clinicians’ attitudes toward eviQ were generally consistent across levels of training. Across the four statements, advanced trainees viewed eviQ most favourably (87–100% agreed/strongly agreed), followed by interns/registrars (59–96%), and staff specialists/fellows (53–87%). The greatest variation was in relation to eviQ as a primary information source; 53% of staff specialists/fellows and 59% of interns/registrars agreed/strongly agreed, compared with 93% of advanced trainees.

Use of other web-based resources

Overall, 23.1% of respondents listed other online resources they used to support their oncology practice. These included journals/research databases, other oncology websites, and hospital-specific point-of-care programs.

Discussion

Our study complements a large program evaluation of a web-based oncology protocol system [13–18]. eviQ has been rated among the highest quality online cancer care applications internationally [13, 18] and is used at the point-of-care by the key health professionals involved in the delivery of cancer care across Australia [14–16]. In this study, we focus specifically on medical doctors and their use of the system across the spectrum of medical training. Our multi-method approach builds upon our previous work and, for the first time, allows us to understand the profile of medical doctors using eviQ frequently and intermittently.

Our study suggests that eviQ is an integral part of medical practice in Australia. Importantly, the number of medical oncologists registered on eviQ is similar to that reported in a recent audit of the Australian medical oncology workforce, highlighting the widespread interest in the system in this group of cancer care professionals [20]. Further, given global and local concerns about shortages in the medical oncology workforce [20, 21] and clinician burnout [22], systems like eviQ that synthesise the growing evidence-base and streamline clinical practice [15, 16] will be increasingly important for the safe and effective delivery of cancer care.

Our results demonstrate important differences in eviQ use according to level of medical training. We found rates of use increase according to level of training and peak with advanced trainees. After doctors are specialty-qualified their rates of use decline to match those of registrars. This is consistent with our previous research that found doctors in training viewed eviQ as important in guiding their clinical practice and for professional autonomy. In contrast, specialty-qualified doctors expressed reservations about routine use of standardised protocols and were less likely to use the system than their more junior colleagues [16].

The different patterns of eviQ use observed in this study directly mirror the roles of clinicians in oncology care at different levels of training. Specifically, interns, residents and registrars rotate across medical specialties and would generally only use eviQ during oncology terms. Advanced trainees plan to specialise in oncology and are likely to have the greatest patient load, including direct involvement in initiating and monitoring treatments (under the direct supervision of a specialty-qualified clinician). Finally, staff specialists generally subspecialise in specific cancer treatments (e.g., breast cancer) and become highly familiar with a core group of protocols and treatments. They may only use eviQ when faced with new or unfamiliar clinical situations [15, 16]. Nevertheless, the lower rates of use observed in staff specialists may be viewed as suboptimal, as specialty-qualified doctors are ultimately responsible for the outcomes of oncology care. Alternatively, our results may indicate that this group of doctors remains up-to-date with the latest evidence using other information sources.

Differences in use across levels of training are not unique to online systems in oncology practice. A large-scale evaluation of another Australian evidence-based resource found junior clinicians were more likely to be aware of and use the system at the point-of-care than their senior counterparts [23]. Given that online systems have been found to improve accuracy and confidence of clinicians’ decisions [24, 25], reticence on the part of senior doctors may reflect their enduring preference to rely on their own experience or that of their colleagues as their primary information source for point-of-care decisions [26].

Our targeted analysis of doctors’ use of eviQ at various stages of their medical training made use of improvements to the eviQ platform since our previous analyses [14]. Our logfile analysis demonstrated that a significant proportion of medical registrants are not regular visitors to the site. As such, the rate analyses presented in the current study are likely to reflect current use of the system more accurately (22 hits/doctor vs 14 hits/doctor in our previously published logfile analysis). Even with improvements to the capabilities of the eviQ platform, a limitation of our study was the platform’s inability to examine use at the level of the individual doctor and to track this use over time. Clearly, consideration of these issues has important implications for the interpretation of evaluations of this kind.

A further limitation of our current analysis is the lack of specificity of the eviQ platform in identifying years of oncology experience, and the possible underestimation of use by medical doctors who may have accessed eviQ through oncology unit registrations. We have previously demonstrated that rates of eviQ use through unit registrations are three times those of individual clinicians [14]. Moreover, we attempted to complement our logfile analyses with an online survey, but the low response rate, even when we offered an incentive to respond, highlights the challenge of gaining insights from a group of busy doctors, something that needs to be improved to enhance ongoing monitoring activities. Additionally, our survey only captured responses from doctors who used the website during the 7-week window that the survey was online; as such, it is unlikely that our sample is representative of all medical doctors who use eviQ.

Conclusions

Electronic decision support systems are undoubtedly the future of medicine [2]. However, the provision of even the most state-of-the-art system does not guarantee uptake, and this remains an ongoing challenge for contemporary medical practice. An important consideration is that these systems should never replace interactions with colleagues and patients. Rather, system developers, medical educators and health care administrators need to consider how these systems can best complement such interactions to enhance medical care. Multi-method evaluation approaches examining end-user attitudes, knowledge and behaviour will add significant value to achieving this goal. Importantly, evaluation should also aim to better understand the way electronic decision support improves real-world processes and outcomes of care, to assess the value for money or cost-effectiveness of its deployment.

References

  1. Brandt AM: A reader's guide to 200 years of the New England Journal of Medicine. N Engl J Med. 2012, 366 (1): 1-7. 10.1056/NEJMp1112812.

  2. Kohane IS, Drazen JM, Campion EW: A glimpse of the next 100 years in medicine. N Engl J Med. 2012, 367 (26): 2538-2539. 10.1056/NEJMe1213371.

  3. Hoffmann T, Erueti C, Thorning S, Glasziou P: The scatter of research: cross sectional comparison of randomised trials and systematic reviews across specialties. BMJ. 2012, 344: e3223. 10.1136/bmj.e3223.

  4. Durieux P, Trinquart L, Colombet I, Nies J, Walton R, Rajeswaran A, Rege Walther M, Harvey E, Burnand B: Computerized advice on drug dosage to improve prescribing practice. Cochrane Database Syst Rev. 2008, 3: CD002894.

  5. Hunt DL, Haynes RB, Hanna SE, Smith K: Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. JAMA. 1998, 280 (15): 1339-1346. 10.1001/jama.280.15.1339.

  6. Mollon B, Chong J, Holbrook AM, Sung M, Thabane L, Foster G: Features predicting the success of computerized decision support for prescribing: a systematic review of randomized controlled trials. BMC Med Inf Decis Mak. 2009, 9: 11. 10.1186/1472-6947-9-11.

  7. Pearson SA, Moxey A, Robertson J, Hains I, Williamson M, Reeve J, Newby D: Do computerised clinical decision support systems for prescribing change practice? A systematic review of the literature (1990–2007). BMC Health Serv Res. 2009, 9: 154. 10.1186/1472-6963-9-154.

  8. Moxey A, Robertson J, Newby D, Hains I, Williamson M, Pearson SA: Computerized clinical decision support for prescribing: provision does not guarantee uptake. J Am Med Inform Assoc. 2010, 17 (1): 25-33. 10.1197/jamia.M3170.

  9. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, Sam J, Haynes RB: Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005, 293 (10): 1223-1238. 10.1001/jama.293.10.1223.

  10. Kawamoto K, Houlihan CA, Balas EA, Lobach DF: Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005, 330 (7494): 765. 10.1136/bmj.38398.500764.8F.

  11. Norton PG, Baker GR: Patient safety in cancer care: a time for action. J Natl Cancer Inst. 2007, 99 (8): 579-580. 10.1093/jnci/djk161.

  12. Gandhi TK, Bartel SB, Shulman LN, Verrier D, Burdick E, Cleary A, Rothschild JM, Leape LL, Bates DW: Medication safety in the ambulatory chemotherapy setting. Cancer. 2005, 104 (11): 2477-2483. 10.1002/cncr.21442.

  13. Langton JM, Drew AK, Mellish L, Olivier J, Ward RL, Pearson SA: The quality of web-based oncology guidelines and protocols: how do international sites stack up?. Br J Cancer. 2011, 105 (8): 1166-1172. 10.1038/bjc.2011.378.

  14. Langton JM, Pesa N, Rushton S, Ward RL, Pearson SA: Uptake of a web-based oncology protocol system: how do cancer clinicians use eviQ cancer treatments online?. BMC Cancer. 2013, 13 (1): 112. 10.1186/1471-2407-13-112.

  15. Hains IM, Ward RL, Pearson SA: Implementing a web-based oncology protocol system in Australia: evaluation of the first 3 years of operation. Intern Med J. 2012, 42 (1): 57-64. 10.1111/j.1445-5994.2010.02284.x.

  16. Hains IM, Fuller JM, Ward RL, Pearson SA: Standardizing care in medical oncology: are Web-based systems the answer?. Cancer. 2009, 115 (23): 5579-5588. 10.1002/cncr.24600.

  17. Langton JM, Pearson SA: Improving the quality of oncology guidelines and protocols. Asia Pac J Clin Oncol. 2012, 8 (3): 313. 10.1111/j.1743-7563.2012.01570.x.

  18. Langton JM, Pearson SA: eviQ cancer treatments online: how does the web-based protocol system fare in a comprehensive quality assessment?. Asia Pac J Clin Oncol. 2011, 7 (4): 357-363. 10.1111/j.1743-7563.2011.01431.x.

  19. Australian Institute of Health and Welfare: Australian hospital statistics 2010–11. Health Services Series no. 43. Cat. no. HSE 117. 2012, Canberra: AIHW.

  20. Blinman PL, Grimison P, Barton MB, Crossing S, Walpole ET, Wong N, Francis K, Koczwara B: The shortage of medical oncologists: the Australian Medical Oncologist Workforce Study. Med J Aust. 2012, 196 (1): 58-61.

  21. Erikson C, Salsberg E, Forte G, Bruinooge S, Goldstein M: Future supply and demand for oncologists: challenges to assuring access to oncology services. J Oncol Pract. 2007, 3 (2): 79-86. 10.1200/JOP.0723601.

  22. Shanafelt T, Dyrbye L: Oncologist burnout: causes, consequences, and responses. J Clin Oncol. 2012, 30 (11): 1235-1241. 10.1200/JCO.2011.39.7380.

  23. Westbrook JI, Gosling AS, Westbrook MT: Use of point-of-care online clinical evidence by junior and senior doctors in New South Wales public hospitals. Intern Med J. 2005, 35 (7): 399-404. 10.1111/j.1445-5994.2005.00836.x.

  24. Westbrook JI, Coiera EW, Gosling AS: Do online information retrieval systems help experienced clinicians answer clinical questions?. J Am Med Inform Assoc. 2005, 12 (3): 315-321. 10.1197/jamia.M1717.

  25. Westbrook JI, Gosling AS, Coiera EW: The impact of an online evidence system on confidence in decision making in a controlled setting. Med Decis Making. 2005, 25 (2): 178-185. 10.1177/0272989X05275155.

  26. Ousley AL, Swarz JA, Milliken EL, Ellis S: Cancer education and effective dissemination: information access is not enough. J Cancer Educ. 2010, 25 (2): 196-205. 10.1007/s13187-010-0129-3.



We acknowledge Shelley Rushton, Cancer Institute New South Wales, for her assistance accessing and analysing eviQ logfiles and for providing feedback on the survey content and dissemination. We also acknowledge Dr Carole Harris for her contribution to interpreting the analyses and manuscript discussion. This research was funded by a Cancer Institute New South Wales Translational Research Grant (09/THS/2-10). SP is a Cancer Institute New South Wales Career Development Fellow (09/CDF/2-37).

Author information



Corresponding author

Correspondence to Sallie-Anne Pearson.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

JML contributed to design and conception of the study, data collection and analysis, interpretation and drafting of the manuscript. BB contributed to data collection and analysis, and drafting of the manuscript. NP contributed to data collection and analysis, and drafting of the manuscript. JMP contributed to data analysis and drafting of the manuscript. SP contributed to design and conception of the study, interpretation and drafting of the manuscript. All authors read and approved the final manuscript.


Cite this article

Langton, J.M., Blanch, B., Pesa, N. et al. How do medical doctors use a web-based oncology protocol system? A comparison of Australian doctors at different levels of medical training using logfile analysis and an online survey. BMC Med Inform Decis Mak 13, 82 (2013).


  • Clinical decision support systems
  • Evidence-based practice
  • Medical education
  • Cancer chemotherapy protocols
  • Health personnel
  • Medical staff, hospital