An Interactive Internet-Based Continuing Education Course on Sexually Transmitted Diseases for Physicians and Midwives in Peru

  • Fredy A. Canchihuaman,

    freancri@u.washington.edu

    Affiliations School of Public Health and Administration, Universidad Peruana Cayetano Heredia, Lima, Peru, Departments of Global Health and Epidemiology, University of Washington, Seattle, Washington, United States of America

  • Patricia J. Garcia,

    Affiliations School of Public Health and Administration, Universidad Peruana Cayetano Heredia, Lima, Peru, Center for AIDS and STD, University of Washington, Seattle, Washington, United States of America

  • Stephen S. Gloyd,

    Affiliations Departments of Global Health and Epidemiology, University of Washington, Seattle, Washington, United States of America, Center for AIDS and STD, University of Washington, Seattle, Washington, United States of America

  • King K. Holmes

    Affiliations Departments of Global Health and Epidemiology, University of Washington, Seattle, Washington, United States of America, Center for AIDS and STD, University of Washington, Seattle, Washington, United States of America, Department of Medicine, University of Washington, Seattle, Washington, United States of America

Abstract

Background

Clinicians in developing countries have had limited access to continuing education (CE) outside major cities, and CE strategies have had limited impact on sustainable change in performance. New educational tools could improve CE accessibility and effectiveness.

Methodology/Principal Findings

The objective of this study was to evaluate an interactive Internet-based CE course on management of Sexually Transmitted Diseases (STDs) for clinicians in Peru. Participants included physicians and midwives in private practice drawn from a census of 10 Peruvian cities. The CE included a three-hour workshop for improving Internet skills, followed by a 22-hour online course on STD syndrome management, with subsequent educational support. The course used case-based clinical vignettes tailored to local STD problems. Knowledge and reported practices on STD management were assessed before, immediately after, and four months after completion of the course. Statistical analysis included parametric tests (multivariate linear regression, paired t-tests, and repeated-measures ANOVA) using SPSS 14.0. Of 1,071 eligible clinicians, 510 agreed to participate, as did an additional 132 public sector clinicians. Of these 642 participants, 619 (96.4%) completed the course, and 596 (96.3%) took the four-month follow-up evaluation. Physician and midwife scores improved from 64.2% correct answers on the pre-test to 77.9% correct on the four-month follow-up test (p<0.001). Most participants (95%) found the online course useful for their work needs. Self-reported STD management practices did not change.

Conclusions/Significance

Among physicians and midwives in Peru, an Internet-based CE course was feasible and acceptable, with high participation rates, and led to sustained improvement in knowledge at four months. Further studies are needed to test it as a model for improving the training of physicians, midwives, and other health care providers.

Introduction

Continuing education (CE) for health care workers is required by professional credentialing, governmental, and licensing agencies and is available in many developed countries, but in developing countries access to such programs is limited, especially outside major urban settings. In addition, traditional didactic CE programs for health professionals have shown modest impact on sustained improvement in knowledge, health provider practices, or patient outcomes[1], [2].

An alternative to traditional CE is Internet-based CE (I-CE). A number of advantages of I-CE have been proposed, including the use of complex information, real-time interactive links, images, audio, and video; flexibility in location and time; potential for reinforcement through continuous availability; adaptability to adult learning approaches; potentially low cost; and accessibility to providers outside major urban centers[3], [4], [5], [6].

In recent years, the effectiveness of I-CE has been improved by designing courses based upon educational theory and by including new educational tools such as case scenarios or clinical vignettes[4], [7], [8], [9]. For example, courses based upon situated learning theory (learning in the context of the interaction between the participants and their environment) and involving cognitive processes – decision-making, reasoning, and problem-solving – can help develop skills in medical practice[10], [11].

In Peru, as in other developing countries, the telecommunications infrastructure has improved rapidly[12] and Internet access is widely available. However, the use of information and communication technologies in health remains limited[13], [14], [15]. Evidence for the feasibility, acceptability, and effectiveness of I-CE for training health care providers is largely lacking in these countries; therefore, development and evaluation of such programs is warranted[13].

We designed and implemented an interactive, I-CE course on syndromic management of sexually transmitted diseases (STDs) using cognitive-educational theories. This study evaluates feasibility, acceptability, and impact of this course on the knowledge and reported STD management practices of participants.

Methods

Study design and population

The study was designed as a pre-post evaluation of the I-CE course, with repeated measures to compare knowledge and self-reported practices at baseline (before the course), immediately after, and four months after completion of the course.

The I-CE course was developed as a training component for the Urban Community-Randomized Trial of STD Prevention in Peru (The PREVEN study)[16]. The training component and the evaluation were implemented between August 2005 and March 2006 in the 10 intervention cities included in PREVEN, representing the coastal, jungle, and Andean regions of Peru (Figure 1).

Figure 1. Map of Peru with the location of the 10 intervention cities (each city has more than 50,000 inhabitants).

https://doi.org/10.1371/journal.pone.0019318.g001

Based on a census of physicians and midwives in private practice conducted in 2003 and updated yearly[17], [18], we invited all of them to participate in the training program. The invitation described the Internet-based nature of the course and the initial training in the use of Internet tools. Advertisements were also posted in health centers.

Course design, content and certification

The course was designed as a user-friendly, modular platform using an open-source programming language, Hypertext Preprocessor (PHP). The program included not only the educational content but also an administrator module, a database to store participants' data (demographics, scores, etc.), and a report generator.
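
The platform itself was written in PHP with its own database; purely as a hypothetical illustration of the kind of participant record and administrator report it managed, the sketch below uses Python with SQLite, and the table and field names are assumptions rather than the study's actual schema.

```python
# Hypothetical illustration only: the study's platform was written in PHP with its own
# database; this schema and these field names are assumed for illustration.
import sqlite3

conn = sqlite3.connect("course.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS participants (
        id INTEGER PRIMARY KEY,
        profession TEXT,        -- e.g. physician or midwife
        city TEXT,              -- one of the 10 intervention cities
        pre_score REAL,         -- percent correct on the pre-test
        post_score REAL,        -- percent correct on the post-test
        followup_score REAL     -- percent correct on the four-month follow-up test
    )
""")

# Administrator-style report: mean scores by profession at each time point.
for row in conn.execute("""
        SELECT profession,
               ROUND(AVG(pre_score), 1),
               ROUND(AVG(post_score), 1),
               ROUND(AVG(followup_score), 1)
        FROM participants
        GROUP BY profession"""):
    print(row)
```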

The course was developed by a team from the Unit of Epidemiology and STD/HIV, Universidad Peruana Cayetano Heredia (UPCH), which wrote the national guidelines for STD management for the Ministry of Health of Peru and had extensive experience in STD management training. The course was based on World Health Organization guidelines for syndromic management of STDs[19] and on the Peruvian National Guidelines for STD management. Content addressed four STD syndromes (vaginal discharge, urethral discharge, pelvic inflammatory disease, and genital ulcer disease). Additional components of the course included learning materials and links to STD resources on the Internet, materials for patients, opportunities to “ask the expert”, and responses to frequently asked questions (FAQ). Further feedback and educational support were provided through post-course consultations via e-mail and summarized in the FAQ section. Course completion averaged 22 hours over a three-week period.

Several case scenarios (clinical vignettes) were used to introduce each of the four STD syndromes and to illustrate diagnosis, treatment, partner treatment, promotion of condom use, and patient follow-up. Each case scenario included an image and a brief history from the clinical vignette. Sequential questions, addressing common mistakes in STD management[20], were posed after each vignette (Appendix S1). Feedback was provided for each answer, whether answered correctly or incorrectly. Additional windows offered in-depth discussions.

The course also presented an overview of history taking, physical examination, and counseling regarding risk behavior (Appendix S2).

The course was evaluated by the Peruvian College of Medicine and was given 1.5 continuing medical education (CME) credits, which could be used towards the renewal of medical licenses (10 CME credits every five years are required for renewal). UPCH also accredited 1.5 credits for the course, which was helpful for midwives. To obtain the credits, participants had to complete both pre- and post-tests and achieve a post-test score of more than 60 out of 100 points.

Course piloting and implementation

The course was piloted in Lima with a group of 15 physicians and midwives. Their feedback guided improvements in the design of the course and the clarity of the questions, and allowed assessment of the time needed to complete the course.

In each of the 10 cities, we next invited all interested clinicians to attend a three-hour workshop in Internet cafes rented for the purpose. The first hour provided training in use of the Internet and in completing the pre-test. During the remaining two hours, participants began reviewing the course Website. Each module could be completed during one or more sessions and in different places – participants were allowed to skip forward and return to the modules. The course was free and available for CME credits over a period of eight weeks. For each participant, a post-test was scheduled three weeks after beginning the course, and a follow-up test was scheduled four months post-training.

Course evaluation

The training was evaluated by comparing pre-test knowledge with post-test and follow-up knowledge. The pre-test, post-test, and follow-up test were identical except that the question order was altered. Each test comprised 22 questions and vignettes, each with one correct answer. Five additional questions about self-reported practices were included in the pre-test and follow-up test. The areas queried were frequency of use of algorithms for syndromic management of patients with STDs, use of “the four Cs” (counseling, condom promotion, compliance with treatment, and contact tracing/partner treatment) during patients' consultations, giving information to patients about their STDs, giving referral cards for patients' partners, and giving treatment for patients' sexual contacts. These questions were validated by experts and by 15 physicians and midwives for clarity of language and relevance of content.
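
As a minimal, hypothetical sketch of how each test could be scored (the paper does not show the platform's own scoring code), the example below computes the percentage of the 22 single-best-answer questions answered correctly and checks the CME-credit threshold described under course certification (a post-test score above 60 of 100 points); the answer key and function names are illustrative only.

```python
# Hypothetical scoring sketch; the answer key is a placeholder, not the study's key.
ANSWER_KEY = {f"q{i}": "a" for i in range(1, 23)}  # 22 questions, one correct answer each

def percent_correct(responses):
    """Score a test as the percentage of the 22 questions answered correctly."""
    correct = sum(responses.get(q) == answer for q, answer in ANSWER_KEY.items())
    return 100.0 * correct / len(ANSWER_KEY)

def earns_cme_credit(completed_pre_test, post_score):
    """Credit required completing both tests and a post-test score above 60 of 100 points."""
    return completed_pre_test and post_score > 60.0
```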

Both the pre-test and the post-test were taken on the Internet. For the follow-up test, participants were visited by a project worker and given a paper-based test. After the post-test, a voluntary online survey assessed user satisfaction, acceptability, relevance of the course, and software/program performance.

Data entry and analysis

The scores and answers from pre- and post-tests were maintained in a database (MySQL). We compared mean percentages of correct answers on the pre-, post-, and follow-up tests by paired t-test and repeated-measures ANOVA, both for clinicians who completed training and for a subset who did not receive training. Differences between subgroups in the change of scores were assessed by t-test for independent samples. Analyses of factors associated with gain in knowledge included parametric tests (multivariate linear regression, t-tests, and repeated-measures ANOVA). Data analysis used SPSS 14.0 (SPSS Inc, Chicago, IL).
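
The analyses themselves were run in SPSS 14.0; as an illustration only, the following Python sketch reproduces the same classes of tests (paired t-test, repeated-measures ANOVA, and multivariate linear regression on the gain in knowledge) from a per-participant table of scores, with hypothetical file and column names.

```python
# Illustrative only: the study used SPSS 14.0; the file and column names here are hypothetical.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM
import statsmodels.formula.api as smf

# One row per participant: percent-correct scores at each time point plus covariates.
df = pd.read_csv("scores.csv")  # columns: id, pre, post, followup, female, midwife, workshop, preven_member

# Paired t-test: pre-test versus four-month follow-up test.
res = stats.ttest_rel(df["followup"], df["pre"])
print(f"pre vs follow-up: t = {res.statistic:.2f}, p = {res.pvalue:.4f}")

# Repeated-measures ANOVA across the three time points, restricted to participants with all three tests.
complete = df.dropna(subset=["pre", "post", "followup"])
long = complete.melt(id_vars="id", value_vars=["pre", "post", "followup"],
                     var_name="time", value_name="score")
print(AnovaRM(long, depvar="score", subject="id", within=["time"]).fit())

# Multivariate linear regression: factors associated with gain in knowledge.
df["gain"] = df["followup"] - df["pre"]
print(smf.ols("gain ~ female + midwife + workshop + preven_member", data=df).fit().summary())
```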

The Institutional Review Boards of UPCH and the University of Washington approved the project, which qualified for exemption from consent. The study evaluated overall effectiveness of the course, posed no risk to the participants, and data were analyzed anonymously. Participants received no financial incentives.

Results

Characteristics of the study population

Of 1,071 eligible private practice physicians and midwives identified in the 10 cities, 510 (47.6%) agreed to participate and took the pre-test. Additionally, 132 physicians and midwives not in private practice participated.

Table 1 summarizes characteristics of course participants and non-participants. Physician participants averaged 41.5 years of age and 13 years of practice, 72.1% were male, and 47.5% were general practitioners; 98.2% reported seeing STD cases, and 61.2% had previously attended didactic workshops on STD management and were members of the intervention trial network (the PREVEN network). Midwives averaged 38 years of age and 11 years of practice; 84.7% were female; 93.1% reported seeing STD cases and 58.6% were PREVEN network members.

Table 1. Characteristics of physician and midwife participants and non-participants in the STD I-CE course.*

https://doi.org/10.1371/journal.pone.0019318.t001

Physician non-participants had an older mean age than participants (45.5±12.0 versus 41.5±10.7 years) and a longer mean duration of practice (16.6±11.4 versus 13.0±10.1 years). Midwives who did not participate were more likely than participants to be male (20.5% versus 15.3%).

Of the 642 course participants, 619 (96.4%) completed the course (course-users). Among these 619, 412 (66.6%) took the post-test; 394 (95.6%) passed the test, receiving CE credits. Among the 642 course participants, 596 (92.8%) took the four-month follow-up test. Of the 46 persons who did not take the follow-up test, 4 (8.7%) were on vacation, 19 (41.3%) had moved to another town, and 23 (50.0%) were otherwise unavailable (Figure 2).

Figure 2. Participation flow of the Internet-based CE intervention.

https://doi.org/10.1371/journal.pone.0019318.g002

Knowledge of STDs

Physicians' mean total scores improved from 65.9% correct answers on the pre-test to 78.0% correct on the four-month follow-up test (p<0.001). Midwives improved from 62.8% to 77.8% (p<0.001) (Table 2). Further unpaired analysis comparing pre-test and follow-up test did not change these associations.

Table 2. Participants' knowledge scores and self-reported practices of syndromic management of STDs.

https://doi.org/10.1371/journal.pone.0019318.t002

For the 393 physicians and midwives who took all three tests (pre, post, and follow-up tests), mean post-course scores (81.4) and follow-up scores (78.8) both surpassed mean pre-test scores (65.0) (p<0.001 for both comparisons). The decline in follow-up test scores compared to post-test scores was modest but significant (p<0.001).

Among 23 non-course users, 18 (15 physicians and 3 midwives) took both the pre- and follow-up tests. Their average scores improved only slightly, from 72.0 on the pre-test to 76.0 on the follow-up test (p = 0.29).

In bivariate analysis (Table 3), factors associated with greater improvements in knowledge scores included female gender, being a midwife, attending the initial training workshop, participating in the course, and not already being a PREVEN Network member (p<0.05 for all comparisons).

Table 3. Bivariate and multivariate analysis of factors associated with improvements in knowledge of syndromic management of STDs.

https://doi.org/10.1371/journal.pone.0019318.t003

In the multivariate analysis (Table 3), the most important factors positively associated with improvement in knowledge scores were attendance at the initial training workshop, not being a PREVEN member, and being female (p<0.05).

Self-reported STD syndromic management practices

Among all participants, there were no statistically significant differences in self-reported STD management practices between the pre-test and the follow-up test, except for giving referral cards for partners or sexual contacts, which for midwives fell from 68.5% at the pre-test to 46.2% at the follow-up test (Table 2).

Participants' satisfaction

Table 4 summarizes course satisfaction and acceptability among the 412 who completed the post-test; 388 (94.2%) rated the course “very useful” for their job needs and/or their professional effectiveness.

Table 4. Participants' satisfaction with the STD Internet-based CE course.

https://doi.org/10.1371/journal.pone.0019318.t004

Physicians were also asked to assess the following specific elements of the course: clarity, navigability, utility of images, and interactivity. Mean scores for each exceeded 4.5 on a Likert scale (1 = strongly disagree to 5 = strongly agree) for the statements that the course was clear and easy to understand; easy to use; that images, photos, and graphs enhanced understanding; and that the Web site was interactive.

When asked whether a course in text format (i.e., using written materials) would be preferred over an online course, 305 (74%) disagreed or totally disagreed. Physicians reported willingness to participate in a similar Internet-based course in the future and said they would recommend participation to colleagues.

Written comments about the course were consistently favorable, and many participants requested additional courses and ongoing updates, including more images, with CD and written materials as format options. Additional suggestions included broadening access for professionals in rural areas and for students.

Discussion

Of the online CE courses identified by systematic reviews[21], [22], few in Latin America and Africa have provided comprehensive I-CE, and to our knowledge, few were developed as case-based training within a developing country[23], [24].

We found that an Internet-based CE course implemented in 10 cities in Peru was feasible, was well accepted by physicians and midwives, and produced improvement in knowledge that was sustained at four-month follow-up.

Using local technology, minimal resources (post-doctoral trainee salary for FC for 12 months, plus approximately $5,000 for course production, including platform, content, and field work), and evidence-based strategies, we provided the course to nearly half of the physicians and midwives working in the private sector in 10 of the larger cities throughout Peru. The evaluation provided evidence of impact on knowledge for those who took the course, but not for those who enrolled without actually taking it; however, there was no impact on self-reported practice, even among participants.

Rates of recruitment, participation, and follow-up testing, and the diversity of participation in our study, were substantially higher than what is generally considered achievable for this type of CE[25]. Recruitment and participation of physicians has reportedly been challenging for educational interventions; an online course on chlamydia screening reported recruitment of 33% of physicians and participation of about 52% of those recruited[25]. Rates of participation in our study likely reflect a high level of interest in this topic and in Internet-based CE per se, a lack of alternative sources of CE, and the feasibility of Internet-based training throughout the country.

Acceptability of the course was reflected not only by high participation rates but also by satisfaction with the course as expressed in the post-test survey, with 96% of participants rating the course as very useful and relevant to their clinical practice. Although response bias could explain high satisfaction rates, similar web-based studies have found rates as low as 47% for physicians' perceptions of course relevance to their clinical practice[26].

Improvements in knowledge were greatest among females, those who participated in the pre-course workshops, those who actually took the course, those with lower scores at baseline, and those not already PREVEN Network members. The beneficial effect of the pre-course workshop could reflect either an effect of in-person participation in the workshop on subsequent Internet-based education, or selection bias (motivated participants could have taken the course and then performed better). The limited improvement in knowledge of PREVEN Network members could indicate a ceiling effect of this particular training program; information provided by this Internet course overlapped with that provided to PREVEN Network members a year earlier. It is conceivable that some of the improvement in scores could be explained by a direct effect of taking the pre-course test itself, or could represent regression to the mean[27]. The latter is a common criticism of evaluations of educational interventions without control groups[28]. However, non-course users did not significantly improve their test scores, consistent with an effect of the course.

The significant and sustained improvement in knowledge scores from pre- to post- to follow-up test could be attributable to elements of our course identified in randomized trials of other well-designed educational web-based programs[10], [29], [30], [31], [32], [33]. The course design used a learning theory approach, was based on participants' needs assessments, had an interactive format, used case-based learning with performance feedback, was tailored to local STD problems, and included reinforcement components (e.g., e-mail consultation and learning materials).

Although we attempted to avoid features commonly found and criticized in traditional I-CE[3], [34], [35], additional elements also shown to be effective were not used in this intervention. These elements include illustrative dynamic schemes and flow charts; slides synchronized with audio; didactic presentations using video; live Web conferences; online risk assessment calculators; e-mail reminders; asynchronous discussions with peers and facilitators; chat rooms; telephone contacts; game-like simulations; and other novel elements including computerized virtual patients for mobile phones. Reviews and meta-analyses of studies assessing the impact of Internet-based courses have concluded that they are comparable to or better than face-to-face courses[22], [29], and that a combination of course formats (online and in class) is more effective than face-to-face courses alone[36].

In contrast to the overall improvement of test scores with our course, no significant improvement was noted in self-reported practices. Moreover, a significant decrease in the provision of referral cards for partners' treatment was reported by midwives but not by physicians. The lack of reported improvement in practices could be explained by the high rates of correct practices reported at baseline, possibly related to social desirability response bias. Other explanations could be self-selection of participants into the course (e.g., about 60% were already PREVEN Network members), lack of validity of the measure for assessing practice, or no actual translation of knowledge into practice. Translation of knowledge into practice is a complex process that is not always achieved[37]. Although partner referral cards were available on the course web site, the low rate of using referral cards for partner treatment suggests the need for a better enabling system to complement this training, and deserves further research on how to promote partner notification.

This was not a randomized controlled trial, and the results are therefore subject to bias; several factors other than the course were plausibly associated with the improvement in knowledge scores and with the feasibility and acceptability of the implementation. Beyond the lack of a control group, a limitation of our study is that we did not measure objective changes in practices or in patients' outcomes. We measured changes in self-reported practices, which are not a highly reliable measurement; self-reported measures are susceptible to biases (the tendency to answer positively, to give more socially acceptable answers, etc.). Another limitation is the validity and reliability of knowledge assessment by questionnaire, which can be questioned because of the absence of standardization and the difficulty of comparison across studies[38], [39], [40]. Novel methods, such as computerized clinical vignettes, to assess knowledge and clinical performance in a variety of medical areas are increasingly being used and studied[29], [40]. Extrapolation of the study results to settings other than those represented by the study participants should be done with caution. However, inclusion of samples from varied settings (coastal, jungle, and Andean regions) does allow wider generalization. Further studies are needed to explore the utility of I-CE courses in other resource-restricted settings for broader applicability.

Conclusions

With rapidly growing access to the Internet in much of the developing world, this study provides evidence that an I-CE course is feasible, acceptable, and attracts a high level of interest. I-CE has great potential to improve the training of health care workers and to reduce information gaps in developing country settings. Future research should include randomized controlled trials comparing different types of I-CE or instructional elements, cost-effectiveness studies, and development and evaluation of instruments to measure clinicians' performance. In the long term, comprehensive online training centers could coordinate Internet-based curricula among universities, affiliated institutions, and the public sector in developing countries.

Supporting Information

Appendix S1.

Example of case-based clinical vignette and sequential questions.

https://doi.org/10.1371/journal.pone.0019318.s001

(DOC)

Appendix S2.

Internet-based CE Course Images: Illustration of computer screens showing Internet-based CE modules used in the Intervention.

https://doi.org/10.1371/journal.pone.0019318.s002

(PDF)

Author Contributions

Conceived and designed the experiments: FAC PJG KKH. Performed the experiments: FAC. Analyzed the data: FAC PJG KKH SSG. Contributed reagents/materials/analysis tools: FAC. Wrote the paper: FAC PJG KKH SSG.

References

  1. Davis DA, Thomson MA, Oxman AD, Haynes RB (1992) Evidence for the effectiveness of CME. A review of 50 randomized controlled trials. JAMA 268: 1111–1117.
  2. Forsetlund L, Bjorndal A, Rashidian A, Jamtvedt G, O'Brien MA, et al. (2009) Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database Syst Rev: CD003030.
  3. Cook DA (2007) Web-based learning: pros, cons and controversies. Clin Med 7: 37–42.
  4. Cook DA, Gelula MH, Lee MC, Bauer BA, Dupras DM, et al. (2007) A web-based course on complementary medicine for medical students and residents improves knowledge and changes attitudes. Teach Learn Med 19: 230–238.
  5. Harden RM (2005) A new vision for distance learning and continuing medical education. J Contin Educ Health Prof 25: 43–51.
  6. McKimm J, Jollie C, Cantillon P (2003) ABC of learning and teaching: Web based learning. BMJ 326: 870–873.
  7. Casebeer LL, Strasser SM, Spettell CM, Wall TC, Weissman N, et al. (2003) Designing tailored Web-based instruction to improve practicing physicians' preventive practices. J Med Internet Res 5: e20.
  8. Voelker R (2003) Virtual patients help medical students link basic science with clinical care. JAMA 290: 1700–1701.
  9. Zary N, Johnson G, Boberg J, Fors UG (2006) Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment–Web-SP. BMC Med Educ 6: 10.
  10. Allison JJ, Kiefe CI, Wall T, Casebeer L, Ray MN, et al. (2005) Multicomponent Internet continuing medical education to promote chlamydia screening. Am J Prev Med 28: 285–290.
  11. Raupach T, Muenscher C, Anders S, Steinbach R, Pukrop T, et al. (2009) Web-based collaborative training of clinical reasoning: a randomized trial. Med Teach 31: e431–437.
  12. Rodrigues RJ, Risk A (2003) eHealth in Latin America and the Caribbean: development and policy issues. J Med Internet Res 5: e4.
  13. Chandrasekhar CP, Ghosh J (2001) Information and communication technologies and health in low income countries: the potential and the constraints. Bull World Health Organ 79: 850–855.
  14. Godlee F, Pakenham-Walsh N, Ncayiyana D, Cohen B, Packer A (2004) Can we achieve health information for all by 2015? Lancet 364: 295–300.
  15. Tomasi E, Facchini LA, Maia Mde F (2004) Health information technology in primary health care in developing countries: a literature review. Bull World Health Organ 82: 867–874.
  16. Proyecto PREVEN: prevención comunitaria de enfermedades de transmisión sexual [Urban community randomized trial of STD/HIV prevention] [homepage on the Internet] Available: http://www.proyectopreven.org. Accessed 2010 Dec 01.
  17. Hsieh EJ, Blas MM, La Rosa Roca S, Garcia PJ (2006) Sexually transmitted infections and private physicians in Peru, 2003. Rev Panam Salud Publica 20: 223–229.
  18. Hsieh EJ, Garcia PJ, Roca SL (2008) Male midwives: preferred managers of sexually transmitted infections in men in developing countries? Rev Panam Salud Publica 24: 271–275.
  19. World Health Organization (2003) Guidelines for the Management of Sexually Transmitted Infections. Geneva: World Health Organization. Available: http://www.who.int/reproductivehealth/publications/rtis/9241546263/en/index.html. Accessed 2010 Dec 01.
  20. Garcia PJ, Holmes KK (2003) STD trends and patterns of treatment for STD by physicians in private practice in Peru. Sex Transm Infect 79: 403–407.
  21. Cook DA, Garside S, Levinson AJ, Dupras DM, Montori VM (2010) What do we mean by web-based learning? A systematic review of the variability of interventions. Med Educ 44: 765–774.
  22. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, et al. (2008) Internet-based learning in the health professions: a meta-analysis. JAMA 300: 1181–1196.
  23. Tian L, Tang S, Cao W, Zhang K, Li V, et al. (2007) Evaluation of a web-based intervention for improving HIV/AIDS knowledge in rural Yunnan, China. AIDS 21 Suppl 8: S137–142.
  24. Zbar RI, Otake LR, Miller MJ, Persing JA, Dingman DL (2001) Web-based medicine as a means to establish centers of surgical excellence in the developing world. Plast Reconstr Surg 108: 460–465.
  25. Wall TC, Mian MA, Ray MN, Casebeer L, Collins BC, et al. (2005) Improving physician performance through Internet-based interventions: who will participate? J Med Internet Res 7: e48.
  26. Chung S, Mandl KD, Shannon M, Fleisher GR (2004) Efficacy of an educational Web site for educating physicians about bioterrorism. Acad Emerg Med 11: 143–148.
  27. Barnett AG, van der Pols JC, Dobson AJ (2005) Regression to the mean: what it is and how to deal with it. Int J Epidemiol 34: 215–220.
  28. Harris AD, McGregor JC, Perencevich EN, Furuno JP, Zhu J, et al. (2006) The use and interpretation of quasi-experimental studies in medical informatics. J Am Med Inform Assoc 13: 16–23.
  29. Casebeer L, Brown J, Roepke N, Grimes C, Henson B, et al. (2010) Evidence-based choices of physicians: a comparative analysis of physicians participating in Internet CME and non-participants. BMC Med Educ 10: 42.
  30. Curran VR, Fleet LJ, Kirby F (2010) A comparative evaluation of the effect of Internet-based CME delivery format on satisfaction, knowledge and confidence. BMC Med Educ 10: 10.
  31. Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, et al. (2005) Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA 294: 1043–1051.
  32. Kerfoot BP, Kearney MC, Connelly D, Ritchey ML (2009) Interactive spaced education to assess and improve knowledge of clinical practice guidelines: a randomized controlled trial. Ann Surg 249: 744–749.
  33. Stewart M, Marshall JN, Ostbye T, Feightner JW, Brown JB, et al. (2005) Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med 37: 131–138.
  34. Alur P, Fatima K, Joseph R (2002) Medical teaching websites: do they reflect the learning paradigm? Med Teach 24: 422–424.
  35. Zimitat C (2001) Designing effective on-line continuing medical education. Med Teach 23: 117–122.
  36. U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Policy and Program Studies Service (2009) Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Available: http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf. Accessed 2010 Aug 01.
  37. Green LA, Seifert CM (2005) Translation of research into practice: why we can't “just do it”. J Am Board Fam Pract 18: 541–545.
  38. Casebeer L, Kristofco RE, Strasser S, Reilly M, Krishnamoorthy P, et al. (2004) Standardizing evaluation of on-line continuing medical education: physician knowledge, attitudes, and reflection on practice. J Contin Educ Health Prof 24: 68–75.
  39. Curran VR, Fleet L (2005) A review of evaluation outcomes of web-based continuing medical education. Med Educ 39: 561–567.
  40. Peabody JW, Luck J, Glassman P, Jain S, Hansen J, et al. (2004) Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med 141: 771–780.