Greenberg SB, Long MJ, Klein SG.
University of Arkansas for Medical Sciences, Arkansas Children's Hospital, 800 Marshall St, Little Rock, AR 72202, USA.
Acad Radiol. 2003 Nov;10(11):1321-3
RATIONALE AND OBJECTIVES: The Residency Review Committee (RRC) for diagnostic radiology of the Accreditation Council for Graduate Medical Education mandates core competencies, including computer-aided applications in medicine. The purpose of this review was to evaluate the use of RadioGraphics' on-line CME to satisfy the RRC requirements.

MATERIALS AND METHODS: Twenty radiology residents at a university training program read the same four articles in the on-line version of RadioGraphics. Before reading each article, the residents took the associated CME pre-test; after completing the article, they took the CME post-test. After finishing all four articles and tests, each resident completed a survey evaluating the quality of the experience with the RadioGraphics on-line CME program.

RESULTS: The combined mean pre-test and post-test scores across all four articles were 5.6 and 9.3, respectively. The improvement in test scores was significant by a Student t test (P < .001). Fourteen residents agreed and one disagreed with the statement that the modules were time effective. Nineteen of 20 residents agreed that they had gained valuable information for future practice and that they would continue to use RadioGraphics for CME in the future. All of the residents agreed that the experience satisfied the residency's requirement to teach computer skills appropriate for ongoing learning.

CONCLUSION: RadioGraphics' on-line CME is an effective method to teach residents skills required by the RRC.
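For readers curious about the statistical comparison described in the results, a minimal sketch of a paired pre/post t test is shown below. The per-resident scores are hypothetical placeholders (the abstract reports only the combined means of 5.6 and 9.3), and the use of scipy.stats.ttest_rel is an assumption about how such an analysis could be run, not the authors' actual method.

    # Sketch of a paired pre/post comparison like the one described above.
    # Score vectors are made up for illustration; only group means are
    # reported in the abstract.
    from scipy import stats

    pre_scores  = [5, 6, 4, 7, 6, 5, 6, 7, 5, 6, 5, 6, 4, 6, 7, 5, 6, 5, 6, 5]    # hypothetical
    post_scores = [9, 10, 8, 10, 9, 9, 10, 9, 8, 10, 9, 9, 8, 10, 9, 9, 10, 9, 9, 9]  # hypothetical

    # Related-samples t test: each resident serves as his or her own control.
    t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
    print(f"t = {t_stat:.2f}, P = {p_value:.4g}")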
Posted via PubMed for educational and discussion purposes only.