How Reliable Is Dr. Google? A Systematic Review Evaluating Information Available On The Internet For Common Orthopedic Sports Medicine Terms

Ilona Schwarz, MS; Darby Adele Houck, BA; John W. Belk, BA; Jack C. Hop, BA; Jonathan T. Bravman, MD (United States)

University of Colorado School of Medicine, Aurora, CO, UNITED STATES


2021 Congress ePoster Presentation

Summary: Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.


Purpose

To conduct a systematic review of studies evaluating the quality and content of internet-based information available for common orthopedic sports medicine diagnoses.

Methods

A search of the PubMed, Embase, and Cochrane databases was performed following PRISMA guidelines. All English-language literature published from 2010 to 2020 discussing information quality pertaining to orthopedic sports medicine terms was included. Outcomes included the search engines used, the number and type of websites evaluated, the platform assessed, and the quality scoring metrics applied. Descriptive statistics are presented.
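For illustration only, since the exact search strategy is not reported in this abstract, reviews of this kind typically combine condition, medium, and quality terms into a boolean database query along the lines of:

    ("orthopedic" OR "orthopaedic" OR "sports medicine") AND ("internet" OR "online" OR "website" OR "YouTube") AND ("information quality" OR "readability" OR "patient education")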

Results

This review includes 21 studies. Of these, three evaluated both the upper and lower extremity, while twelve focused on either the upper or the lower extremity, most commonly rotator cuff tears (3/12) and/or ACL pathologies (7/12). The most commonly used search engines were Google (18/21), Bing (16/21), and Yahoo (16/21), followed by YouTube (3/21), Ask (3/21), and AOL (2/21). The average number of media files assessed per study was 87±55. Website quality was assessed with DISCERN (7/21), Flesch-Kincaid (9/21), Health On The Net (HON) (7/21), and/or JAMA Benchmark (7/21) scores. YouTube content was evaluated with the JAMA Benchmark score (mean 1.74±1.00). Image quality was reported in two studies and varied with search terminology.
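For reference, the Flesch-Kincaid grade level used in 9 of the 21 included studies is computed from average sentence length and average syllables per word with the standard formula

    FKGL = 0.39 (total words / total sentences) + 11.8 (total syllables / total words) - 15.59

where higher values correspond to a higher US school grade reading level.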

Conclusion

The results of this systematic review suggest that physicians should improve the quality of online information and encourage patients to access credible sources when doing their own research. Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.

