885 Biomedical Literature Searching: PubMed MeSH vs. Clinical Queries

Friday, March 23, 2012: 2 p.m. - 3:15 p.m.
Presentation Type: Poster Session
C.C. KARTALTEPE, J.D. RUGH, J.P. HATCH, and E.J. KARTALTEPE, University of Texas - San Antonio / Health Science Center, San Antonio, TX

Objectives: One problem facing dental professionals is staying current with research. PubMed is the National Library of Medicine's online interface to Medline. Searching with Medical Subject Headings (MeSH) has been the gold-standard search strategy on PubMed; Clinical Queries is an alternative way of searching PubMed. The goal of this pilot study was to compare the abilities of PubMed's MeSH search and Clinical Queries search to locate evidence to answer focused clinical questions.
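
To illustrate the difference between the two strategies, the sketch below builds PubMed query URLs through the NCBI E-utilities `esearch` endpoint: a MeSH search tags the term with [MeSH Terms], while Clinical Queries applies a prebuilt methodology filter (e.g., Therapy/Narrow[filter]) on top of the term. The clinical topic shown is hypothetical; the abstract does not list the 12 study questions, and the study itself used the PubMed web interface rather than programmatic queries.

```python
from urllib.parse import urlencode

# NCBI E-utilities search endpoint for PubMed.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def mesh_query(term):
    """Build a URL for a MeSH-tagged PubMed search.

    The [MeSH Terms] tag restricts matching to the controlled
    Medical Subject Headings vocabulary."""
    return EUTILS + "?" + urlencode({
        "db": "pubmed",
        "term": f'"{term}"[MeSH Terms]',
    })

def clinical_query(term, category="therapy", scope="narrow"):
    """Build a URL approximating a Clinical Queries search:
    the free-text term combined with one of PubMed's prebuilt
    clinical study category filters (e.g., therapy/narrow[filter])."""
    return EUTILS + "?" + urlencode({
        "db": "pubmed",
        "term": f"{term} AND {category}/{scope}[filter]",
    })

# Hypothetical example topic (not from the study's question set):
print(mesh_query("dental caries"))
print(clinical_query("dental caries"))
```

The filter-based form is what makes Clinical Queries fast for clinicians: the methodology constraints are maintained by PubMed, so the searcher supplies only the topic term.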

Methods: Thirteen evaluators experienced with both search strategies performed searches to answer 12 clinical questions. These included 4 for which high-level evidence existed (e.g., meta-analysis or systematic review), 4 with mid-level evidence (e.g., randomized controlled trial), and 4 with low-level evidence (e.g., case series, expert opinion). Each evaluator searched for half of the clinical questions using MeSH search and half using Clinical Queries. The goal was to identify the predetermined publication that provided the strongest and most current evidence. The trials were counterbalanced for evaluators, level of evidence, and search strategy (MeSH or Clinical Queries). The test computer recorded search sequences, web pages visited, and times.

Results: There was no significant difference between overall success rates with MeSH and Clinical Queries (31% vs. 29%). The mean time for the MeSH searches was longer (392 vs. 182 seconds, p<.001). Success rates for the two search methods were similar at all levels of evidence. For low-level evidence, neither search strategy was very successful at locating the one article that had been identified as providing the best evidence.

Conclusion: Clinical Queries appears to be a time-effective alternative to MeSH searching for answering clinical questions at any level of evidence in the literature. The research methods piloted in this study may be useful for testing other biomedical search engines.

This abstract is based on research that was funded entirely or partially by an outside source: NIH/1R25DE018663

Keywords: Decision-making, Education research, Learning, Teaching and Technology