This question came up because of this:
Varela-Lema L, Punal-Riobóo J, Acción BC, Ruano-Ravina A, García ML. Making processes reliable: a validated pubmed search strategy for identifying new or emerging technologies. Int J Technol Assess Health Care. 2012 Oct;28(4):452-9. http://www.ncbi.nlm.nih.gov/pubmed/22995101
What did they mean when they said “a validated PubMed search strategy”? Our MLA systematic review team, which is working on search strategies for identifying emerging technologies, was, shall we say, curious. For this article, it meant that they tested the search results against the next best method previously used (handsearching). The topic was emerging technologies, and what they did was select influential journals and scan the TOCs manually (which actually means by using their own eyeballs). The journals they scanned were: Science, JAMA, Lancet, Annals of Internal Medicine, Archives of Internal Medicine, BMJ, Annals of Surgery, Am J Transplantation, Endoscopy, J Neurology Neurosurgery Psych, Archives of Surgery, Annals of Surgical Oncology, British Journal of Surgery, and Am J Surg Path. Of the 35 articles from these journals that qualified, the search strategy accounted for 29. The ‘missing’ articles either lacked appropriate title words relating to the novelty of the concept, or used text words that had been removed from the search strategy to improve specificity (i.e., to reduce the total number of records retrieved).
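For anyone wanting to reproduce that kind of figure, the calculation is simply recall (sensitivity) against the handsearch gold standard. A minimal sketch; the counts come from the article, while the function name is mine:

```python
def sensitivity(retrieved_relevant: int, total_relevant: int) -> float:
    """Sensitivity (recall): the share of known relevant articles the search retrieved."""
    return retrieved_relevant / total_relevant

# Counts reported by Varela-Lema et al.: handsearching the journal TOCs
# yielded 35 qualifying articles, of which the PubMed strategy found 29.
print(f"{sensitivity(29, 35):.1%}")  # → 82.9%
```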
Is that an appropriate way to validate a search strategy? Probably a pretty fair approach for this one, IMHO, especially since they did such a good job of reporting the specific calculations and details of the actual findings of the searches. Is that how most search strategies are validated? Well, perhaps not.
What I’ve been doing to validate search strategies for systematic reviews is to test and compare the search results against a defined set of sentinel articles. The sentinel articles are selected by the team’s subject experts as good examples of articles that should be retrieved by a search on the defined question. The requirements beyond topic are that each sentinel article should be older than two years, newer than 1990 (this can be flexible, depending on the topic), and must meet all of the defined inclusion criteria for the review. I usually recommend that the pool of selected sentinel articles include no fewer than 3 and no more than 10 citations; this keeps complete success achievable, since with each added citation, retrieving all of them becomes harder. I also emphasize that the articles do not need to be excellent or required articles on the topic (i.e., “gold standard” articles); in my opinion, testing is actually more effective if the articles are a selection of relevant pieces, not necessarily the best ever written on the topic.
Draft versions of the search are tested against this set of articles, and if any “drop out” (are not retrieved), we then need to figure out why and decide whether to revise the search to include them, justify the exclusion, or request that NLM correct the coding error in that article’s record. In the last two cases, the exclusion must be reported in the methods. Ideally, one would also describe the strengths, weaknesses, and/or limitations of the search strategy.
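The “drop out” check is easy to mechanize once you have PMIDs for both the sentinel set and the retrieved set. A hypothetical sketch (the PMIDs below are placeholders, not real citations):

```python
def find_dropouts(sentinel: set[str], retrieved: set[str]) -> set[str]:
    """Return the sentinel articles the draft search failed to retrieve."""
    return sentinel - retrieved

# Placeholder PMIDs for illustration only.
sentinel_pmids = {"10000001", "10000002", "10000003", "10000004"}
retrieved_pmids = {"10000001", "10000003", "10000004", "10000099"}

dropouts = find_dropouts(sentinel_pmids, retrieved_pmids)
if dropouts:
    # Each dropout needs a decision: revise the search, justify the
    # exclusion in the methods, or ask NLM to correct the record.
    print("Investigate:", sorted(dropouts))
```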
Here are some citations to other ways in which searches are validated.
Hausner E, Waffenschmidt S, Kaiser T, Simon M. Routine development of objectively derived search strategies. Systematic Reviews 2012 1:19. http://www.systematicreviewsjournal.com/content/1/1/19
NOTE: This is basically the same “sentinel articles” approach described above.
Hausner E, Guddat C, Hermanns T, Lampert U, Waffenschmidt S. Development of search strategies for systematic reviews: validation showed the noninferiority of the objective approach. J Clin Epidemiol. 2015 Feb;68(2):191-199. http://www.sciencedirect.com/science/article/pii/S0895435614003874
NOTE: Interesting article that tests the reproducibility of Cochrane reviews and their reported search strategies. The emphasis is on the need for objective and reproducible search strategies in systematic review publications.
Van Walraven C, Bennett C, Forster AJ. Derivation and validation of a MEDLINE search strategy for research studies that use administrative data. Health Serv Res. 2010 Dec;45(6 Pt 1):1836-45. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3026961/
NOTE: Compared to handsearching.
Walsh ES, Peterson JJ, Judkins DZ. Searching for disability in electronic databases of published literature. Disabil Health J. 2014 Jan;7(1):114-118. http://www.sciencedirect.com/science/article/pii/S1936657413001647
NOTE: Very interesting two-part test to manage quality control of the search strategy. First, they used the method described above (comparison to sentinel articles); then, because the search excluded specific topic terms in favor of broad keyword searching, they validated it by comparing retrieval to the results of a known topic search.
Hempel S, Rubenstein LV, Shanman RM, Foy R, Golder S, Danz M, Shekelle PG. Identifying quality improvement intervention publications–a comparison of electronic search strategies. Implement Sci. 2011 Aug 1;6:85. doi: 10.1186/1748-5908-6-85. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3170235/
NOTE: Compared the relevance and quality of search strategies by having the results reviewed for relevance by independent experts. My personal misgiving about this method for validating a search is that it cannot test for what is missed that you don’t know about.
Tanon AA, Champagne F, Contandriopoulos AP, Pomey MP, Vadeboncoeur A, Nguyen H. Patient safety and systematic reviews: finding papers indexed in MEDLINE, EMBASE and CINAHL. Qual Saf Health Care. 2010 Oct;19(5):452-61. http://www.ncbi.nlm.nih.gov/pubmed/20457733
NOTE: Compared the sensitivity and specificity of new search strategies against previously published search strategies on the same topic. Validated by comparing to a large selection of sentinel articles. Very difficult to achieve, and an ambitious strategy!
Brown L, Carne A, Bywood P, McIntyre E, Damarell R, Lawrence M, Tieman J. Facilitating access to evidence: Primary Health Care Search Filter. Health Info Libr J. 2014 Dec;31(4):293-302. http://www.ncbi.nlm.nih.gov/pubmed/25411047
NOTE: Interesting strategy that first created and validated a search strategy in Ovid for quality control over the search development process, and then converted the strategy to PubMed and validated it again. The validation was again through a selected set of sentinel citations, but they explicitly selected the best-quality articles on the topic and referred to the set as the “gold standard.”
Damarell RA, Tieman JJ, Sladek RM. OvidSP Medline-to-PubMed search filter translation: a methodology for extending search filter range to include PubMed’s unique content. BMC Med Res Methodol. 2013 Jul 2;13:86. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3700762/
NOTE: Same strategy as the article by Brown, Carne…Tieman above, but a different topic.
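Several of the studies above report sensitivity and specificity for their filters. For reference, those two measures are computed as follows; this is a sketch with invented counts purely for illustration:

```python
def sens(true_pos: int, false_neg: int) -> float:
    # Proportion of relevant records the search retrieved.
    return true_pos / (true_pos + false_neg)

def spec(true_neg: int, false_pos: int) -> float:
    # Proportion of non-relevant records the search correctly excluded.
    return true_neg / (true_neg + false_pos)

# Illustrative counts only: 90 of 100 relevant records retrieved;
# 300 of 9,900 non-relevant records also retrieved (false positives).
print(f"sensitivity = {sens(90, 10):.1%}")    # → 90.0%
print(f"specificity = {spec(9600, 300):.1%}")  # → 97.0%
```

Note that specificity requires knowing (or estimating) the full set of non-relevant records, which is why filter studies that report it tend to work within a defined test collection.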
You can find more articles on this topic by exploring the following search results:
(validated OR validation OR “quality control” OR “quality assessment”) search strategy review http://www.ncbi.nlm.nih.gov/pubmed/?term=(validated+OR+validation+OR+%22quality+control%22+OR+%22quality+assessment%22)+search+strategy+review