Tuesday, 25 August 2020

 COVID-19 Search Bank


NHS Health Education England virtually gathered a group of information specialists in April to create a "search bank" of work undertaken on COVID-19. The aim of this work was to enable sharing of information on the new coronavirus and reduce the time spent on literature searches. 


For the most part, entire search strategies and results are available on the search bank. I was involved as a peer reviewer of the strategies, as well as a contributor. For my purposes, I wanted to be able to replicate chunks of searches for my requesters, as I was often being asked questions with a slightly different slant. With the amount of literature being published daily on COVID-19, I would be wary of sending on someone else's results wholesale (and certainly never without due credit!), so I mostly used the bank for borrowing search strings and re-running these in the biomedical databases. However, a couple of times, where the question was very similar to mine, I did send on other librarians' results and used them as a springboard for my own update on the topic, which I then added to the bank.


The search bank can be found here: https://kfh.libraryservices.nhs.uk/covid-19-coronavirus/for-lks-staff/literature-searches/


Working on this project gave me a real insight into the different ways we practise as Clinical Librarians and how we present our results to requesters. There's a broad range across HEE, as I'd always suspected, and I don't think there's a single way of doing things that should be adopted by all; we clearly do what works best for our service users and for our services. 


There's definitely an argument over whether we as information specialists overemphasise the importance of the search strategy. I find it interesting that some database outputs place the strategy/search history at the beginning (e.g. Ovid) while others place it at the end (HDAS). It's not the most important thing for the end user, but I think it's essential for transparency and reproducibility, so I like to see it included in full; for me, the end of the results file is probably the right place for it. It's not often that the final line of the strategy is the only place the selected abstracts come from, so it doesn't always give a completely clear picture of the search, but it's a useful tool for seeing what the information specialist was up to. I often look at my colleagues' searches to prompt ideas for searching, and I always check relevant Cochrane strategies! 

 Clinical Librarian bibliography updated


The Clinical Librarian bibliography has been updated here: http://www.uhl-library.nhs.uk/cl/pdfs/clinical_librarian_bibliography.pdf

If you're aware of any articles that we've not included, please let us know!

Thursday, 12 March 2020

Using the Data Miner Chrome extension to collate Google results


This last week I ran a search for a rapid review which was particularly grey literature heavy. As a result, I ended up with several domain-specific search strings for Google. Originally, I thought I could either copy the first 100 results into a Word document for screening, or screen the results for relevance myself. However, the former is not the most user-friendly, and the latter would inevitably introduce some bias. Additionally, speed was key for this rapid review!

I decided to look into how I could possibly pull all the required information from the Google results, in a way that would enable the reviewer to efficiently screen themselves. My colleagues know that I LOVE a Chrome extension, and now I have found yet another one in the Data Miner Scraper.

Data Miner is a really handy tool which enables you to quickly scrape the information you want from any webpage. They have a bunch of public “Recipes” available which can pull off a range of information depending on your needs (under the "Public" tab). In this case, I simply needed the URL, title, and summary info (effectively what you would get as you’re scrolling through the results normally). I found a recipe that would do this, and simply ran it on each of the pages from my searches.

You can then export the results into Excel. Once I had it in this format it was also really easy to de-duplicate the results using the URLs. So in the end, I had a much more user-friendly format, with de-duplicated results, which would enable my colleague to screen the results more easily.
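As a rough illustration of that deduplication step, here's a minimal Python sketch that keeps the first occurrence of each URL; the column names and example rows are hypothetical, not Data Miner's actual export format:

```python
def dedupe_by_url(rows):
    """Keep the first occurrence of each URL, dropping trailing
    slashes and case so near-identical links collapse together."""
    seen = set()
    unique = []
    for row in rows:
        key = row["url"].rstrip("/").lower()
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# Hypothetical scraped rows for illustration
rows = [
    {"url": "https://example.org/report", "title": "Report"},
    {"url": "https://example.org/report/", "title": "Report (dup)"},
    {"url": "https://example.org/other", "title": "Other"},
]
print(len(dedupe_by_url(rows)))  # 2
```

Excel's Remove Duplicates on the URL column does the same job; the sketch just makes the logic explicit.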

I am yet to look into whether this could be uploaded to a reference manager so the results could all be screened in one place (seems feasible?). However, for now I’m glad I’ve found Data Miner and I know I’ll be using it in more searches in the future.


  • You can download Data Miner here (a big bonus was that I didn’t need permission from IT!).

  • They also have a YouTube channel with lots of tutorials which can be accessed here.


COVID-19 resources

Our colleague Keith Nockels at the University of Leicester is currently putting together information on COVID-19 on his blog. Take a look here: http://browsing.blogspot.com/2020/01/outbreak-of-novel-coronavirus.html

Using Global Index Medicus in systematic reviews


I recently had cause to search Global Index Medicus (GIM) from the World Health Organization (WHO) as part of an international systematic review that I’m working on. The team specifically needed to search databases that covered low- and middle-income countries (LMICs). Global Index Medicus covers five regional indexes: Africa Index Medicus (AIM), the Index Medicus for the Eastern Mediterranean Region (IMEMR), the Index Medicus for the South-East Asia Region (IMSEAR), Latin American and Caribbean Health Sciences Literature (LILACS), and the Western Pacific Region Index Medicus (WPRIM). Through the WHO interface it is possible to search all of these databases combined, or individually.

When it came to constructing the search strategy for the systematic review, I’d never used GIM before. I had a look around for any guides to searching the database effectively and came up blank.

Through some experimentation, I found that an advanced search allowed me to search using MeSH descriptors, and having already built my search in Medline, that meant transposing the terms should be straightforward. However, the advanced search did not allow me to build a strategy in the way I would in other databases, as the search lines were not numbered. To get around this, I searched for all of the MeSH terms and keywords for each concept, combined with the OR operator. Once that search was run, GIM gave me a single-line search in the search box, which I copied to a Word document. I repeated this process for each concept in my search strategy, ending up with three single-line search strings (each concept's terms combined with OR), which the database then allowed me to combine with the AND operator using the Advanced Search feature. It was a lengthy search strategy and a lengthy process!
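To make the process concrete, here's a small Python sketch of how those single-line concept strings fit together. The field syntax, descriptors, and keywords below are purely illustrative assumptions, not taken from GIM's documentation:

```python
def concept_line(terms):
    """Combine all the MeSH descriptors and keywords for one concept with OR."""
    return "(" + " OR ".join(terms) + ")"

# Hypothetical concepts for illustration only
population = concept_line(['mh:"Diabetes Mellitus"', "diabet*"])
intervention = concept_line(['mh:"Telemedicine"', "telehealth", "telemedicine"])

# The single-line concept strings are then combined with AND
full_search = " AND ".join([population, intervention])
print(full_search)
# (mh:"Diabetes Mellitus" OR diabet*) AND (mh:"Telemedicine" OR telehealth OR telemedicine)
```

Keeping each concept's line in a Word document, as described above, means the whole strategy can be reassembled or re-run later.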

What I discovered:
  • Adjacency operators did not seem to work, or at least, ADJ and NEAR were not recognised by GIM.
  • It is possible to select and download references within the search results
  • The best way to view all results was by downloading the .csv file and using Excel to read (which is often what the systematic reviewers I work with want to see) – there was no limit of numbers and I was able to download 3000+ results in a single file
  • Downloading .ris files for reference management software could only be done in batches of 100 (so quite a time consuming process)
  • Keep a copy of your strategy safe so you can re-run it immediately prior to submission to your journal of choice
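Given that 100-record batch limit, one way to stitch the downloads back together is simply to concatenate the .ris files, since RIS records are self-delimiting (each record ends with an ER tag). A minimal sketch, assuming the batches have been saved into one folder (file and folder names are hypothetical):

```python
from pathlib import Path

def merge_ris_batches(folder, output):
    """Concatenate batched RIS downloads into a single file for
    import into reference management software."""
    batches = sorted(Path(folder).glob("*.ris"))  # list evaluated before output is created
    with open(output, "w", encoding="utf-8") as out:
        for batch in batches:
            out.write(batch.read_text(encoding="utf-8").strip() + "\n\n")
    return len(batches)
```

Most reference managers will then import the combined file in one go, rather than 30-odd batches.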

If you want a real global slant on your systematic review, it’s a fantastic resource that’s free to use, it will just take a little bit of time to negotiate.


Thursday, 7 November 2019

Helping with Systematic Reviews


Just after I started last Autumn, I was really excited to have the opportunity to get involved with helping design and run the searches for a few systematic reviews.

In the last few weeks, two of those systematic reviews have been published! They are:


It’s great to see that not only are they now published, but I’m very grateful that both authors chose to include me as an author too. It’s incredible to look back at how many SRs we’ve helped with this last year at UHL – I’m currently helping on my 17th!

Whilst the projects themselves were really interesting, it was also a great learning opportunity for me as a new Clinical Librarian too. In 2019, I’ve also had the opportunity to go on two courses on systematic review searching. The main things I’ve learnt are:
  1. Whilst daunting for a perfectionist at first, once you’ve done a few, they’re the best thing! It’s really satisfying creating a sensitive strategy that fits the question well.
  2. Don’t be afraid to experiment with different databases, and with different fields.
  3. Keep open communication with requestors, and manage expectations. I learnt from my colleagues about the benefits of scoping strategies, and now I use them every time. They really help gauge whether I’ve understood the question correctly, and pin down what the requestor wants, as well as giving them a good idea on result numbers.
  4. Run tests to check the key articles have been found. This was emphasised on the courses, but I have also found it reassures requestors, and helps you identify any terms that might have been missed.
My final learning point is that librarians deserve credit for their input! Developing high-quality strategies is a real skill and can take weeks. I’m now much less daunted by the prospect of asking for acknowledgement or authorship for my contributions upfront.



Wednesday, 5 June 2019

A handy resource for updating systematic review searches


As someone who has only worked on systematic reviews for the past year or so, I am also fairly new to the concept of re-running searches.

Previously I thought it was as simple as re-running a strategy, limiting it to the years covering the interim period, and sending off the RIS files for deduplication against the existing results. However, when I was on the Systematic Reviews course provided by the CRD team at the University of York, this topic came up, and we were told it’s a much more complicated process than you’d think. In fact, it takes up a whole separate course by itself!

So when I was asked by one of the Fellows to run an update search for their systematic review, I knew I needed to look into how to do this properly. The problem is that limiting by date only limits by publication date, not by when the record was added to the database. A record published before the original search but only indexed afterwards would therefore be missed entirely if you limit by publication date alone. However, searching by the date added to the database was not as straightforward as it seemed.
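The gap is easiest to see with a toy example; the records and dates below are invented purely for illustration:

```python
from datetime import date

# Hypothetical records: (title, publication year, date added to database)
records = [
    ("Old paper, indexed late", 2017, date(2020, 2, 1)),
    ("New paper", 2020, date(2020, 3, 1)),
    ("Old paper, indexed early", 2017, date(2018, 1, 1)),
]
original_search = date(2019, 6, 1)  # when the first search was run

# Limiting by publication year misses the 2017 paper that was only indexed in 2020
by_pub_year = [r for r in records if r[1] >= 2019]

# Limiting by date added to the database catches everything new since the first search
by_date_added = [r for r in records if r[2] > original_search]

print([r[0] for r in by_pub_year])    # ['New paper']
print([r[0] for r in by_date_added])  # ['Old paper, indexed late', 'New paper']
```

This is why the update search needs to filter on the date the record entered the database, not the date the article was published.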

I quickly found talk of using the “date delivered” field on Ovid, but no specific help on the correct syntax to use. That’s when I found this wonderful page from the McGill library:


They’ve created a really useful table with templates of the correct syntax for each database:





As I tend to use the native interfaces for systematic reviews, this is an absolute lifesaver. I’m yet to discover how this is possible if using HDAS (if at all), but for the moment this great resource is firmly bookmarked.

Do you have any tips or tricks for re-running searches? Let us know!