What is evidence based health information?
Just as a detective searches for evidence to solve a crime, so too do health care professionals look for evidence to guide them in their practice. The detective must have evidence to support or refute the accused person's guilt. In a similar fashion, the healthcare professional must have evidence to support their proposed course of treatment. They search for information that will help them to provide the most effective or beneficial form of care to their clients. With such a wealth of information available, the process of discovering which information is best can be difficult and time consuming. It would take an enormous amount of time for your doctor to sift through the copious research and information available on a particular topic. This is where organisations such as the Joanna Briggs Institute come into play. We collect, analyse and synthesise the information for health professionals, providing them with the evidence on which to base their practice. In this way, we conduct the ‘detective’ work for them.
Research as a second language: understanding the jargon!
Learning about research and health information can be like learning a new language and you often feel as though you need a degree just to understand it! There are plenty of research terms and jargon that can be very confusing if you don’t know what they mean.
When you think about it, we are really asking you to learn not just one, but two new languages: the language of research and the language of health care. Researchers use words such as ‘odds ratio’ or ‘placebo’ and health professionals use words like ‘ambulate’ or ‘erythema’ … apparently they are all speaking English, but do you understand what they are talking about? Translated, these words are actually very easy to understand. An odds ratio is essentially a way of comparing risk; a placebo is a ‘fake’ treatment; ambulate simply means to move; and erythema means redness of the skin.
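To make the idea of an odds ratio concrete, here is a small sketch in Python. The numbers are entirely made up for illustration; they do not come from any real study.

```python
# Illustrative only: computing an odds ratio from a 2x2 table.
# All figures below are hypothetical, invented for demonstration.

def odds_ratio(exposed_events, exposed_no_events,
               unexposed_events, unexposed_no_events):
    """Odds of the outcome in the treated group divided by
    the odds of the outcome in the untreated group."""
    odds_exposed = exposed_events / exposed_no_events
    odds_unexposed = unexposed_events / unexposed_no_events
    return odds_exposed / odds_unexposed

# Hypothetical trial: 20 of 100 treated patients improved (20 vs 80),
# while 10 of 100 untreated patients improved (10 vs 90).
result = odds_ratio(20, 80, 10, 90)
print(round(result, 2))  # 2.25
```

An odds ratio above 1 suggests the outcome was more likely in the treated group; here the treated patients had 2.25 times the odds of improving.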
It is important not to let the jargon scare you off! If there is something you really want to know more about and you have found some research papers about it, have a read, highlight the words you don’t understand and, with the help of a dictionary or health professional, it isn’t too difficult to work out what they are talking about.
What is a systematic review?
A systematic review is how we conduct our detective work. It is a summary of all past research on a topic of interest. Phase one of the investigation involves searching for the information. Systematic reviews use an explicit, rigorous and comprehensive approach to reviewing the literature.
Once the information has been identified, the ‘fieldwork’ part of the investigation is complete. Phase two of the investigation involves evaluation of the information that has been identified. Reviews such as these rely on sound methods that enable us to combine the findings from multiple studies, and present those findings in a single document.
The information gathered through the systematic review process is presented and disseminated in a number of ways by the Institute. This is the final phase of our investigation. We start by producing a full systematic review; this information is then condensed into what we call a ‘Best Practice Information Sheet’, which is used as a quick reference for health professionals in hospitals, community care, schools and general practice; this information is re-written once again to produce easy-to-understand information pamphlets for consumers.
The right research for the right question
Research is a way of collecting evidence and there are many ways in which to do this. There are many different types of research, and the choice of research method will largely depend on its suitability or appropriateness to answer the research question. Generally, research methods fall into one of two categories: qualitative or quantitative. Qualitative research is concerned with finding answers to questions centred on human experience (how people feel) while quantitative research generally deals with cause and effect (how many benefited).
Qualitative and quantitative methods of research are both quite valid, producing valuable information that can assist in making health care decisions. Traditionally, qualitative research results have not been included in the critical appraisal and systematic review of research evidence and this is where JBI differs from many of the organisations that conduct this type of work. JBI knows that you want to know which treatments and interventions work and we also know that you want information that can assist you in knowing how that treatment might make you feel.
The JBI believes qualitative work must be included when considering the evidence. Qualitative work considers how people feel, and their experiences, rather than looking at two treatments and saying one is better than the other. Rather than limiting systematic reviews of evidence to quantitative work JBI is prepared to consider all forms of evidence. In doing so we believe that we provide a more complete picture of the treatment options available.
For example, a quantitative study might show vacuum assisted drainage (a device to assist the draining of a wound) is a more effective method for treating wounds, but a qualitative study might suggest there is immense pain associated with it, so the treatment might be inappropriate. The two forms of research can complement each other and, importantly, help determine the best treatment for the patient.
Reporting of research results
As a consumer of health care, it is important to ‘Just Be Involved’ in making decisions about the treatment or care that you receive. However, the volume of information available to consumers in health-related books, videos, CD-ROMs, magazines, newspapers, television, the internet and from other people can be daunting. With so much information readily available to us, it may be hard to distinguish between reliable and unreliable information.
We have all heard terms used in the media like ‘latest research’ or ‘scientific breakthrough’ on a regular basis. Sometimes news stories about health issues seem to contradict one another - one says something is good for you, and another says the same thing is bad for you. Whether research results are reported in a medical journal or in the popular media, they should always be examined for quality and/or relevance.
All too often we accept the results of research without question - after all, an expert has conducted it, right? Well, yes, but this does not guarantee that the findings are reliable or of high quality, and this should not be the only criterion by which we judge health information. As we have demonstrated, research comes in all shapes and sizes and some types are more reliable than others. This is not always demonstrated when results are published or reported, and there are a variety of other surrounding issues to consider when assessing the quality of a piece of research. Influences on research outcomes can include the funding body (for example with regard to research on drugs), or the research methods used.
Next time you read a report in the newspaper or see a report on television, listen very carefully to the language that is used. Do they provide you with any background regarding previous research in the area? Has the research they are reporting on been done more than once? Are they planning on continuing their research on that treatment? Do they mention how many people the treatment was tested on?
All of these things are of vital importance when considering information related to a specific health topic. You aren’t expected to know all of the answers, but it is useful to be able to appreciate that not all information is based on solid, reliable evidence and to be able to feel confident about asking questions about information you are presented with. The most important thing is to BE INVOLVED!
Assessing the quality of information
Evidence comes in all shapes and sizes, from many different sources. Evidence can play a critical role in any investigation and it is important for detectives to recognise evidence that will provide ‘reliable’ information to aid in the investigation. This is also the case when assessing health information, as some types of evidence are more reliable than others. So once you have gathered all of the evidence, how do you assess how reliable it is? This is where ‘rating the evidence’ comes into play.
For every publication produced by the Joanna Briggs Institute, the evidence related to the topic being investigated is assessed for reliability and quality. We do not rate a procedure or treatment, but the evidence (or research) that is available to support it.
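To give a rough sense of what ‘rating the evidence’ means in practice, here is an illustrative sketch based on the widely used hierarchy of quantitative study designs. This is a simplified teaching example, not JBI's actual rating scale.

```python
# Illustrative only: a simplified hierarchy of quantitative study
# designs, from strongest to weakest evidence. This is a common
# textbook ordering, NOT the Joanna Briggs Institute's own scale.
HIERARCHY = [
    "systematic review of randomised controlled trials",  # strongest
    "randomised controlled trial",
    "cohort study",
    "case-control study",
    "case series or case report",
    "expert opinion",                                     # weakest
]

def evidence_level(study_type):
    """Return a 1-based rating: level 1 is the strongest evidence."""
    return HIERARCHY.index(study_type) + 1

print(evidence_level("randomised controlled trial"))  # 2
print(evidence_level("expert opinion"))               # 6
```

The point of any such scale is the same as ours: it rates the research supporting a treatment, not the treatment itself.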
Traditionally, the Joanna Briggs Institute has used established evidence ratings from other organisations. These rating systems generally deal with quantitative research and so the JBI is in the process of developing a rating system that deals more broadly with evidence from both quantitative and qualitative research. We are not only interested in how ‘effective’ a treatment is, but how ‘feasible’, ‘appropriate’ and ‘meaningful’ it is. What do we mean by these terms exactly? Well, let me explain.
Effectiveness relates to how well a treatment works. The effectiveness of a treatment is obviously very important, but it can be influenced by a number of things.
Feasibility of a treatment relates to how achievable a treatment is. When discussing feasibility it is important to take into consideration the cost of the treatment and the availability of equipment or medication required to carry out the treatment.
Appropriateness of a treatment relates to how suitable a particular treatment is. For example, vacuum assisted drainage (a device to assist the draining of a wound) may be a more effective method for treating wounds, but for some people there is immense pain associated with it, so the treatment might be inappropriate.
Meaningfulness of a treatment relates to the patient’s experience of that treatment. For example, research that investigates the experience of women with breast cancer is concerned with what that experience ‘means’ to the patient.
A good example of how these elements complement each other is the use of compression stockings to treat leg ulcers. Compression stockings have been shown to be cost effective, convenient and to have minimal side effects, making them a ‘feasible’, ‘appropriate’, and ‘effective’ treatment option. However, if the patient’s experience (meaningfulness) of compression stockings is that the stocking is uncomfortable to wear and they refuse to wear it, the other three are compromised.
We all appraise and evaluate information in all sorts of ways to make decisions every day. It may be that we are assessing different brands of a product before deciding which to buy. We may have heard about the product through an advertisement in a magazine, on the radio or television, maybe our friends or family told us about the product or maybe we simply compared the product labels and price while in the supermarket. What we have done is critically appraise the information we have gathered in order to make a decision about which one to purchase. We do the same thing with our health care decisions (or if we don’t – we should) when deciding what course of treatment to follow for a specific condition or illness.
Assessing and critically appraising research evidence can be a complex, difficult and time consuming process – even for the professionals! One of the reasons this process can be so confusing is that there are so many different types of research results to appraise. This is why organisations such as JBI are so useful – because they conduct the evaluation for you. However, while we can provide information on a wide range of treatments and interventions, there may be a topic that you would like to investigate further that is not covered in our database.
It is important to know whether the information you are reading is relevant, valid and reliable. When assessing the strengths and weaknesses of research there are several questions you may wish to consider.
Research questions are often very specific and so it is important to make sure that the research you are reading about is directly related to the issue you are interested in investigating. For example, if you suffer from diabetes not all research on diabetes will be of interest to you. Some will be concerned with different types of diabetes, others with diet and nutrition, or exercise.
This relates to the participants or subjects recruited and included in the research. It is important not to assume the results of a piece of research are applicable to you and your condition because of a vague connection. As in the case of the diabetes example, you may be a young, active healthy male who has diabetes and you identify a research article related to diabetes and exercise. However, the research may have been conducted on elderly, inactive males and so results would not necessarily be applicable to your situation.
If a study has been conducted efficiently and reported well, there are a number of things you should easily be able to identify without knowing a great deal about research and its jargon. Background information should be provided telling you what research has been previously conducted in the area and how the current research differs. The method (how the research was conducted), results (what they found) and discussion (what the results mean) should be clear and concise.
If you are interested in further reading about the critical appraisal of research evidence, a couple of suggestions are:
Irwig J, Irwig L & Sweet M (1999). Smart Health Choices: How to Make Informed Health Decisions. Allen and Unwin, NSW, Australia.
Greenhalgh T (2000). How to Read a Paper: The Basics of Evidence Based Medicine. BMJ Publishing Group, UK.
Page last modified 05/11/2012
JBI Consumer Information Sheets are only accessible to subscribers of JBI COnNECT+.
Copyright © 2011 The Joanna Briggs Institute
THE JOANNA BRIGGS INSTITUTE
The University of Adelaide, South Australia 5005 AUSTRALIA
Telephone: +61 8 8313 4880 | email
CRICOS Provider Number 00123M