
Innovations for Youth/YEDI/YAAH Library Sessions: Evaluate What You Find

Critically Evaluating What You Find

Evaluation is about determining the quality, value, or importance of something in order to take action. It is underpinned by the systematic collection of information and evidence.

What is evidence? Things to keep in mind:

  • All research is (potentially) "evidence" and there are no "perfect" studies.
  • Is there an agenda (bias)?
    It is doubtful that any study of humans is entirely free of bias, whether in the study design, in the authors' pre-existing beliefs, or in the source of the research funds. What matters is how bias was controlled in the methodology and how significant it is in any particular study.
  • Is qualitative research "evidence"?
    If your goal is to understand beliefs and meanings in the group with whom you are working, then qualitative studies can be important.
    Read Criteria for evaluating evidence on public health interventions (J Epidemiol Community Health. 2002 Feb;56(2):119-27)

Who pays for science?
Most scientific research is funded by government, companies doing research and development, and non-profit entities. Because science is attempting to get at some "truth," the source of research funding shouldn't have a significant effect on the outcome of scientific research, right?

Is it race? or is it racism?
Race is a social construct, yet most articles describing racial disparities ascribe them to race, not to racism.

Peer review
Peer review refers to a process whereby scholarly work (i.e., an article) is reviewed and critiqued by experts to ensure it meets certain standards of acceptance before it is published.
Does this process make for better science?

What gets researched? What gets published? ("Publication bias"); What (or Who) gets funded?
Studies that report interventions that had no effect are less likely to get published. What does this mean in terms of the state of knowledge on a topic?

There is also evidence of disparities in grant funding for research, as well as disparities in recruitment for clinical trials.

Sometimes topics get researched simply because of "scientific inertia."

Oops! I made a mistake (or... was it cheating?)
Occasionally, researchers make mistakes, and sometimes those mistakes affect the conclusions of a published article. Articles may be retracted if the mistake is significant. This is a formal process where the author or journal publishes a statement outlining the error. Sometimes, however, retraction is the result of fraud, plagiarism, or other bad acts.

Opinion or fact?
Do the conclusions of the article follow the evidence that's presented? Are opinions or notions posited as facts?

CV boosting: Does this study add to the body of knowledge, or is it just something the author is doing to add to their list of publications?
(In)significance of a single study: Science is incremental. Beware of any study that's proclaimed to be a "breakthrough."


What is the PoV?

Consider the perspective, or point of view, from which the work was written.

What to Consider When Looking at Survey or Estimated Data

  • Look at sample sizes and survey response rates (see the margin-of-error sketch after this list)
    • Representative of your population?
    • Enough responses to be valid?
  • Who was surveyed?
    • Representative of the population being compared to?
    • Does it include the group you are interested in?
    • Were the survey respondents from heterogeneous groups?
    • Do the survey questions mean the same things to members of different groups?
  • How was the survey conducted?
  • What assumptions and methods were used for extrapolating the data?
    • Is there any bias?
    • Is the method appropriate?
  • Look at definitions of characteristics:
    • Do these match your own definitions?
  • When was the data collected?
    • How old is too old?
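
The first item in the list above asks whether a survey has enough responses to be valid. As a rough, purely illustrative sketch (the 95% confidence level, the 3% target margin, and the example counts are assumptions, not values from any real survey), the Python code below computes a simple margin of error for a survey proportion and the minimum number of responses needed to reach a target margin.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a survey proportion.

    n: number of completed responses
    p: observed proportion (0.5 is the most conservative choice)
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

def minimum_sample_size(target_moe, p=0.5, z=1.96):
    """Smallest n whose margin of error is at or below target_moe."""
    return math.ceil((z ** 2) * p * (1 - p) / target_moe ** 2)

# Hypothetical example: 400 completed responses out of 2,000 people invited.
n_responses = 400
print(f"Response rate: {n_responses / 2000:.0%}")                    # 20%
print(f"Margin of error: +/- {margin_of_error(n_responses):.1%}")    # about +/- 4.9%
print(f"Responses needed for +/- 3%: {minimum_sample_size(0.03)}")   # 1068
```

Note that a small margin of error does not fix a low response rate: if the 80% who did not respond differ systematically from the 20% who did, the results can still be unrepresentative of your population.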

How is race/ethnicity reported in the studies you read?

  • Who identified race/ethnicity of respondents/participants?
  • Does the language in the article impart bias?
  • Is race acknowledged as a social construct?
  • Are differences reported as associated with "race" or "racism"?
  • Are participants' identities disaggregated?

Reliability and validity

Reliability refers to how free the data collection is from "measurement error."

  • Is the survey written at a reading level too high for the people completing it?
  • Is the device used to measure elapsed time in an experiment accurate?

Validity refers to how well a measure assesses what it claims to measure.

  • If the survey is supposed to measure "quality of life," how is that concept defined?
  • How accurately can this animal study of drug metabolism be extrapolated to humans?
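
To make the distinction concrete, here is a small, purely hypothetical simulation of a scale that is reliable (its readings barely vary) but not valid (every reading is biased upward); all of the numbers are invented for illustration.

```python
import random
import statistics

random.seed(1)

TRUE_WEIGHT_KG = 70.0   # the actual quantity we want to measure (hypothetical)
BIAS_KG = 5.0           # systematic error: this scale always reads high
NOISE_SD_KG = 0.2       # small random error, so repeated readings agree closely

# Ten repeated measurements of the same person on the same scale.
readings = [TRUE_WEIGHT_KG + BIAS_KG + random.gauss(0, NOISE_SD_KG)
            for _ in range(10)]

spread = statistics.stdev(readings)                      # small spread => reliable
avg_error = statistics.mean(readings) - TRUE_WEIGHT_KG   # large error => not valid

print(f"Readings: {min(readings):.1f} to {max(readings):.1f} kg")
print(f"Spread across readings (reliability): {spread:.2f} kg")
print(f"Average error vs. true value (validity): {avg_error:+.1f} kg")
```

The same logic applies to surveys and instruments: a measure can be consistent (reliable) yet still miss the concept it claims to capture (invalid), but an inconsistent measure can never be very valid.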

Adapted from Chapter 3, Conducting research literature reviews: from the Internet to paper, by Arlene Fink, 2010.