Sage Publications, Inc.
2455 Teller Rd.
Newbury Park, CA 91320
$49.95 (c) / $24.00 (p)
Description:
This 517-page book was originally published as a special issue of the
Journal of Social Behavior and Personality. It now appears as a
paperback from Sage Publications. Their choice to make this material more
broadly available is a good one that should be applauded by social scientists.
It is in keeping with their claim to be the "International Professional
Publishers."
There are 36 chapters by different authors. The pages are
densely packed with a typeface that is rather small but nonetheless clear and
legible. The first 90 pages present eight chapters that deal with the basic
issue of replication of scientific studies. These selections discuss the importance of replication, what
replication is, the "file drawer" problem, how to tell when
replication has occurred, and the bias of journal editors against publishing
studies that are replications. The second section contains seven reports of
classic replications; the third section, ten reports of replications in
psychology; the fourth section, replications in the study of communication; and
the final section, four reports of replications in other disciplines.
Discussion:
At least the first portion of this book should be read and
carefully studied by all professionals who are consumers of research done in
the behavioral sciences. This includes mental health professionals, attorneys,
judges, journalists, law enforcement personnel, and any others whose lives may
be affected by the quality and reliability of what is held out to be research
data. In the justice system, the popular media, and debates over social change,
the prevailing understanding of research data is often limited, possibly
incorrect, and potentially damaging. Research studies are treated like hits in a
baseball game: each one is a separate, independent event, the average appears
to have some currency, but what really counts is the box score. In the courtroom,
studies are often traded like points in a ball game or a tennis match:
"Well, you have that study, but I have this one. So there!" The listener or,
in many cases, the finder of fact is then left to tot up the bottom line and
decide which side wins. This book will correct such misconceptions.
Possibly the most disconcerting finding reported in the book
is the demonstration that there truly is a bias against publishing replications.
This leaves little or no chance for the supposedly self-correcting nature of
science to operate. The implications may be seen in the furor over the cold
fusion experiments. If the failures to replicate the claims of successful cold
fusion had never been published, the entire world could now be heading down a
fruitless path, spending billions on cold fusion plants with no possibility of
producing energy. Basing decisions on single,
unreplicated studies is likely to introduce an unknown amount of error. If
evidence of a failure to replicate exists but is buried somewhere unpublished,
it may be a long time before the error is recognized.
Beyond the first section, the reports of replication studies flesh out and
illustrate the problems of replication, show how to tell when replication has,
in fact, taken place, and sharpen the ability to distinguish credible research
from research that should be treated with greater caution. Learning what
constitutes good research makes it possible to ask appropriate questions and
probe the understanding of a witness who presents a claim purported to be based
on research evidence. Strong, credible research data can then be distinguished
from data that are weak or unsupported.
Reviewed by Ralph Underwager, Institute for Psychological
Therapies, Northfield, Minnesota.