Torbet, G. (2012). Review of: Ritchie, S. J., Wiseman, R., & French, C. C. (2012). Replication, replication, replication. The Psychologist, 25(5): 346-357; and Yong, E. (2012). Replication studies: Bad copy. Nature, 485(7398): 298-300. Neuropsychoanalysis, 14(2): 249.
Review by: Georgiana Torbet
“Bias and prejudice are attitudes to be kept in hand, not attitudes to be avoided.”
Academic psychology has been forced to do some self-examination of late, not only in the wake of high-profile cases of scientific fraud but also after the widely reported publication of some dubious results. The work of Daryl Bem (2011), which appeared to find parapsychological abilities such as clairvoyance in the general student population, has had an impact not only on the academic literature but also on the way that psychology experiments are published and publicized. When Bem's experiments were repeated by Ritchie, Wiseman, and French, researchers who were dubious about his results, they did not find the evidence of parapsychological abilities that he did. But they struggled to find a journal willing to publish their results, because those results were a repetition of previously published work.
This throws into the spotlight one of academic science's biggest problems: the bias inherent in which results are published and which never see the light of day. Known as the “file-drawer problem,” this bias arises because experiments that yield strong, positive, novel results are more likely to be published than those that yield negative findings. Thus, on reviewing the literature, one sees the results of only a fraction of the experiments that have actually been performed, and may be persuaded that a finding is more robust than it actually is.
Ed Yong examines how this issue is particularly relevant to psychology in a recent Nature news feature.