Research Synthesis Methods
Research Synthesis Methods, the official journal of the SRSM, is available on the Wiley Online Library.
Professional Associations
The Cochrane Collaboration - Cochrane is a global independent network of health practitioners, researchers, patient advocates and others, responding to the challenge of making the vast amounts of evidence generated through research useful for informing decisions about health.
The Campbell Collaboration - The Campbell Collaboration is an international research network that produces systematic reviews of the effects of social interventions in Crime & Justice, Education, International Development, and Social Welfare.
Software
Comprehensive Meta-Analysis - Commercial meta-analysis software that combines ease of use with a wide array of computational options and sophisticated graphics.
OpenMEE - Open source and cross-platform meta-analysis software geared towards ecologists and evolutionary biologists.
RevMan - Review Manager software provided by the Cochrane Collaboration for preparing and maintaining Cochrane reviews.
Links
SRSM members who would like their work posted should email it to Melissa.
Schmidt, F.L. 2017. Statistical and measurement pitfalls in the use of meta-regression in meta-analysis. Career Development International, Vol. 22 Issue: 5, pp. 469-476. https://www.emeraldinsight.com/doi/abs/10.1108/CDI-08-2017-0136
Schmidt, F.L. 2016. Beyond questionable research methods: the role of omitted relevant research in the credibility of research. Archives of Scientific Psychology, 5, pp. 32-41. http://dx.doi.org/10.1037/arc0000033
Schmidt, F.L., Oh, I. 2016. The crisis of confidence in research findings in psychology: is lack of replication the real problem? Or is it something else? Archives of Scientific Psychology, 4, pp. 32-37. http://dx.doi.org/10.1037/arc0000029
Schmidt, F.L., Oh, I. 2013. Methods for second order meta-analysis and illustrative applications. Organizational Behavior and Human Decision Processes, Vol. 121 Issue: 2, pp. 204-218. https://www.sciencedirect.com/science/article/pii/S0749597813000368
Schmidt, F.L. 2013. Eight common but false objections to the discontinuation of significance testing in the analysis of research data. In What If There Were No Significance Tests?, edited by Lisa L. Harlow, Stanley A. Mulaik & James H. Steiger, pp. 37-64. https://www.researchgate.net/publication/285020701_What_if_there_were_no_significance_tests
Le, H., Schmidt, F.L., Harter, J.K. & Lauver, K.J. 2010. The problem of empirical redundancy of constructs in organizational research: An empirical investigation. Organizational Behavior and Human Decision Processes, Vol. 112, pp. 112-125. https://ac.els-cdn.com/S0749597810000245/1-s2.0-S0749597810000245-main.pdf?_tid=ebcbed1a-8ffa-4654-bef6-1d3d93d5b578&acdnat=1546551328_2faa5b52229ff82f9f4a6e5d0e468def
Le, H., Schmidt, F.L. & Putka, D.J. 2009. The multifaceted nature of measurement artifacts and its implications for estimating construct-level relationships. Organizational Research Methods, Vol. 12 Issue: 1, pp. 165-200. https://journals.sagepub.com/doi/abs/10.1177/1094428107302900?journalCode=orma
Schmidt, F.L., Oh, I. & Hayes, T.L. 2009. Fixed- versus random-effects models in meta-analysis: Model properties and an empirical comparison of differences in results. British Journal of Mathematical and Statistical Psychology, Vol. 62 Issue: 1, pp. 97-128. https://onlinelibrary.wiley.com/doi/full/10.1348/000711007X255327
Schmidt, F.L. 2008. Meta-analysis: A constantly evolving research integration tool. Organizational Research Methods, Vol. 11 Issue: 1, pp. 96-113. https://doi.org/10.1177/1094428107303161
Schmidt, F.L., Le, H. & Ilies, R. 2003. Beyond alpha: An empirical examination of the effects of different sources of measurement error on reliability estimates for measures of individual-differences constructs. Psychological Methods, Vol. 8 Issue: 2, pp. 206-224. http://psycnet.apa.org/record/2003-06499-010
Schmidt, F.L., Hunter, J.E. 1996. Measurement error in psychological research: Lessons from 26 research scenarios. Psychological Methods, Vol. 1 Issue 2, pp. 199-223.
Schmidt, F.L. 1996. Statistical significance testing and cumulative knowledge in psychology: Implications for training of researchers. Psychological Methods, Vol. 1 Issue: 2, pp. 115-129. https://www.researchgate.net/publication/232518319_Statistical_significance_testing_and_cumulative_knowledge_in_psychology_Implications_for_training_of_researchers
Hunter, J.E., Schmidt, F.L. 1994. Estimation of sampling error variance in the meta-analysis of correlations: Use of average correlation in the homogeneous case. Journal of Applied Psychology, Vol. 79 Issue: 2, pp. 171-177.
Law, K.S., Schmidt, F.L. & Hunter, J.E. 1994. A test of two refinements in procedures for meta-analysis. Journal of Applied Psychology, Vol. 79 Issue: 6, pp. 978-986. http://psycnet.apa.org/record/1995-12097-001?doi=1
Schmidt, F.L., Law, K.S., Hunter, J.E., Rothstein, H.R., Pearlman, K. & McDaniel, M. 1993. Refinements in validity generalization methods: Implications for the situational specificity hypothesis. Journal of Applied Psychology. Vol. 78 Issue: 1, pp. 3-12. http://psycnet.apa.org/record/1993-20116-001