Ironically, Professor Ian Shaw's editorial focused primarily on the merits of qualitative methodology, not quantitative methods. So, with all due respect, Dr. Shaw, I would suggest a friendly amendment to your article title: "The Positivist [not Positive] Contributions of Quantitative Methodology to Social Work Research: A View from the Sidelines." Quite frankly, I found that Shaw's editorial tended to ignore the positives of quantitative research [of which there are many], choosing instead to focus on some of its widely recognized methodological limitations. In my life, I have reached the point where I realize that the need to be right is toxic. It truly does nothing to advance meaningful discussion, conversation, and/or thinking—in fact, just the opposite. Thus, I do not need to be right, but I should convey how I arrived at this appraisal of Dr. Shaw's statement. I will provide some empirical facts to support my contention, as for me this was the "problème majeur" [I do think it sounds so much nicer in French] with Dr. Shaw's article. His view of quantitative methods appears to have a "position flaw"; that is, he argued from the outside in (qualitative to quantitative), rather than the reverse. This obfuscated many of his claims in the article, some of which had real merit.

Evidence of this "position flaw" abounds in his think piece. Some noteworthy quotations are "I am prepared to hold quantitative objectivisms in one hand and qualitative revelations in the other" (para. 2); "to get to the point, writing as a convinced qualitative methodologist—roughly speaking, epistemologically constructivist and ontologically realist" (para. 5); and, a bit later, "I want to refer to the ways that qualitative strategies may gain from borrowing" [from quantitative ones—my words here].

Given that some may view these quotations as a bit out of context, I offer additional empirical proof through the Internet tool "Wordle," found at www.wordle.com. Wordle allows one to upload any number of words/texts into a program that instantaneously forms a magical "word cloud." The size of each word in the cloud [which you can alter in appearance but not content] is determined by its frequency [or perceived relevance] in the uploaded word file, so one can visually inspect which words in the file "pop up" more prominently in the word cloud. Figure 1 shows the word cloud generated from the uploaded text of Dr. Shaw's article on "Positive Contributions of Quantitative Methodology . . ." And Sheezam! Without a histogram, pie chart, frequency table, or parametric or nonparametric test, one can clearly see how the word "QUALITATIVE" pops up in the cloud more prominently. Wordle also has a word-count feature for each character string in the uploaded text, including commas. "Qualitative" is mentioned 31 times, and "quantitative" 26 times, in your article. So, Dr. Shaw, the empirical proof is "in the pudding"—or "pudding cloud," in this case.
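For readers who would like to check such tallies without Wordle, the short sketch below shows how the same kind of word-frequency count could be reproduced in Python. It is an illustration only, not Wordle's actual code, and the input filename shaw_editorial.txt is a hypothetical stand-in for a plain-text copy of the editorial.

```python
# Minimal sketch of a Wordle-style frequency tally (illustrative only).
# Assumes a plain-text copy of the editorial is saved locally as
# "shaw_editorial.txt" (hypothetical filename).
import re
from collections import Counter

with open("shaw_editorial.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Extract alphabetic tokens so punctuation does not inflate the counts.
words = re.findall(r"[a-z]+", text)
counts = Counter(words)

# Wordle scales each word's display size by a frequency such as this one.
for term in ("qualitative", "quantitative"):
    print(term, counts[term])
```

If the count for "qualitative" exceeds the count for "quantitative," the output makes numerically the same point the word cloud makes visually.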
Turning to the arguments offered about quantitative methods, I was also a bit confused about the point of departure for braiding your culminating thoughts so narrowly—"quantitative solutions to the problem of comparison, in particular through experimental and quasi-experimental intervention research" (para. 5). The truth is that at no point in social work's research history have experimental and quasi-experimental designs been the mainstay research designs of either social work academics or practitioners. For example, in a content analysis of N = 329 articles published between 2005 and 2007 in the three main North American empirical social work journals (Research on Social Work Practice, Journal of Social Service Research, and Social Work Research), I found (Holosko, 2010) that only 2.3% of studies used experimental designs and 5.1% used quasi-experimental designs. Conversely, 72.9% used a simple one-shot case study design without an intervention, and 9.3% used a single posttest after an intervention. I reiterate, then, that the research methods you refer to in the previous quotation, quasi-experiments and true experimental studies, are rarely used and decidedly uncharacteristic of the "bread and butter" designs used by social work researchers, as reflected in our professional journals.

I do both applaud and laud you for acknowledging and unearthing some crucial "research chestnuts" that laid the foundation stones for today's social work research and evaluation enterprise: The Origins of Scientific Sociology (Madge, 1963); Quasi-Experimentation: Design and Analysis Issues for Field Settings (Cook & Campbell, 1979); The Effectiveness of Social Casework (Fischer, 1976); and Toward Reform of Program Evaluation (Cronbach et al., 1980), to name a few that I also occasionally reread in awe from my own personal library. I was also rather moved by your correct assumption that our field seems to have not progressed significantly in our research