Published in: QMiP Bulletin, 13, 59-63. [Not peer-reviewed]

Do you really need a methodology?

Kerry Chamberlain
School of Psychology, Massey University, Auckland, New Zealand

In August 2010 I attended the QMiP Conference in Nottingham, where I made a presentation in the 5-minute challenge slot. That was the genesis for this short piece, although it started from a misconception. This occurred because I misread the call for the 5-minute challenge presentations; the call was actually intended as a challenge to the presenters, to present their research within a five-minute time slot. But reading it too quickly, with pre-formed assumptions, I took it as directing the challenge to the audience, to challenge their thinking. So I submitted an abstract entitled “Why you don’t need a methodology”; fortunately, the organisers overlooked my careless reading and accepted the presentation. So I spoke, for five minutes, with five slides, arguing a case for why you don’t need a methodology in qualitative research. Actually, this proved to be only half the point, since I also argued that you would inevitably, and always, have a methodology if you worked appropriately – by determining, critically and reflexively, the best way to proceed; by carefully choosing relevant and appropriate methods for the piece of research that you wanted to do. Reflecting on this talk later, I realised the challenge would have been better stated as “How to get a methodology”. Several people approached me afterwards for a copy of the talk; since at that stage it was only five visual images on a presentation and some ideas in my head, I couldn’t oblige. This is a chance to respond to those requests.

So first, what’s the problem with methodology? Well, my argument was, and still is, that too many qualitative researchers, particularly in psychology, don’t think carefully and critically enough about methodology. Instead, they go looking for a pre-existing methodology, seeking to find one off the shelf – one that someone else has developed for them and that they can adopt and use ready-made. (They often do this with theories too, rather than theorising their research for themselves, but that’s another story.) There is no shortage of off-the-shelf methodologies, although some are more amenable to use in this way than others. Grounded theory is probably the prime example, especially once Corbin and Strauss had published the detailed ‘manual’ on how it should (must?) be done (see, Corbin & Strauss, 2008). Within psychology in the UK, the development and rise of Interpretative Phenomenological Analysis (IPA) over the last two decades (see, Smith, 2011) presents another good example, with several papers and a book (Smith, Flowers, & Larkin, 2009) written about how it should be done. However, other methodologies are less prescriptive, and therefore harder to adopt in a way that has been described as methodolatry (Janesick, 1994); narrative, hermeneutic phenomenology and critical ethnography come to mind (although each has given rise to some debate on that topic). For instance, anyone who has tried to work with narrative as a methodology will be aware immediately of how impossible it is to find, let alone rely on, codified practice in that field. But they will also be cognisant of the powerful contributions that can be made through a good narrative analysis (or, in the case of narrative, should that be narrative inquiry? (Chase, 2005)).
My concerns about methodological practices in qualitative research in psychology were first expressed in relation to health psychology (Chamberlain, 2000), and they are not new (see, Koch, 1981). Billig (2002) was critiquing critical discourse analysis on similar grounds a decade ago, arguing that CDA functioned like a legitimising ‘brand’ that limited and confined debate, and Breeze (2011) has argued this in greater depth more recently. Thorne (2011) has also offered pertinent comments recently on the methodological straitjacket that constrains much applied health research. Hammersley (2011) has published a book entitled “Methodology, who needs it?” that deals with these, and a range of related issues, in depth.

My on-going frustration with methodology talk, and the main driver at the time for my QMiP 5-minute challenge presentation, arose primarily out of my activities as a journal editor. In reviewing submissions, I frequently come across methodological statements within method sections making claims to be using a particular methodology, but never really engaging with the underlying epistemological assumptions or the theoretical thinking behind the methodology, and never connecting this fully into the research methods and analytic work presented. So statements like the following appear frequently:

“According to Smith (1996) Interpretative Phenomenological Analysis (IPA) is concerned with trying to understand how people make sense of their experiences. Since this project seeks to understand the experience of xxx, IPA is a suitable methodology.”

“Thematic analysis was selected as the most appropriate method for the analysis of the interview transcripts. Thematic analysis is a method for identifying, describing, analyzing and reporting themes and patterns within data (Braun and Clarke 2006).”

However, without further comment and discussion, such statements are essentially tautological, and empty of real content and of the necessary connection to the specific piece of research in view. They are, as Hammersley (2011, p. 188) has argued, simply supporting “… the division of research into paradigms or approaches whose assumptions are protected from discussion; they are either asserted or rejected …” Here, statements like these are merely assertions that mask debate and preclude the need for any critical discussion as to how these methodological approaches are specifically relevant for the research in hand (see also, Chamberlain, 2011).

How did we end up like this? I think there are several interlocking reasons and processes here. One is that qualitative research is often resisted and marginalised within academia because the intellectual contributions of qualitative research to knowledge are not well understood by non-qualitative researchers (see, Thorne, 2011). The prominence of positivist approaches to research in psychology works to sustain ‘orthodox’ beliefs about the nature of psychology and the dominance of particular research paradigms (Danziger, 1990; Dean, 2004); the “hazards of orthodoxy”, as Stoppard (2002) has labelled it, can be quite difficult to overcome. Also, although the British Psychological Society has included qualitative methods in the curriculum, many psychology students receive only limited training in this arena.
This is accentuated by a lack of expertise in qualitative research amongst many involved in the training of psychologists (see Madill, Gough, Lawton, & Stratton, 2005; Tashakkori & Teddlie, 2003), and by tensions around the introduction of qualitative courses into the curriculum (see O’Neill, 2002; Stoppard, 2002). Ironically, much of the demand for learning about qualitative research comes from psychology students wanting to branch out beyond orthodox psychological research (Mitchell, Friesen, Friesen, & Rose, 2007), often influenced by methods used in other disciplines. However, there are signs that this situation is improving and training in qualitative research is developing (Harper, 2012; Rennie, Watson, & Monteiro, 2002). Hopefully, that will not lead to the ‘normalised’ adoption of specific methodologies, to standardisation on selected methodologies, and to the consequent restriction of research practice that this can impose. Unfortunately, there are some signs that this may be the case. Thompson, Smith, and Larkin (2011) surveyed UK trainers in clinical psychology, finding that about forty percent of dissertations presented in the previous year used qualitative methods, but that, of these, half used IPA. Obviously, students seek a degree of certainty in moving into a new, and often unfamiliar, arena of research, and sticking with established and codified methodologies is one easy way to achieve that. However, the adoption of methodologies off-the-shelf undermines key practices for achieving high-quality qualitative research; it limits researchers’ engagement in criticality, reflexivity and creativity.

Related to this is another concern of mine, about method. This is the over-emphasis in qualitative psychology on the interview as the method. The interview is the taken-for-granted data collection method of choice, and like off-the-shelf methodology, its use is assumed to require no consideration or justification. In qualitative psychology the interview is used almost to the exclusion of other methods and ways of working that may have equal or better value for a given project. The ‘essential’ nature of the interview may be because so much qualitative psychology is focussed on experience, and it seems obvious that the only way we can get at other people’s experience is by having them talk about it. This is true, but there are many ways to invoke talk, and the use of a one-off (semi-structured or unstructured) interview with participants may be seriously limiting the scope, depth and potential of our research. With my research students, I have initiated a campaign that seeks to go beyond the use of a single interview as a research method (unless they can justify it, which is the case in some circumstances). This campaign is an attempt to hasten the death of the ‘drive-by’ interview – that form of research practice which unthinkingly views participants as nothing more than data sources, and the single interview as a sufficient means of obtaining data for analysis. Atkinson and Delamont (2006, p. 164) describe this problem as the “general culture of the ‘interview society’”, and critique its use beyond psychology. I raise this because the concerns here for method are essentially the same as those raised earlier for methodology; unthinking and uncritical adoption of either is bad research practice.
So I push students away from methodologies and into methods – to find the best methods to gather data for the research questions they want to answer, and often to use more than one (qualitative) method simultaneously within a project. At its simplest level, the use of more than one interview with each participant has considerable value, deepening rapport, expanding the scope and extending the depth of data collected, and allowing reflection by both researcher and participant (see Flowers, 2008). Other methods do other things, as when the use of photo-production turns the participant into a researcher of his or her own life (see, Hodgetts, Chamberlain & Radley, 2007) or the discussion of material objects brings new and more detailed information to the fore (see, Sheridan & Chamberlain, 2011).

An article we recently published illustrates the value and contribution of such methods (Chamberlain, Cain, Sheridan, & Dupuis, 2011). It focuses on the multiple methods used in two different doctoral research projects. Neither of these projects started with the identification of “a methodology” or a methodological approach, although one is broadly informed by discourse and the other by narrative. Rather, both projects began by engaging with their topics theoretically and developing specific methods-in-context that would provide relevant, in-depth data amenable to appropriate analytic interpretation and capable of offering insights into the phenomena under investigation. This was not simple or easy to accomplish, and it required considerable critical and reflexive work on the part of the researchers. This is a good way to work. Being free of the constraints of a specific methodology allows space to consider a wide range of options for data collection and promotes creative solutions to data collection issues (see Rolfe, 1995). Considering multiple and diverse methods forces a critical consideration of how they will add to the project, work together, and combine into a unified project that achieves its goals. However, I do not want to suggest that creativity and critical reflexive considerations only arise in complex projects using multiple methods; they are essential processes for the more common single-interview study of experience, or any other project, big or small, simple or complex.

So how do you get a methodology? I stated earlier that if you work as I have outlined above, thinking theoretically and choosing methods thoughtfully for data collection, you will inevitably end up with a methodology. This is because a methodology is, as Crotty (1998, p. 3) writes, “the strategy, plan of action, process or design lying behind the choice and use of particular methods and linking the choice and use of methods to the desired outcomes.” Consequently, by starting from methods, and provided these are thoughtfully connected, you will have a logic to your research that is the methodology. Importantly, this is, as it should be, your methodology for your research. If you have been working creatively, critically and reflexively, then you will have carefully considered and justified your research practice, thought about alternatives and why you have rejected them, and integrated the chosen method or methods into your project, and you will be able to explain and defend your research processes.

We should note that there is a lot written about methodology in general, and certainly a lot about specific methodologies, such as grounded theory, discourse and phenomenology.
The arguments above do not mean that you should not read this literature and be informed by it – merely that you should never simply adopt a methodology off-the-shelf. Reading about a methodology (say, critical discourse) should inform the way you undertake your research, by establishing the assumptions it rests on, by setting and clarifying boundaries, by differentiating this approach from other forms of research with different aims (say, to examine thematic content, to understand experience, and so on), and by enhancing theoretical thinking about the issues under investigation. Every piece of research is unique, in what it seeks to do and how it seeks to do it. So methodological ideas and concepts, like theoretical ideas and concepts, are there to stimulate, to be drawn on and utilised, to be adapted in context; they are not there to be followed slavishly. The use of methodology should never become methodolatry. This means that you need to provide an argument for how you use them, in your own voice. Simply stating the tautological mantras quoted earlier, with references to the gurus in the field, is not a suitable, sufficient, or satisfactory statement that compels any reader of your work to accept that you have carefully thought through what you want to do, and why it is the best way to go.

References

Atkinson, P., & Delamont, S. (2006). Rescuing narrative from qualitative research. Narrative Inquiry, 16, 164-172.
Billig, M. (2002). Critical discourse analysis and the rhetoric of critique. In G. Weiss & R. Wodak (Eds.), Critical discourse analysis: Theory and interdisciplinarity (pp. 35-46). London: Palgrave Macmillan.
Breeze, R. (2011). Critical discourse analysis and its critics. Pragmatics, 21, 493-525.
Chamberlain, K. (2000). Methodolatry in qualitative health research. Journal of Health Psychology, 5, 289-296.
Chamberlain, K. (2011). Troubling methodology. Health Psychology Review, 5, 48-54.
Chamberlain, K., Cain, T., Sheridan, J., & Dupuis, A. (2011). Pluralisms in qualitative research: From multiple methods to integrated methods. Qualitative Research in Psychology, 8, 151-169.
Chase, S. E. (2005). Narrative inquiry: Multiple lenses, approaches, voices. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 651-679). Thousand Oaks, CA: Sage.
Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: Sage.
Crotty, M. (1998). The foundations of social research. St Leonards: Allen & Unwin.
Danziger, K. (1990). Constructing the subject: Historical origins of psychological research. New York: Cambridge University Press.
Dean, K. (2004). The role of methods in maintaining orthodox beliefs in health research. Social Science & Medicine, 58, 675-685.
Flowers, P. (2008). Temporal tales: The use of multiple interviews with the same participant. QMiP Newsletter, 5, 24-27.
Hammersley, M. (2011). Methodology, who needs it? London: Sage.
Harper, D. J. (2012). Surveying qualitative research teaching on British clinical psychology training programmes 1992-2006: A changing relationship? Qualitative Research in Psychology, 9, 5-12.
Hodgetts, D., Chamberlain, K., & Radley, A. (2007). Considering photographs never taken during photo-production projects. Qualitative Research in Psychology, 4, 263-280.
Janesick, V. (1994). The dance of qualitative research design: Metaphor, methodolatry, and meaning. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 209-219). Thousand Oaks, CA: Sage.
Koch, S. (1981). The nature and limits of psychological knowledge: Lessons of a century qua ‘science’. American Psychologist, 36, 257-269.
Madill, A., Gough, B., Lawton, R., & Stratton, P. (2005). How should we supervise qualitative projects? The Psychologist, 18, 616-618.
Mitchell, T., Friesen, M., Friesen, D., & Rose, R. (2007). Learning against the grain: Reflections on the challenges and revelations of studying qualitative research methods in an undergraduate psychology course. Qualitative Research in Psychology, 4, 227-240.
O’Neill, P. (2002). Tectonic change: The qualitative paradigm in psychology. Canadian Psychology, 43, 190-194.
Rennie, D. L., Watson, K. D., & Monteiro, A. (2002). The rise of qualitative research in psychology. Canadian Psychology, 43, 179-189.
Rolfe, G. (1995). Playing at research: Methodological pluralism and the creative researcher. Journal of Psychiatric and Mental Health Nursing, 2, 105-109.
Sheridan, J., & Chamberlain, K. (2011). The power of things. Qualitative Research in Psychology, 8, 315-332.
Smith, J. A. (2011). Evaluating the contribution of interpretative phenomenological analysis. Health Psychology Review, 5, 9-27.
Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative phenomenological analysis: Theory, method and research. London: Sage.
Stoppard, J. (2002). Navigating the hazards of orthodoxy: Introducing a graduate course on qualitative methods into the psychology curriculum. Canadian Psychology, 43, 143-153.
Tashakkori, A., & Teddlie, C. (2003). Issues and dilemmas in teaching research methods courses in social and behavioural sciences: US perspective. International Journal of Social Research Methodology, 6, 61-77.
Thompson, A., Smith, J. A., & Larkin, M. (2011). Interpretative phenomenological analysis and clinical psychology training: Results from a survey of the Group of Trainers in Clinical Psychology. Clinical Psychology Forum, 222, 15-19.
Thorne, S. (2011). Toward methodological emancipation in applied health research. Qualitative Health Research, 21, 443-453.