Refining Analysis


This post is part of a series about simplifying and clarifying the analysis system. As I’ve moved forward, more clarity has come to light.

How to actually define units of meaning (e.g. Reed, 2006; Marton & Pong, 2005) for the intended outcome spaces (remember, there are five: a main one and four subsidiaries, if you like) is the major challenge. Defining what a unit of meaning is, i.e. the difference between any old utterance and an utterance of worth, is the crux of the whole thing. It is a fine balance between bringing your understanding and expertise to bear, and in so doing becoming an instrument of the research, and bracketing the presuppositions and biases you may have about what is worthy and what is not. Making decisions about the nature of the phenomenon of inquiry is therefore vital, to delimit the research focus while retaining as objective an outlook as possible. Even if it is impossible to be completely unbiased (clue: it is), you can at least make bias transparent, acknowledge its possible impact, and discuss that.

After that you have the challenge of deciding how to judge whether a dimension of variation (a unit of meaning that is noticeably similar in nature between transcripts, but exhibits variation) is promoted to being an actual category of variation, or just remains a dimension.

I had a variety of sources for guidance: Edwards, 2005 (levels of presence); Cope, 2002, 2004 (structure of awareness tables for dimensions of variation factors and inclusivity relationships); and Taylor & Cope, 2007 (really good tables showing levels of understanding and multiple DoV); plus, later, Kaapu & Tiainen, 2010, for an alternative way of looking at ‘experience’ outcome spaces, and Roisko, 2007, for overall ideas on how to manage analysis stages and then how to write about them. All added some understanding, but it was still difficult to know how to really ‘do it’. Sjöström and Dahlgren (2002), with their frequency, position and pregnancy guidance, were hugely useful. Still I pondered. In the past few weeks things have become slightly clearer, though I’m still full of doubt. I have taken the following approach, and the analysis system being applied to the transcript data is as follows:

  • First, open code all transcript data as much as possible. This permits the data to be managed more easily, without yet committing to any analysis.
  • Then, sort open codes into parent aggregate groups, with the most populous aggregate groups denoting early conceptual ‘primary dimensions of variation’ that might go on to form categories of experience variation at the collective level (simple ‘frequency’ of code references).
  • The primary dimensions of variation that exhibited sufficient variation of meaning within themselves to form categories of experience variation in the intended outcome spaces were then selected: first for the activity as a whole, the smart learning journey, then for the four system elements of a smart learning journey, all using the same approach but with different primary dimension of variation groupings. This was perhaps the most challenging stage of the analysis, as it required decisions about what was most relevant as meaning for the purpose of the desired outcome space.
  • ‘Secondary dimensions of variation’ are included within primary groupings where utterances are related in some way. These will be noted in relevant tables.
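None of this sorting is automated in the methodology itself, but the ‘frequency of code references’ step is simple enough to sketch. The following Python fragment is purely illustrative (the groups, codes and counts are invented, not from the actual transcript data) and shows how the most populous parent aggregate groups surface as candidate primary dimensions of variation:

```python
from collections import Counter

# Hypothetical open-code references as (parent aggregate group, open code).
# In practice these would come from the coding software's export.
code_refs = [
    ("Obligation", "doing the tasks"),
    ("Obligation", "what we had to do"),
    ("Social", "helping each other"),
    ("Obligation", "own coursework"),
    ("Being there", "being in the place"),
    ("Social", "working together"),
    ("Obligation", "requirements"),
]

# Simple 'frequency' of code references per parent aggregate group.
group_counts = Counter(group for group, _ in code_refs)

# The most populous groups are early candidates for primary DoV.
for group, count in group_counts.most_common():
    print(group, count)
```

The tally only suggests candidates; whether a group becomes a category of variation still depends on the qualitative judgements described above.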

Sjöström & Dahlgren’s (2002) frequency, position and pregnancy methods were used to reflect on all utterances: on an individual utterance in itself, then in the context of the whole individual transcript, then across the collective of transcripts. This repeated process of expanding out and then focusing in helped me reflect on the transcripts in a systematic way, developing clarity of purpose.
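Of these three, frequency and position can at least be crudely operationalised; pregnancy (how pregnant with meaning an utterance is) remains a judgement made by re-reading in context. As a rough sketch, again with invented data and hypothetical helper names:

```python
# Hypothetical coded utterances: (transcript id, relative position in the
# transcript from 0.0 (start) to 1.0 (end), dimension of variation).
utterances = [
    ("T1", 0.10, "Being there"),
    ("T1", 0.80, "Obligation"),
    ("T2", 0.05, "Being there"),
    ("T3", 0.50, "Being there"),
]

def frequency(dov):
    """How often a dimension appears across the collective of transcripts."""
    return sum(1 for _, _, d in utterances if d == dov)

def mean_position(dov):
    """Average relative position; earlier mentions may carry more weight."""
    positions = [p for _, p, d in utterances if d == dov]
    return sum(positions) / len(positions)

# 'Pregnancy' cannot be computed: it is the researcher's judgement of how
# much meaning an utterance carries, formed by repeated re-reading.
print(frequency("Being there"), round(mean_position("Being there"), 2))
```

Such numbers would only ever inform the expanding-out/focusing-in reflection, not replace it.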

Another important aspect is reflecting on what is referential (the meaning) and what is structural (the how, the ‘practical’ aspects) of an utterance. For example, I reflected a lot on whether ‘Discovery’ was a category of variation, as it was coded often, in a variety of similar interpretations (finding out, discovering, wayfinding, exploring, treasure hunting…). But many of these utterances were structural, attached to something else that to me indicated more actual meaning: e.g. ‘it was fun discovering…’, or ‘I enjoyed finding out’. While discovering is part of this meaning, it is not (I would argue) a meaning in itself. The meaning is in the enjoyment of discovering, the fun of finding out. I’m still reflecting on this as I keep re-reading collections of utterances, to potentially reinterpret them.

Not all the transcript utterances are selected for inclusion in formation of experience variation, as this would be impossible, with too many variations and too few instances of each. However, by making decisions on the most relevant utterances that demonstrate the variation of meaning in experience, one can begin to outline an architecture of variation (Marton & Booth, 1997, p. 202) for the intended outcome spaces. This process was iterative, requiring many repeated readings, sorting and interpretation of meaning in the contexts of the outcome spaces.


THE STRUCTURE OF AWARENESS OF A SMART LEARNING JOURNEY: explanation of categories of variation

Each category of description is shown with its primary DoV (indicated by collective frequency/position/pregnancy, denoting its ‘category status’) and its structure of awareness: the referential aspect (meaning, the reasoning focus interpreting the whole, i.e. the theme), the internal horizon (the theme and the ‘near’ thematic field) and the external horizon (the further thematic field into the margin).

DoV 1 Obligation
  • Obligation
  • requirements
  • tasks
REFERENTIAL: Doing the tasks; ‘what we had to do’
INTERNAL: Questions, tasks, obligations, requirements, own assignment or coursework
EXTERNAL: Relevance to own work, grading, ‘being marked’, usefulness, reason to do it, time needed or set aside (available), purpose

DoV 2 Social
  • Discussing
  • helping
  • working together
  • being social
REFERENTIAL: Discussing the tasks, discussing things associated with tasks, discussing other things about the location
INTERNAL: Working together to help each other, discussing the technology, working out ‘who was going to do what’, sharing technology
EXTERNAL: Thinking about collaboration as a help to learning, other social aspects, getting to know each other, other passers-by, fun and enjoyment with friends

DoV 3 Being there
  • Being there
  • being in the place
  • being there at that time
REFERENTIAL: Being ‘in the place’, it ‘being real’, ‘living it’, ‘living in the picture’, walking in their shoes, at that time, in that moment
INTERNAL: Seeing the close context, media and knowledge ‘immediately’ at the place, not wasting time, ‘doing it now’, not being like a book or online, technology mediation for discovery of place, feeling a place
EXTERNAL: Mood and atmosphere of place, weather, light, sounds, wider context of surroundings, knowing the locations on a map (the route), being like a tourist, taking notice of surroundings, inspiration, imagination, visiting/exploring other locations for learning and/or inspiration

DoV 4 Knowledge
  • Knowledge, place for own sake
  • Experience as gaining benefit
  • Knowledge and Place as Value
REFERENTIAL: Personal research, motivation, own experience of the journey, the journey being of benefit, the journey as value for learning
INTERNAL: Personal reasoning, imagination, creativity, curiosity, own interest in topic(s), inspiration, learning something new
EXTERNAL: Potential use or purpose, preparedness, prior or post research, additional knowledge construction or discovery, visiting/exploring other locations for learning and/or inspiration

DoV 5 Novelty (seen in Maltese edu cohort PG1, PG2)
  • novel
  • new
  • different
REFERENTIAL: The newness of the experience, novelty, the novelty of the technology, a new way to learn, another world
INTERNAL: Being different than classroom learning, being better than ‘sitting down being quiet listening to the lecturer’, being far more exciting
EXTERNAL: ‘Experiential learning’, not a classroom setting, never having been on a smart learning journey, learning through new ways, never having used the technology before

Further notes:
  • A Primary Dimension of Variation can indicate a category, if it shows relational and inclusivity qualities in terms of how it appears (for similarities and differences) within utterances.
  • A Primary Dimension of Variation is a primary indicator of the most significant meaning(s) of an utterance, with which other primary and (perhaps) secondary DoV have further relationships. This is shown by placing that DoV first, as the primary DoV. (NB In the outcome space categories shown here there are no quotes. But in tables with quotes, all the related primary DoV are shown for each quote (after Cope, 2004), demonstrating relational inclusivity, though not always hierarchical in the sense of any level of presence or complexity.)
  • Primary DoV5 ‘Novelty’ shown here may not be included in the final Outcome Space for the activity as a whole, as it may be reinterpreted as a secondary DoV, not a primary.
  • Secondary Dimensions of Variation are additional dimensions that further articulate the referential and structural aspects, to describe the structure of awareness and shed light on the pedagogical relevance structure in the context of the utterances.
  • Secondary Dimensions of Variation can be created and assigned from most populous related other DoV seen in the open code aggregate parent groupings that have relevance to the primary DoV (after Taylor & Cope, 2007, Table 2).

* * *

One of the reasons I’m writing posts about devising the analysis is that phenomenography places a great deal of importance on the communicability of analysis systems and approaches taken. As yet I’m still pretty poor at simplifying and communicating this system. Still lots of holes. But I feel I have progressed a good bit, and am looking forward to seeing how it looks in a month or two.

References

  • Cope, C. J. (2002). Educationally critical aspects of the concept of an information system. Informing Science Journal, 5(2), 67–78.
  • Cope, C. (2004). Ensuring Validity and Reliability in Phenomenographic Research Using the Analytical Framework of a Structure of Awareness. Qualitative Research Journal, Vol. 4, No. 2, 2004: 5-18. Retrieved from http://search.informit.com.au/documentSummary;dn=133094720910488;res=IELHSS
  • Edwards, S. (2005). Panning for Gold: Influencing the experience of web-based information searching. (Doctoral Dissertation). Retrieved from https://eprints.qut.edu.au/16168/
  • Kaapu, T., & Tiainen, T. (2010). User Experience: Consumer Understandings of Virtual Product Prototypes. In Kautz, K., & Nielsen, P. A. (Eds.), Scandinavian Information Systems Research. First Scandinavian Conference on Information Systems, SCIS 2010, Rebild, Denmark, August 20-22, 2010, Proceedings, pp. 18-33.
  • Marton, F., & Booth, S. (1997). Learning and Awareness. Mahwah, NJ: Lawrence Erlbaum Associates
  • Marton, F., & Pong, W.P. (2005). On the unit of description in phenomenography. Higher Education Research & Development Vol. 24, No. 4, November 2005, pp. 335–348
  • Reed, B. (2006). Phenomenography as a way to research the understanding by students of technical concepts. Núcleo de Pesquisa em Tecnologia da Arquitetura e Urbanismo (NUTAU): Technological Innovation and Sustainability. Sao Paulo, Brazil, 1-11.
  • Roisko, H. (2007). Adult Learners’ Learning in a University Setting: A Phenomenographic Study. (Doctoral Dissertation). University of Tampere Department of Education, Finland. Retrieved from http://tampub.uta.fi/bitstream/handle/10024/67717/978-951-44-6928-2.pdf;sequence=1. Last accessed December 2018.
  • Sjöström, B. & Dahlgren L.O. (2002). Nursing theory and concept development or analysis. Applying phenomenography in nursing research. Journal of Advanced Nursing 40 (3), 339-345.