Doctoral Thinking


I’ve been working on the analysis system for some time now. That means I *still* haven’t fully clarified how I will analyse the transcripts, or how I will then interrelate the other data sets (that is, the data that aren’t transcripts). I’m well on the way, but I have a distinct feeling I must streamline: make things simpler, more straightforward, more logical and effective.

Recently two things happened that have helped the mental mulching along, and I have been reflecting on potential new changes as well as on the process by which this kind of iterative streamlining happens.

Analysis for purpose

[Photo of analysis software]
Developing categories of description for the first outcome space: the activity as a whole.

The challenge, of course, starts with the research questions 1, and my task of finding a way to evaluate learning effectiveness in a smart learning journey, so that I can then trace pedagogical considerations relevant to these experiences and develop an understanding of the learning process in these kinds of activities. These can then be reflected on for any implications for learning design, and for the relevance of connectivist principles to smart learning activities. I’m not concerned here with deep discourse on the merits of connectivism as a theory (a rather contentious point). What does concern me are the principles of connectivist *inspired* learning: participation, autonomy, collaboration, interactivity and so forth.

Phenomenography is my preferred way of evaluating the experience of a smart learning journey: it can examine the user experience holistically, and offers enough flexibility to capture all kinds of experience reflections, allowing the learning experience to emerge without too much prompting. I then plan to investigate further using broad distinctions of elements of this whole in a systems thinking approach 2. (I’m also going to analyse learner generated content using various taxonomies, but more on that later.)

Cutting the flab

For a long time I anticipated doing an additional layer of experience analysis, attempting to examine ‘pedagogical aspects of interest’ 3 as a distinct set of analysis; in fact this was a very early idea. However, this is where I am rethinking. At first this layer of analysis seemed essential, the way to find out how and what people might be learning. Now I’m not sure: I think it is unnecessary, and will probably duplicate (some of) what emerges from the other two layers of analysis. Since I added the idea of system element analysis – a later idea that I think is much more useful and flexible, grounded both in the relevant smart learning literature and in my own prior work – these two analysis layers, the whole and then the system element parts of the smart learning journey, are sufficient. They will very likely tell me what I am looking for: how and what people are learning.

Whether I still use the pedagogical aspects as previously noted remains to be decided. Aside from needing some way to clearly discuss any actual learning that may be taking place (and thereby some way to articulate and evaluate it), I also need to contrast this with the digital learner generated content from participants in the journey. I’m using a mixture of Bloom’s and SOLO taxonomy scores, plus perhaps a DiAL-e framework evaluation, to demonstrate more ‘usual’ methods of learning evaluation: to mimic assessment and show how learning might be evaluated without considering the learner experience. This can then be juxtaposed with the experience analysis, to create discussion and highlight possible challenges or even conflicts of evaluation. I think the taxonomy terms will end up simply being ways to analyse this content, in terms of the scores and more generally.

Providing pedagogical context

To (possibly) complicate matters further, I decided at the beginning to carry out a simple literature meta-analysis to establish a baseline of prevalent ‘connectivist principles’ discourse in the literature. This provides context: a way to acknowledge what people are talking about, what their expectations are, and what kind of theoretical backdrop exists in these discussions. While I don’t plan any major analysis of this aspect, I’m now also considering dropping it – after all, will it really tell me anything useful?

So what were the two things?

The first was a chance conversation with a friend, where simply mentioning what I was up to – that is, having to explain it in a few simple sentences – meant I had both to understand and to actually hear out loud what it was I intended to do. Like when you first play a song you have written out loud: you know instantly if it is good, if it works, and what needs changing or removing.

The second was coming across some notes on making your thinking visible in your thesis (Shosh Leshem, March 2016, PDF slides). So this article is about that: about how to explain this streamlining of the analysis systems and methods.



  1. a) How can we measure the effectiveness of smart learning experiences considering both content of learning and process for learning?
    b) Can we formulate a practical pedagogical guide for smart learning activities based on connectivist principles?
    c) How does this pedagogical guide inform the design of smart learning?
  2. Place; Knowledge; Collaboration; Technology
  3. Knowledge Construction; Role & Identity; Digital & Information Literacy; Overall engagement