Monday, September 19, 2011

Methodology

We had a meeting last week for our project. Our wiki page was becoming too cluttered and we needed some firm decisions and direction so we could move ahead. It was great to have a face-to-face meeting, as the discussion board was not really working well either - the organisation of the information coming through was confusing. Some good ideas and questions to add to our wiki were discussed, and Sue was to set up a second wiki so as not to lose the initial information, but to give us something clearer to follow. Anyway, a further meeting, minus me, led to some direction on what we could each be working on, and I was handed the Methodology.

Having read various literature, it seems the more I read, the more I need to read to get a better understanding. Coming to clear decisions about exactly what we are evaluating and which areas we want to focus on can seem never ending. During an archived Wimba session with Tom Reeves that I was catching up on, Sandra commented on the limitations of an individual setting up the right questions for the evaluation. A good point, as we are not necessarily experts at this, or at analysing the results as effectively as someone with greater experience in the field. I think the answer was that this is probably budget dependent. He also suggested collaborating with people within the institution, particularly those with skills in evaluation. It was great listening to his ideas and experiences as I tried to make notes. I thought it quite astonishing when he relayed the experience of a successful online course that had been delivered for 20 years being dropped when the lecturer retired - the new lecturers found it too difficult and time consuming to continue with or adjust to, and returned to previous didactic teaching methods!

Regarding methodology for the evaluation, a triangulation approach seems practical. The multiple methods evaluation model identifies a set of guidelines, applying an eclectic approach: using different approaches and measures gives a more effective understanding and greater accuracy on the points in question (Reeves & Hedberg, 2003). The methods to be used will include:

- A user review by means of a questionnaire (providing quantitative data from a wider audience)
- A focus group, to clarify the results of the questionnaire (qualitative data)
- An expert review using the heuristic evaluation instrument (qualitative data)

The Heuristic Evaluation Instrument for eLearning Protocols (http://it.coe.uga.edu/~treeves/edit8350/HEIPEP.html) presents sample questions that could be asked across a range of 20 areas. This paper is referred to on p158 of Chapter 7 (link via Tom Bowie's individual wiki). Through our discussions we agreed the heuristic evaluation would be an effective source of data, highlighting what's working and the areas that could be improved.
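As a note to self for when we come to analyse the expert review data, here is a minimal sketch of how ratings across the heuristic areas might be tallied to flag what's working and what needs improvement. This is purely illustrative: the area names, scores, rating scale, and threshold below are placeholder assumptions, not taken from the actual instrument or any real data, and Python is just my choice for the sketch.

from statistics import mean

# Hypothetical ratings: each expert reviewer scores each heuristic
# area on a 1-5 scale (placeholder areas and numbers, not real data).
ratings = {
    "Interactivity": [4, 5, 4],
    "Navigation": [2, 3, 2],
    "Feedback": [5, 4, 4],
}

THRESHOLD = 3.0  # areas averaging below this are flagged for improvement

for area, scores in ratings.items():
    avg = mean(scores)
    status = "needs improvement" if avg < THRESHOLD else "working well"
    print(f"{area}: average {avg:.1f} - {status}")

Alongside the questionnaire averages and the themes from the focus group, a simple summary like this would make it easy to see where the three data sources agree.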