
HCI Education to Support Collaborative e-Learning Systems Design

By Niki Lambropoulos, Fintan Culwin, Margarida Romero / September 2010


Within a single generation, computers and the Web have migrated from business, industry, and commerce to permeate many aspects of everyday life. They have become so embedded that changes to the technology result in changes to society. The transition from Web 1.0 to Web 2.0 has changed users from passive recipients to active content creators. Web 2.0 is not only a technological tool but also a social phenomenon that enables collaboration and creativity (Berners-Lee, 2007). These changes present challenging design issues for Web-based education, including how, where, and when learning happens, and how to participate effectively in producing co-created digital content (Harper et al., 2008). Anytime, anywhere e-learning requires tools to support e-learners in the management of their learning objectives and activities.

Collaborative e-learning (CeL) requires tools that support participants as both learners and users (Smulders, 2002). As both users and learners, e-learners interact in a complex space which includes the tools, their peers and their e-tutors. Dillenbourg and colleagues suggested that there is a need to explore this interactional space in order to better understand e-learning (Dillenbourg et al., 2009). Contextual information from the interactions can improve e-learners' awareness and their organisational and learning activities. Therefore, tools need to support e-learners by exposing the contextual information within the interactional space.

Human-computer interaction education (HCI-Ed) can provide a CeL roadmap for the design of e-learning tools. HCI-Ed requires an initial conceptual understanding of the learning context, targets and values before the design, evaluation and implementation of systems and tools from a user/learner-centred perspective. This is complemented by the study of the major phenomena surrounding the impact of these processes. Based on this understanding, we have developed a seven-stage process to support HCI-Ed, emphasising the initial conceptual stage.

HCI-Ed for Dynamic eLearning Systems
The seven stages in the HCI-Ed methodology are:

  1. identify and understand the CeL context
  2. design
  3. evaluate
  4. build
  5. evaluate
  6. redesign and redevelop
  7. study and research of the phenomena surrounding human-computer and human-human interactions.

Stage I: Identify and Understand the CeL Context
The first HCI-Ed stage is the understanding of the context, key concepts, learning objectives, problems and tasks, prior to their translation into requirements. This understanding of the interactional context is obtained by identifying the issues within it and the key factors at play within the associated learning values. The e-learners who will use the system will interact as learners and as end-users. Consequently, their involvement in design is essential to identify key problems and provide insights about the interface, the context and the learning needs.

This initial stage seeks a direct fit between the educational tasks, the learning values and the methods chosen to pursue them. In order to identify the context, targets and key problems, diverse methodologies are needed, including literature reviews, questionnaires, algorithms, focus groups and ethnographic approaches.

Parker's (2003) considerations of e-learning's dynamic and heterogeneous nature emphasise observing and analysing the shared acts and the conditions that restrict or facilitate interactions. The observation and analysis of any discontinuities within the shared e-learning tasks can shed light on requirements for their evolution and convergence (Dix, 2003). Discontinuity in interactions can be evident in mutual mistakes that interrupt otherwise naturally occurring activities.

This implies that interventions by e-tutors in the ways e-learners participate would be beneficial, which in turn implies the acquisition of new skills by both the tutor and the learner (Lave and Wenger, 1991). For the learner this involves becoming self- and group-aware of the patterns of interaction and being led from this knowledge towards reflection upon their understanding of the shared meaning that is being constructed. The same is required of the tutor, but with more of a strategic overview of the processes and patterns.

The outputs of this stage will be an understanding of, and statement of, the requirements that an educational tool is attempting to satisfy. For CeL tools these requirements would be related to facilitating interactions that lead to the construction of shared meaning, awareness of the individual learners' position within the social space, and awareness of the learner's own cognitive development.

Stage II: Design
The second HCI-Ed stage is iterative design based on the processes and outputs from the previous stage. The design needs to support sequences of interactions based on identified examples of knowledge co-construction and is expressed in the form of organisational and learning patterns.

Designs for control over knowledge convergence can be expressed as paths through the learning space. For example, the length and complexity of an e-learning path can indicate typical and atypical behaviours to a tutor. Interventions by the tutor can be triggered by the emergence of atypical behaviour. An example of atypical behaviour might be a new user lurking in a discussion group: the learning path would indicate reading but not posting, and the intervention might be a suggestion that it is time for them to make a contribution.
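
To make the lurking example concrete, the sketch below shows one plausible way a CeL tool could flag such a path. It is only an illustration: the event-log format (learner, action, timestamp), the action names and the threshold are assumptions, not features of any particular LMS.

    from collections import Counter
    from datetime import datetime, timedelta

    def find_lurkers(events, min_reads=5):
        """Return learners who have read at least `min_reads` posts but never posted,
        so that a tutor can suggest it is time to contribute."""
        reads, posts = Counter(), Counter()
        for learner, action, _timestamp in events:
            if action == "read_post":
                reads[learner] += 1
            elif action == "create_post":
                posts[learner] += 1
        return [learner for learner, n in reads.items()
                if n >= min_reads and posts[learner] == 0]

    # Toy log: 'anna' reads and posts, 'ben' only reads.
    log = [("anna", "read_post", datetime(2010, 9, 1)),
           ("anna", "create_post", datetime(2010, 9, 2))]
    log += [("ben", "read_post", datetime(2010, 9, 1) + timedelta(hours=h))
            for h in range(6)]

    print(find_lurkers(log))  # -> ['ben']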

Visual representations of interactions can have an impact on individual and group awareness and on reflection upon them. For example, online paths can allow the e-learner to infer an interaction and time pattern which can be used to optimise their e-learning activities. These capabilities are absent in current learning management systems (LMS), where e-learners' awareness of group interaction is based on mere perceptions of posts in discussion forums.
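
As a rough illustration of the kind of interaction and time pattern such a path could expose, the following sketch summarises one learner's logged interactions per day as a simple textual timeline. The log format is again an assumption; a real CeL tool would present this graphically alongside the group's activity.

    from collections import defaultdict
    from datetime import datetime, timedelta

    def daily_pattern(events, learner):
        """Print a simple day-by-day activity histogram for one learner."""
        per_day = defaultdict(int)
        for event_learner, timestamp in events:
            if event_learner == learner:
                per_day[timestamp.date()] += 1
        for day in sorted(per_day):
            print(f"{day}  {'#' * per_day[day]}  ({per_day[day]} interactions)")

    # Toy log of (learner, timestamp) interaction events spread over a few days.
    toy_log = [("dina", datetime(2010, 9, 1, 10) + timedelta(hours=6 * i))
               for i in range(10)]
    daily_pattern(toy_log, "dina")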

The outputs of this stage will be designs, usually expressed as initial sketches and/or mock-up prototypes. This stage can be thought of as interlinking the pedagogical requirements identified in the first stage and the design solutions chosen to implement them. However, before committing the resources needed to realise them they are first evaluated.

Stage III: Evaluation
The third HCI-Ed stage is the evaluation of the design's quality, attempting to ensure that the tools are fit for the purpose established in the first stage and also satisfy general usability and engineering heuristics. The evaluation can be performed informally using domain experts, in this case experienced e-tutors, and end-users, in this case prospective e-learners. Further evaluation could be performed by non-domain users/learners. For example, the technique of cut-price usability engineering (Riva, 2002), also known as guerrilla or discount usability (Nielsen, 1994), has been shown to find almost as many faults as testing with real users, at a fraction of the cost. The output of this stage will be a validated design which can be used to produce the initial builds.

Stage IV: Build
The fourth HCI-Ed stage involves building functional or semi-functional prototypes in order to acquire feedback on the design from actual usage. Exploratory prototypes are useful to allow different groups of users/learners to comment upon the designs. Such prototypes might be thrown away or might evolve towards a final version. In a sense this evolutionary approach to the production of a release version is a continuation of the previous stage. The output of this stage will be initial builds which can be used for more extensive, and more formal, evaluation.

Stage V: Evaluation
The fifth HCI-Ed stage is the live evaluation of the prototype tools. It differs from the third stage not only in that it evaluates a product rather than a design, but also in being more formal. The evaluation addresses pedagogical utility, usability and acceptability criteria. Pedagogical utility is the degree to which the functionality of the system allows learners to reach their learning goals. Pedagogical usability is the degree to which the tools, contents, interfaces and tasks are easy to use. Pedagogical acceptability is the degree to which the system is compatible with learners' motivation, affects, culture, values and experiences from using similar systems.

This process is facilitated by use of the pedagogical ergonomics questionnaire, which has been produced by the authors as a part of this approach. Measuring the degree to which the system solves the identified key problems will confirm its overall utility. The output of this stage will be a list of the best features of, deficiencies in, and proposed enhancements for the initial builds.
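
The questionnaire itself is not reproduced here, but the sketch below illustrates, under assumed item names and a 1-5 rating scale, how responses might be aggregated into the three criteria described above. The groupings are illustrative only and do not reproduce the authors' actual instrument.

    from statistics import mean

    # Assumed mapping of questionnaire items to the three pedagogical criteria.
    CRITERIA = {
        "utility":       ["reached_learning_goal", "tasks_supported_goal"],
        "usability":     ["tools_easy_to_use", "content_easy_to_navigate"],
        "acceptability": ["fits_my_motivation", "consistent_with_prior_tools"],
    }

    def score_responses(responses):
        """responses: {item_name: rating on a 1-5 scale}. Returns a mean score per criterion."""
        return {criterion: round(mean(responses[item] for item in items), 2)
                for criterion, items in CRITERIA.items()}

    example = {"reached_learning_goal": 4, "tasks_supported_goal": 5,
               "tools_easy_to_use": 3, "content_easy_to_navigate": 4,
               "fits_my_motivation": 5, "consistent_with_prior_tools": 4}
    print(score_responses(example))
    # -> {'utility': 4.5, 'usability': 3.5, 'acceptability': 4.5}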

Stage VI: Redesign and Redevelop
The sixth HCI-Ed stage commences with the re-design and re-building of the functional prototypes in order to incorporate the lessons learned from the previous stage. This then produces an initial release version of the tools which can be used by real end users/learners.

The engineering philosophy of HCI-Ed is agile and iterative. The cycle of redesign, design evaluation, rebuilding and live evaluation should continue throughout the life of the tool. Agile methodologies advocate a large number of releases, each of which contains a small functional refinement. The cycles of design, build and evaluation in this stage differ from the initial cycles in that the tools are in situ and the evaluation is performed by real end users/learners. The outputs of this stage are the successive versions of the tools and the evaluations of them.

Stage VII: Study and Research
The final HCI-Ed stage is the study and research of the associated phenomena surrounding human-computer and human-human interactions situated within CeL. A multi-disciplinary approach is essential to effective HCI-Ed. Studies of the tool's use and its wider impact will provide structured and unstructured data in numeric and textual formats. Statistical analysis can work on numeric data from questionnaires, usage and user logs, as well as on quantified qualitative data. Qualitative analysis can work on textual data, open questions, focus groups and interviews. Alternative methodologies such as social network analysis may be used to provide several perspectives on the e-learning phenomena.
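
As one example of the social network analysis mentioned above, the sketch below builds a reply graph from forum interactions and computes degree centrality, so that highly connected and peripheral learners stand out. The reply data and the use of the networkx library are assumptions about how such an analysis might be set up, not part of the methodology itself.

    import networkx as nx

    # Toy reply data: (author_of_reply, author_replied_to) pairs from a forum thread.
    replies = [("anna", "ben"), ("carla", "ben"),
               ("ben", "anna"), ("carla", "anna")]

    G = nx.DiGraph()
    G.add_edges_from(replies)

    # Degree centrality highlights the learners most involved in the exchange;
    # peripheral or isolated learners may warrant the e-tutor's attention.
    for learner, centrality in sorted(nx.degree_centrality(G).items(),
                                      key=lambda item: -item[1]):
        print(f"{learner}: {centrality:.2f}")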

Observation and analysis of the space between the shared acts can be facilitated by involving different groups with different levels of expertise. Developers, e-tutors and e-learners (including novices, average users and experts) are all required in participatory design, evaluation and assessment. This also implies that satisfying one group's requirements, for example the e-tutors', may have implications for the other groups. Thus, this stage may require revisiting the original design requirements to assess the direct fit between the educational tasks, the learning values and the methods chosen to pursue them. In this way, the study and research of the phenomena surrounding human-computer e-learning interactions sheds light on e-learners' behaviours rather than merely describing them.

Conclusions
In Virginia Woolf's Between the Acts, the meanings of the dialogues and actions are ambiguous and can only be resolved within the immediate and wider contexts in which they are contained. Likewise, it is only by understanding 1) the task, 2) the learner, and 3) the full contexts within which learning takes place that systems and tools can be designed and deployed to aid these processes. Hence the first and last of the seven HCI-Ed stages are more than bookends: they are essential to situating the products within their context. The middle stages are very akin to those proposed by agile software engineering methodologies. Although they are presented as discrete, they are inherently overlapping, iterative and non-linear.

The effects of Web 2.0 technologies have changed online learners from passive recipients of information to active collaborators in the search for shared meaning. Hence the nature of the context within which tools to support learning have to operate has itself changed from a less active to a more active landscape. It is anticipated that the changes that produced Web 2.0 will continue, and hence the landscapes within which the contexts evolve will continue to change as well. Furthermore, just as informatics technologies have changed society as a whole, their use in education will change the essential nature of pedagogy. Within this scenario a methodology such as HCI-Ed, which recognises change as central and inevitable, is well suited to the production of succeeding generations of CeL support tools.


About the Authors
Niki Lambropoulos and Fintan Culwin are at the Centre for Interactive Systems Engineering, London South Bank University, London, U.K.

Margarida Romero is at the Université de Toulouse Le Mirail, Toulouse, France.


Acknowledgments
The work for this paper is related to the Euro-CAT-CSCL FP7 Marie Curie Industry-Academia Partnerships and Pathways (IAPP) project funded by the European Union. The Euro-CAT-CSCL consortium is composed of Ouak.net, London South Bank University, Universitat Autònoma de Barcelona, Universitat Ramon Llull and Université de Toulouse.


References
Berners-Lee, T. (2007). Berners-Lee warns of changes ahead. Computing Magazine. Retrieved 17/06/2007, from http://www.computing.co.uk/computing/analysis/2186086/berners-lee-warns-changes-ahead.

Dillenbourg, P., Järvelä, S., & Fischer, F. (2009). The evolution of research on computer-supported collaborative learning: From design to orchestration. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-Enhanced Learning: Principles and Products (pp. 3-19). Springer.

Dix, A., Ramduny-Ellis, D., & Wilkinson, J. (2003). Trigger analysis: Understanding broken tasks. In D. Diaper & N. Stanton (Eds.), The Handbook of Task Analysis for Human-Computer Interaction. Lawrence Erlbaum Associates.

Harper, R., Rodden, T., Rogers, Y., & Sellen, A. (2008). Being Human: Human-Computer Interaction in 2020. Microsoft Research, Cambridge, U.K.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Nielsen, J. (1994). Guerrilla HCI: Using Discount Usability Engineering to Penetrate the Intimidation Barrier. Retrieved 15/04/2010 from http://www.useit.com/papers/guerrilla_hci.html.

Parker, N. K. (2003). The quality dilemma in online education. In T. Anderson & F. Elloumi (Eds.), Theory and Practice of Online Learning (pp. 385-421).

Riva, G. (2002). Web usability revisited: A situated approach. PsychNology Journal, 1(1), 18-27.

Smulders, D. (2002). Designing for learners, designing for users. eLearn Magazine. Retrieved 05/12/2005, from http://elearnmag.org/subpage.cfm?section=best_practices&article=11-1.



