Connecting Learning Design and Learning Analytics

Guest Editors


• Davinia Hernández-Leo, Universitat Pompeu Fabra Barcelona

• María Jesús Rodríguez-Triana, École Polytechnique Fédérale de Lausanne

• Yishay Mor, independent consultant

• Paul Salvador Inventado, Carnegie Mellon University



Important dates


• Deadline: June 5, 2017 (extended from May 20, 2017)

• Notification to the authors: July 25, 2017 (extended from June 30, 2017)

• Camera-ready paper: August 20, 2017 (extended from July 30, 2017)

• Publication of the special issue: end of September 2017





Learning Design (LD) and Learning Analytics (LA) are both domains of research and action that aim to improve learning effectiveness.


Learning Design or, as some prefer, Design for Learning (Beetham & Sharpe, 2013; Laurillard, 2013; Mor, Craft & Hernández-Leo, 2013), is an emerging field of educational research and practice. Its practitioners are interested in understanding how the intuitive processes undertaken by teachers and trainers can be made visible, shared, exposed to scrutiny, and consequently made more effective and efficient. Mor and Craft (2012) define learning design as “the creative and deliberate act of devising new practices, plans of activity, resources and tools aimed at achieving particular educational aims in a given context”. The emphasis on this activity as both “creative and deliberate” highlights the dual nature of design, and in particular learning design, as both a creative practice and a rigorous inquiry.


Arguably, most of the work in the field of LD has focused on the creative processes, on the practices, tools and representations that support them, and on mechanisms for sharing their outputs among practitioners. Very little has been done in terms of the practices, tools and representations used for evaluating the effects of the designs. Several approaches emphasise top-down quality enhancement, helping designers to base their work on sound pedagogical principles. What is missing is the trajectory that would complete the feedback loop: the built-in evaluation of designs to see whether they achieved the expected outcomes.


Learning Analytics is defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Ferguson, 2012). LA typically employs large datasets to provide real-time or retrospective insights about the effect and effectiveness of various elements and features of learning environments. Learning analytics is rooted in data science, artificial intelligence, and the practices of recommender systems, online marketing and business intelligence. The tools and techniques developed in these domains make it possible to identify trends and patterns, and then benchmark individuals or groups against these trends. LA can help to identify at-risk learners and provide interventions, transform pedagogical approaches, and help students gain insight into their own learning.


How may Learning Design help Learning Analytics? According to situational approaches, one of the prerequisites for obtaining relevant outputs is not to isolate the analysis of educational data from the context in which it is embedded (Crook, 2000). Following this perspective, some authors argue that this tandem between LD and LA offers the opportunity to better understand student behaviour and to provide pedagogical recommendations when deviations from the original pedagogical intention emerge (Looney & Siemens, 2011; Lockyer & Dawson, 2011; Lockyer et al., 2013; Griffiths, 2013; Wise, 2014), addressing one of the challenges posed by LA: interpreting the resulting data against the original pedagogical intent and the local context in order to evaluate the success of a particular learning activity (Sutherland et al., 2012). This approach of linking LD and LA has already been applied to support learning, e.g., using e-portfolios (Reimann et al., 2013) and online simulators (Lejeune & Guéraud, 2012); to support monitoring in CSCL (Rodríguez-Triana et al., 2015); and at different abstraction levels, e.g., connecting the analysis with the accomplishment of the curriculum objectives defined in a course (Gluga et al., 2013). In a recent large-scale study, Rienties et al. (2015) linked the learning designs of 40 large-scale modules with learning behaviour, learning satisfaction and academic retention. In this exploratory study, Rienties et al. (2015) indeed found that learning design decisions made by teachers were related to the learning behaviour of students in blended and online environments.


How may Learning Analytics support Learning Design? Reciprocally, well-formulated learning analytics can help inform teachers about the success and outcomes of their learning designs (Lockyer, Heathcote, & Dawson, 2013; Melero et al., 2015). Learning analytics can provide evidence of the impact of a design in one or several learning situations, covering aspects such as engagement patterns in the activities proposed by the learning design, the learning paths followed by the students, or the time taken to complete the activities. These data can support awareness of and reflection on the effects of learning designs, as well as redesign processes, by facilitating the identification of design elements that need to be revised before reuse (Chacón & Hernández-Leo, 2014).


To sum up, LD offers LA a domain vocabulary representing the elements of a learning system to which analytics can be applied. LA, in turn, offers LD a higher degree of rigour by validating or refuting assumptions about the effects of various designs in diverse contexts. There is a natural and synergistic relationship between the two domains, which has led to growing interest and some initial efforts to bring them together (Mor et al., 2015; Rienties et al., 2015). However, making these links operational and coherent is still an open challenge. Recently, design patterns have been suggested as a construct that can mediate between the two domains (Inventado & Scupelli, 2015).



Beetham, H., & Sharpe, R. (2013). Rethinking pedagogy for a digital age: Designing for 21st century learning. Routledge.

Chacón, J., & Hernández-Leo, D. (2014). Learning design family tree to back reuse and cooperation. In Proceedings of the 9th International Conference on Networked Learning, Edinburgh, UK, pp. 510-517.

Crook, C. (2000). Motivation and the ecology of collaborative learning. In R. Joiner, K. Littleton, D. Faulkner, & D. Miell (Eds.), Rethinking Collaborative Learning, pp. 161-178. Free Association Books: London.

Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5-6), 304-317.

Gluga, R., Kay, K., Lister, R., Charleston, S.M., Harland, J. & Teague, D. (2013). A conceptual model for reflecting on expected learning vs. demonstrated student performance. In Proceedings of the 15th Australasian Computing Education Conference, Melbourne, Australia.

Griffiths, D. (2013) The implications of Analytics for teaching practice in Higher Education. JISC CETIS Analytics Series, 1(10):1-23.

Inventado, P.S. & Scupelli, P. (2015). Towards an open, collaborative repository for online learning system design patterns. eLearning Papers, 42(Design Patterns for Open Online Teaching): 14-27.

Laurillard, D. (2013). Rethinking university teaching: A conversational framework for the effective use of learning technologies. Routledge.

Lejeune, A. & Guéraud, V. (2012). Embedding observation means into the learning scenario: Authoring approach and environment for simulations-based learning. In 12th International Conference on Advanced Learning Technologies, Rome, Italy, pp. 273-275.

Lockyer, L. & Dawson, S. (2011). Learning designs and learning analytics. In International Conference on Learning Analytics and Knowledge, New York, USA, pp. 153-156.

Lockyer, L., Heathcote, E. & Dawson, S. (2013) Informing pedagogical action: Aligning Learning Analytics with Learning Design. American Behavioral Scientist, 57(10), 1439-1459.

Looney, J. & Siemens, G. (2011). Assessment competency: Knowing what you know and Learning Analytics. It is time for a breakthrough. Promethean Thinking Deeper Research Paper No. 3.

Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Educational Technology, 46(2), 317-329.

Mor, Y., Craft, B., & Hernández-Leo, D. (2013). The art and science of learning design: Editorial. Research in Learning Technology, 21.

Mor, Y., & Craft, B. (2012). Learning design: reflections upon the current landscape. Research in learning technology, 20.

Mor, Y., Ferguson, R. & Wasson, B. (2015). Editorial: Learning design, teacher inquiry into student learning and learning analytics: A call for action. British Journal of Educational Technology, 46(2), 221-229.

Rienties, B., Toetenel, L. & Bryan, A. (2015). 'Scaling up' learning design: Impact of learning design activities on LMS behavior and performance. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, New York, USA, pp. 315-319.

Rienties, B. & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.

Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I. & Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330-343.

Sutherland, R., Eagle, S. & Joubert, M. (2012). A vision and strategy for Technology Enhanced Learning. Report from the STELLAR Network of Excellence.

Toetenel, L. & Rienties, B. (2016). Analysing 157 learning designs using learning analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology.

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the 4th International Conference on Learning Analytics And Knowledge, Indianapolis, USA, pp. 203-211.








Topics of Interest


This special issue solicits original research papers on connecting learning design with learning analytics. The main topics of interest are:


• Practical examples of synergies between LD and LA.

• Methods and tools for developing data-enriched learning design and/or design-aware learning analytics.

• Application domains for integrated LD-LA approaches, such as teacher inquiry, learning at scale, and self-determined learning.

• Theoretical and conceptual foundations, opportunities and challenges for synergies between LD and LA.

• Meta-models and mediating frameworks for connecting and correlating LD and LA.

• Utilising Design Patterns as such meta-models, and as boundary objects for all of the above.



Submission procedure 


All submissions (abstracts and later final manuscripts) must be original and may not be under review by another publication.

Manuscripts should be submitted anonymised, in either .doc or .rtf format. 
All papers will be blind peer-reviewed by at least two reviewers. Prospective authors are invited to submit an 8-20 page paper (including authors' information, abstract, all tables, figures, references, etc.). 
The paper should be written according to the IxD&A authors' guidelines.

Submission page -> link
(When submitting the paper, please choose the Domain Subjects option 'IxD&A special issue on: Connecting Learning Design with Learning Analytics'.)

For scientific advice and for any query, please contact the guest editors:


• davinia [dot] hernandez [at] upf [dot] edu

• maria [dot] rodrigueztriana [at] epfl [dot] ch

• yishaym [at] gmail [dot] com

• pinventado [at] cmu [dot] edu


marking the subject as: 'IxD&A special issue on: Connecting Learning Design with Learning Analytics'.