2015 – Poughkeepsie

The first LAK Hackathon!

The first hackathon, in 2015, focused on the Apereo Open Dashboard, with data sourced from an xAPI [3] Learning Record Store. It illustrated how the concept of an Open Learning Analytics architecture was developing, but it also shone a light on some structural weaknesses: a shortage of usable data for demonstration, development, and quality assurance, and something of a gulf between different stakeholders' conceptions of what a learning analytics dashboard should contain. Subsequent work by workshop organizers has begun to develop repeatable methods for generating synthetic data to help address the first weakness [1]. The second has been the topic of ongoing research (vide infra).
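To make the synthetic-data idea concrete, the sketch below generates a handful of xAPI-like statements with a seeded random generator so that runs are repeatable. It is illustrative only: the verbs, activity URIs, and email pattern are invented placeholders, not the actual recipes or profiles used by the reference generator described in [1].

```python
import json
import random
import uuid
from datetime import datetime, timedelta, timezone

# Placeholder verbs from the standard ADL verb vocabulary; a real
# generator would follow the recipes of the target analytics platform.
VERBS = [
    "http://adlnet.gov/expapi/verbs/attempted",
    "http://adlnet.gov/expapi/verbs/completed",
    "http://adlnet.gov/expapi/verbs/passed",
]

def synthetic_statement(student_id: int, rng: random.Random) -> dict:
    """Build one synthetic xAPI statement for a fictional student."""
    # Spread timestamps over a fictional 90-day term starting 2016-01-01.
    timestamp = datetime(2016, 1, 1, tzinfo=timezone.utc) + timedelta(
        minutes=rng.randrange(0, 60 * 24 * 90)
    )
    return {
        "id": str(uuid.UUID(int=rng.getrandbits(128), version=4)),
        "actor": {"mbox": f"mailto:student{student_id}@example.edu"},
        "verb": {"id": rng.choice(VERBS)},
        "object": {"id": f"http://example.edu/courses/{rng.randrange(1, 6)}"},
        "timestamp": timestamp.isoformat(),
    }

rng = random.Random(42)  # fixed seed: same statements on every run
statements = [synthetic_statement(s, rng) for s in range(3)]
print(json.dumps(statements[0], indent=2))
```

A fixed seed matters here: it lets the same "dataset" be regenerated for demonstration and quality assurance without distributing any real student data.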

Information and call to participate

2016 – Edinburgh

The second hackathon, in 2016, continued to explore the practicalities of Open Learning Analytics. Using Jisc's emerging learning analytics architecture [8] as a reference point, and some data generated with the synthetic data methods the first hackathon stimulated, participants scrutinized Jisc's interoperability recipes, tested the interoperability of learning record stores, learning analytics processors, and dashboards, and assessed the learning analytics standards landscape. The hackathon had a lasting effect: it produced numerous improvements to Jisc's interoperability recipes, and the LAK community's strong message in favor of greater integration of the emerging learning analytics standards, xAPI and Caliper, contributed to the cooperation of ADL and IMS from mid-2016.
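The interoperability testing described above amounts to checking whether statements from one tool carry the fields and verbs another tool expects. The toy check below captures that idea; the "recipe" it encodes (required top-level fields plus an expected verb) is a simplified stand-in, not one of Jisc's actual recipes.

```python
# Minimal recipe conformance check for an xAPI-style statement.
# REQUIRED_KEYS and EXPECTED_VERB are simplified assumptions, not a
# real Jisc recipe.
REQUIRED_KEYS = {"actor", "verb", "object", "timestamp"}
EXPECTED_VERB = "http://adlnet.gov/expapi/verbs/completed"

def conforms(statement: dict) -> bool:
    """Return True if the statement has the fields and verb the recipe asks for."""
    if not REQUIRED_KEYS <= statement.keys():
        return False
    return statement.get("verb", {}).get("id") == EXPECTED_VERB

stmt = {
    "actor": {"mbox": "mailto:student1@example.edu"},
    "verb": {"id": EXPECTED_VERB},
    "object": {"id": "http://example.edu/courses/1"},
    "timestamp": "2016-04-25T09:00:00Z",
}
print(conforms(stmt))  # True
```

In practice such checks run against statements pulled from a live learning record store, so that a dashboard or processor can trust the data shape before consuming it.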

Information page

GitHub repository

2017 – Vancouver

The third hackathon built upon three assets: previous workshops, recent research, and recently developed software. The first of these comprises the previous two LAK hackathons, the 2015 LAK workshop "Visual Aspects of Learning Analytics" [2], and the 2016 LAK workshop "Data Literacy for Learning Analytics" [9]. We set the scene for the workshop using recent research on actionable analytics [6], student feedback [4], and embedding learning analytics in pedagogic practice [5]. As a stimulus for discussion of the student perspective, we introduced Jisc's student app, which is being piloted with students across the UK after extensive consultation and design activities.


Workshop challenges

Tech & Data

GitHub repository


[1] Berg, A.M. et al. 2016. The Role of a Reference Synthetic Data Generator within the Field of Learning Analytics. Journal of Learning Analytics. 3, 1 (2016), 107–128.

[2] Duval, E. et al. eds. 2015. VISLA 2015, Visual Aspects of Learning Analytics. CEUR Workshop Proceedings (2015).

[3] Experience API v1.0.1: 2013. http://www.adlnet.gov/wp-content/uploads/2013/10/xAPI_v1.0.1-2013-10-01.pdf.

[4] Khan, I. and Pardo, A. 2016. Data2U. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 249–253.

[5] Kitto, K. et al. 2016. Incorporating student-facing learning analytics into pedagogical practice. Proceedings of the Annual ASCILITE Conference (2016).

[6] Pardo, A. et al. 2016. Generating actionable predictive models of academic performance. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 474–478.

[7] Roll, I. and Winne, P.H. 2015. Understanding, evaluating, and supporting self-regulated learning using learning analytics. Journal of Learning Analytics. 2, 1 (2015), 7–12.

[8] Sclater, N. et al. 2015. Developing an open architecture for learning analytics. EUNIS Journal of Higher Education. (2015).

[9] Wolff, A. et al. 2016. Data literacy for learning analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 500–501.