Openness and Innovation in E-Learning. Deploying Learning Analytics to Support Study Success in Higher Education

Academic Paper, 2019

12 Pages, Grade: 80


Table of Contents

I. Introduction to Learning Analytics
1.1. Definition
1.2. Implications
1.2.1. Context
1.2.2. Implication
1.2.3. Success tips

II. Justification

III. Evaluation
3.1. Save the Children's Onenet platform
3.2. Open University’s VLE


Abstract: This report presents, analyzes and critically discusses learning analytics and the extent to which it supports learning. It introduces learning analytics, situates it within current research and development in e-learning, and explains how it relates to, and differs from, other uses of data in education. The report also describes in detail the implications of using learning analytics in the context of international training organizations and recommends a roadmap for success. In the same context, a justification of how the most important features of learning analytics support learning is provided, setting out the key benefits and risks. Finally, two examples illustrating the use of learning analytics are evaluated against how well they enhance learning in relevant contexts. Subject to available resourcing, the continued use of, and innovation in, learning analytics is recommended.

Keywords: Learning Analytics, Virtual Learning Environment, Learning Management System, Student Behaviour in Online Environments, Online Education, Student Retention.

I. Introduction to Learning Analytics

In the recent decades that mark the development of educational technology, there has been a dramatic increase in the number of people connecting online for educational purposes. Consequently, there has been growing interest in collecting and automatically analysing the extensive data about learners' behaviour available from VLEs and other digital platforms, in order to solve a wide range of problems (Jamila et al., 2018) and enhance students' learning experience (Entesar & Henda, 2014). This research area, referred to as Learning Analytics (LA), allows teachers to maintain a more accurate and up-to-date follow-up of each learner (Ferguson et al., 2016).

1.1. Definition

There is no single agreed definition of learning analytics. Tanya and Banff (2011) define it as a process of capturing and processing selected data that helps both students and instructors, at the course or individual level.


However, a popular definition of the term states that learning analytics refers to "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (Maren, 2014; Long & Siemens, 2011, p. 34). Others define learning analytics as a tool for examining, understanding, and supporting student learning behaviour and changing learning environments (Drachsler & Kalz, 2016; Rubel & Jones, 2016). Niall et al. (2016) also argue that learning analytics combines expertise from different academic disciplines, such as educational data mining and predictive modelling.

To sum up, one can argue that learning analytics refers to the measurement, collection, analysis and reporting of data about student progress and about how the curriculum is delivered. By using digital resources and systems, students generate data that can be studied to reveal predictive patterns of success, difficulty or failure, allowing teachers and students to respond in a timely manner.

According to Entesar and Henda (2014), learning analytics is not a new field but derives from related fields such as educational data mining, academic analytics, action research, and personalized adaptive learning. Olga et al. (2018) emphasize that learning analytics is very closely related to educational data mining: while the goal of academic analytics is to support operational, institutional and financial decision-making processes, both learning analytics and educational data mining aim to understand how students learn.

Currently, learning analytics is also being widely deployed by educational institutions and training organizations in the hope of improving learning outcomes and supporting learning and teaching (Olga et al., 2018). A good example is an international humanitarian organization that uses a Learning Management System (LMS) to build its staff capacity and collects the resulting metrics to support a more accurate, data-driven approach to curriculum design (Gill et al., 2018).

1.2. Implications

1.2.1. Context

The organization in question is an educational, child-focused training organization that provides children and youth in refugee camps with a range of training in psychosocial support for recreational purposes. Because children learn better through play, the organization has recently incorporated a playful learning approach, using digital tablet games to increase children's engagement. These games are deliberately structured as pedagogical simulations that help players achieve a particular learning goal, for example by helping learners acquire new skills, behaviours or information. The organization also provides child protection training to its staff, local authorities and partner organizations, either in person or through its online platform. It has not yet adopted learning analytics, but has begun piloting the approach by recording some data in its online learning management system and designing feedback questions to evaluate learning sessions.

1.2.2. Implication

The rapid embrace of learning analytics is currently diverting educators' attention from clearly identifying the requirements and implications of using it in academic education and training organizations, in the hope that analytics can make their organizations fit-for-purpose, flexible and innovative. Its application in education is expected to provide institutions with opportunities to support learner progression and to enable personalized, rich learning at scale (Tempelaar, Rienties, & Giesbers, 2015).

However, to adopt this approach, institutions need powerful analytics engines (Tobarra et al., 2014), skilfully designed visualizations of analytics results, and supportive, insightful models of the primary learning processes (Papamitsiou & Economides, 2014). While personalization is becoming commonplace in learning, researchers indicate that most institutions may not be ready to exploit the variety of available datasets for learning and teaching, or may lack staff with the required skills in learning design (Greller & Drachsler, 2012; Stiles, 2012; Tempelaar et al., 2015).

In the context defined above, a learning analytics tool is normally intended to address specific learning analytics objectives, which vary according to the needs of the relevant programme stakeholders and the context of application. The types of data commonly used include learners' inherent personal traits (e.g., demographics and learning style), learners' competence traits (e.g., time spent on a resource, frequency of posting, number of logins, level of knowledge and skills), learners' motivation traits, learners' behavioural and emotional patterns, or a combination of the above (Entesar & Henda, 2014). According to Audrey (2011), the analysis of captured data reveals how learners interact with the content, the discourse they have around learning materials, the social networks they form in the process, their degrees of connectivity, and peripheral learners.
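To illustrate how such competence traits could be derived in practice, the following is a minimal sketch in Python. The event log format, learner names and field names are illustrative assumptions, not taken from any system described in this report.

```python
from collections import defaultdict

# Hypothetical VLE event log: (learner_id, event_type, minutes_spent)
events = [
    ("amy", "login", 0), ("amy", "view_resource", 12),
    ("amy", "forum_post", 5), ("amy", "login", 0),
    ("ben", "login", 0), ("ben", "view_resource", 3),
]

def competence_traits(log):
    """Aggregate per-learner login counts, posting frequency, and
    time spent on resources from raw platform events."""
    traits = defaultdict(
        lambda: {"logins": 0, "posts": 0, "minutes_on_resources": 0}
    )
    for learner, event, minutes in log:
        t = traits[learner]
        if event == "login":
            t["logins"] += 1
        elif event == "forum_post":
            t["posts"] += 1
        elif event == "view_resource":
            t["minutes_on_resources"] += minutes
    return dict(traits)

print(competence_traits(events))
```

A real deployment would draw these events from the LMS database and combine them with the motivational and demographic traits mentioned above; the aggregation logic, however, would follow the same pattern.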

The adoption of learning analytics by an educational training organization would be useful in identifying learners' performance trends (Rienties et al., 2016) and evaluating them against a set of performance criteria (Tempelaar et al., 2015). Learning analytics would also help provide personalized tutoring that allows participants to achieve better project performance, motivation and commitment (Kelley & Shen, 2014; Clow, 2013; Chatti et al., 2014). This becomes essential in monitoring learners' skills development in relation to programme or project objectives (Larusson & White, 2012; Howlin & Lynch, 2014). It can be argued that this approach is the basis for stimulating learners' sense of self-regulated progress through direct feedback using visualizations and alerts (Verbert et al., 2013; Clow et al., 2013). Accordingly, Gašević et al. (2015) and Baker and Siemens (2015) note that such information can help project managers and training facilitators increase participants' retention and performance through early warning alerts. This helps leadership fulfil internal and external accountability mandates related to target learners' achievement, in addition to its potential to improve learning outcomes (Uhler & Hurn, 2013; Macfadyen et al., 2014).
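An early warning alert of the kind just described can be very simple in principle: compare a recent engagement indicator against a baseline and flag learners who fall below it. The sketch below assumes weekly login counts as the indicator and an arbitrary baseline of three logins; both choices are illustrative, not prescribed by the literature cited here.

```python
def early_warning_alerts(weekly_logins, baseline=3):
    """Flag learners whose weekly login count falls below an
    illustrative baseline, so facilitators can intervene early."""
    return sorted(
        learner
        for learner, logins in weekly_logins.items()
        if logins < baseline
    )

# Hypothetical week of activity: two learners fall below the baseline.
at_risk = early_warning_alerts({"amy": 5, "ben": 1, "cleo": 0})
print(at_risk)
```

In practice the indicator would be a predicted risk score from a trained model rather than a raw count, but the alerting logic, threshold plus notification, remains the same.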

1.2.3. Success tips

While the organization has not yet begun implementing a learning analytics project, it is important to note that, to be successful, a range of technological and human challenges may need to be addressed (Jose, 2018), such as incomplete data, complexity and poor data quality.

One of the main concerns with learning analytics is the collection of partial data, because certain interactions cannot be detected by the analytics engine when they do not occur in the digital environment. A simple web application could help collect this missing information. Furthermore, it is advisable to use the collected data only once a significant number of participants have used the application.

A student dashboard with comprehensive features presented in a simple format would give students feedback about their learning status and progress; the dashboard should therefore be informative enough and, if possible, indicate some immediate improvement actions. For example, displaying each student's standing within the group gives them a clear reference point relative to their peers, and a red or green traffic light could show a student's progress status in a course.
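The traffic-light idea above can be sketched as a small mapping from a learner's score to a colour, relative to the cohort. The thresholds here (a 15-point "amber" band below the cohort median) are invented for illustration only; a real dashboard would tune them to the course.

```python
def traffic_light(score, cohort_median, amber_band=15):
    """Map a learner's score to a red/amber/green status relative
    to the cohort median. Thresholds are illustrative assumptions."""
    if score >= cohort_median:
        return "green"
    if cohort_median - score <= amber_band:
        return "amber"
    return "red"

# A learner 10 points below the median gets an amber warning.
print(traffic_light(50, 60))
```

Alongside the colour, the dashboard would ideally show the concrete gap to the median and one suggested action, so the signal is actionable rather than merely alarming.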

Since learning analytics projects are mostly about finding the factors that contribute to learners' failures or successes in order to design intervention strategies that work in a given learning context, periodic follow-up meetings with relevant stakeholders beyond the pure implementation team are recommended. This goes hand in hand with addressing ethical concerns, for example by creating an ethical project charter and engaging all stakeholders in the ethics committee. In addition, using more data sources and multiple mathematical models would enhance the learning analytics model, allow the collection of quality data and improve the predictions.

Just as projects have baselines, so should a learning analytics tool: the validity of the analytical models should be monitored regularly, and there should be a feedback component on every model deployed. In other words, each model should be able to evaluate and report how well it performs; analytics should be presented in very simple terms, using colours, symbols or other representation mechanisms that visually summarize information; and results should be easy to understand and actionable.
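Such a feedback component can be as simple as routinely comparing the model's past at-risk predictions with the outcomes that actually materialized, and reporting whether accuracy still clears a minimum baseline. The function and the 0.7 threshold below are illustrative assumptions, not a prescribed standard.

```python
def model_feedback(predictions, outcomes, threshold=0.7):
    """Compare past at-risk predictions (1 = at risk, 0 = not) with
    observed outcomes and report whether the deployed model still
    meets an illustrative minimum-accuracy baseline."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    accuracy = correct / len(predictions)
    return {"accuracy": round(accuracy, 2), "valid": accuracy >= threshold}

# Three of four past predictions matched what actually happened.
print(model_feedback([1, 1, 0, 0], [1, 0, 0, 0]))
```

When the reported accuracy drifts below the baseline, that is the signal to retrain or retire the model rather than keep acting on its alerts.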

To stimulate learners' engagement, careful and personalized communication with learners is advisable, especially with those who, according to the analytics, need help, but also with those who perform well, to keep their engagement levels up. Instructors should offer unsolicited support to identified at-risk learners even if they have not requested it, letting them know they are behind, where applicable, without discouraging them. According to Jose (2018), an intervention framework is needed to help users of a learning analytics tool feel comfortable interpreting and acting on their dashboards.

Finally, starting with a pilot project designed to explore the power of this technology may be an invaluable catalyst for pushing the organization towards adoption. A pilot project should be limited in scope and cost and should target quick wins; involve all long-term key stakeholders from the beginning; and communicate successes effectively (Jose, 2018).


Dr. Sixbert Sangwa (Author), 2019, Openness and Innovation in E-Learning. Deploying Learning Analytics to Support Study Success in Higher Education, The Open University (School of Educational Technology), Munich, GRIN Verlag.