The Senior Specialist Track Research Fund (SSTRF) aims to build a culture of research in MOE and support the growth of Senior Specialists as thought leaders through providing opportunities for:

Leading and participating in research projects
Engaging in cross-divisional research collaborations
Communicating research findings and experiences
Informing policy and practice through research



Learning Tools Interoperability (LTI) Exploratory Learning Environment (ELE) Data Analytics



Interactive simulations from the Open Source Physics at Singapore (OSP@SG) project's exploratory learning environment (ELE) have been used in several online lessons hosted on the Student Learning Space (SLS). These simulations facilitate inquiry (e.g. in physics and science) and gamification (e.g. in mathematics and languages) as students complete learning tasks online. With 4,000 user referrals from the SLS in May 2020, the data on students' mouse clicks and simulation states can be harnessed as learning analytics to better support teachers in monitoring and providing more personalised feedback for quality online ELE activities. Creating a flexible ELE provider that can track and report ELE interactions (e.g. actions on buttons, time, duration) to teachers and resource creators would be research and development that can impact teaching practice.

This study has three aims:

  1. Glean student difficulties through clickstream analytics while students use any OSP@SG virtual laboratory/ELE-game, by designing a Learning Tools Interoperability (LTI) provider on our multi-award-winning ELE library.
  2. Test which visualizations and interpretations of the collected data on students' actions in the ELE can be presented in a teacher dashboard for data-informed teaching actions, transforming user data into teaching and learning insights.
  3. Use a design-based research approach to examine how teachers and students benefit from the data provided by such a platform, in terms of adapting instruction and enriching learning experiences.


Enhance teaching and learning practice by gleaning the difficulties faced by students while they use any OSP@SG/Easy JavaScript Simulation (EJSS) virtual laboratory/ELE-game
(To attain this objective, we first need to create an LTI tool provider for the entire library of 600-plus ELEs to work on the SLS platform securely with OAuth signatures)
Experiment with which visualizations and representations of student-user data (for example, scripted video playback, time stamps of actions) provide better support for teachers facilitating ELE-simulation teaching and learning, in the context of EJSS ELE interactives
Understand, via a design-based research method, how teachers and students can benefit from the data analytics provided by such a platform, in terms of adapting instruction and enriching learning experiences

Research Questions

  • RQ1: How can teaching and learning practices be enhanced and data-informed while using any OSP@SG/Easy JavaScript Simulation (EJSS) virtual laboratory/ELE-game in the Student Learning Space (SLS) via an LTI tool provider?
  • RQ2: What visualizations of interaction and state data on the teacher dashboard are useful to provide better feedback and to enrich learning and teaching practices through ELE/simulation data analytics?
  • RQ3: What are the ways in which teachers can improve their instruction, and students can benefit from learning through an ELE with the provision of such learning analytics?


Impetus for Research

Big Picture

There is significant potential in exploratory learning environments (ELEs) for teaching and learning (T&L), particularly in the development of 21st century competencies (21CC), to support learning experiences that promote self-directedness, critical thinking and collaboration. In addition, ELEs can also facilitate ubiquitous, unobtrusive and game-based assessment modes. Supporting the curriculum triangle of content, pedagogy and assessment across the whole instructional cycle in such ELEs will allow us to work towards AI-enabled ELEs. This is a possible area of deep research for MOE’s AI Use Case on Learning Companion.



The impetus for this research stems from: 

1) Demand from teachers to monitor and guide students’ ELE-simulation activities 

(Figure 1)

2) Rich source of interaction data from the 4,000 user referrals per month from the SLS to the OSP@SG digital library

Figure 1: Sample screenshot of an SGLDC discussion on whether there is a need for such data analytics from simulations, suggesting there is demand for such features


This study would provide evidence for teachers, who often wonder how their students are coping with an assigned simulation activity, what their degree of proficiency is, and whether students need guidance when they face difficulty during Home-Based Learning (HBL). The literature on clickstream and game-based assessment research also supports the use of ELEs to provide insights into student understanding and misconceptions.

A teacher dashboard for visualizing and filtering analytics of students' actions in the online ELE will be built as an end-to-end solution. Teachers will be able to decide which user interactions to display, or to export for further data exploration.



After discussion with the SLS team, we propose developing our award-winning digital library of ELEs into a Learning Tools Interoperability (LTI) tool provider, so that the data collected in the ELE can be stored on a MySQL database server and attributed to specific students. This is all done securely with OAuth signatures validating messages between the LTI tool consumer (SLS) and the tool provider (the digital library of EJSS interactives/virtual laboratories). SLS will roll out its LTI enhancements by Sep 2020, so there is a good chance that this proposal will be one of the first to test SLS's LTI feature.



Research Method

In chronological order:


Tool Development

The research purpose is to enrich learning and teaching practices by displaying a rich set of ELE data for better feedback and guidance through ELE/simulation data analytics. To realise this research, we need to build an LTI tool provider that serves the LTI tool consumer (SLS) through OAuth-signed, validated messages.


The foundational and technical research on the LTI tool provider-consumer will tap our trainer, Professor Felix, to create the prototype solution and to train project team members to contribute to the solution's GitHub source code. The data needed for the technical analytics are server logs, teacher and student usage data, and expert scans of the current state of LTI tool providers, LTI consumers and OAuth.


Design based Educational Research

A design-based approach with an iterative development-implementation-evaluation cycle on the technology tool, involving students and teachers, is appropriate for this SSTRF.

The data needed for the user research are student-user data (a survey to gather feedback and areas for improvement, and semi-structured interviews to triangulate initial findings) and teacher-user data (a survey, and unstructured interviews to get teachers comfortable enough to reveal their deeper thoughts and feelings). The survey is implemented using Google Forms with questions related to the research questions, and the interviews are intended to triangulate the initial findings from the survey.



Participants will be teachers from MOE-ETD's SGLDC Facebook group and their selected students, with expert consultation from MOE-HQ Senior Specialists and Master Teachers.

The target sample size is at least 3 teachers with at least 3 classes (online and/or face-to-face) of students, involving approximately 120 Upper Primary students. The choice of participants aims to collect a sufficiently varied range of responses and to support iteration, as new or unexpected findings may require further rounds of data collection and LTI tool and dashboard revision.

MOE Senior Specialists will be consulted on specialist (educational technology and research)-related issues and Master Teachers on teaching-related matters.

The rationale for the choice of participants is a sample size that is sufficiently large in an educational research context (about 120 students). Another consideration is not to overtax any one teacher, so each teacher will implement this in only one class of 40 students.

As a pilot study, 3 teachers × 40 students each = 120 students at the upper primary level is a manageable scope for a one-year time frame.



This research serves to provide teachers with ELE simulation data, and SLS with clear visualizations of student interaction data, enabling teachers to tailor personalised instruction. The project significantly expands the capabilities of SLS without extensive development costs for building these functionalities into the core build, freeing up development for other important teacher and student features. A similar ELE dashboard built within SLS, with the front-end ELE design capabilities already in EJSS, is estimated to cost upwards of $X00K in development and $X0K per year in subscription. This SSTRF costs $XXK with an annual subscription of $XX0, an upfront cost saving of 66%, and the SSTRF approach leverages the experts in the Open Source Physics community, with a higher chance of sustainability and adoption (Table 1).





Upfront development cost
  • SSTRF build: $XX,000 or less
  • SLS build: $X00,000 or more

Subscription per year
  • SSTRF build: $XX0 for web server and domain name; $0 for any number of students
  • SLS build: $X0,000 or more, based on the number of students in SLS

Upgrades and maintenance
  • SSTRF build: Open Source Physics (OSP) community, which upgrades the system usually for free or under other research grants
  • SLS build: SLS vendor, with costly upgrade charges

Scope of PI
  • SSTRF build: PI is a senior specialist who specialises in ELE, with deep expertise, and is part of the OSP network; building this is therefore a natural extension of the existing body of work on the web server, integrated with SLS
  • SLS build: PI is not in the SLS team and therefore cannot influence the SLS build components, whose roadmap is already full until 2024

Consultation with SLS team members
  • SSTRF build: the SLS team suggested that this tool be built via the SSTRF approach and integrated with SLS via the LTI standards
  • SLS build: the SLS team suggested it would be difficult to do this as an SLS component due to cost and sustainability concerns

Table 1: Comparison of the SSTRF and SLS component builds, with compelling reasons for supporting this SSTRF



Making the case even more compelling, this approach is aligned with SLS's LTI plan to communicate with external systems through the adoption of LTI standards, collecting and sending encrypted learning data over the internet to and from external systems.

The research team will also present interim findings and progress reports to the SLS Steering Committee in Jun 2021 and final updates in Oct 2021.



As Artificial Intelligence and Data Science is one of the Research Groups within ETD X-Labs (the specialist wing supporting the division), this research serves as one of the pillar Senior Specialist projects to push the boundaries of data analytics, complementing the technology and literature scans of commercial tools done by the AI section of ETD. Iterative dashboard design with teachers and students will allow us to understand what makes ELEs useful and reliable, and to build capabilities among teachers and specialists to design assessment items that better assess user understanding and behaviour. Data gathered over time and stored within EJSS can provide a rich database for training AI assessment engines that map user interactions to learning mastery. Our 2030 vision is to create a consistent, rich database to train AI assessment engines, specifically ones based on Machine Learning (ML); for example, an ML method can use our database as a training dataset to generate a prediction model.


The project can also provide alternative assessment methods that plug into MOE's AI Adaptive Learning System (ALS), a thread under the MOE AI Use Case. The training provided by our trainer will also level up the team's expertise, and the trainer can give talks to all specialists and MOE officers in the area of data science.



Simulations assigned as home-based learning tasks will no longer be a black box for teachers: teachers can view and evaluate individual as well as group data on particular simulation tasks (Figure 2).

Figure 2: A possible teacher dashboard visualization of students' logs of actions using a particular simulation, showing Date, Duration and Actions



Data collection approach 

Simulations or ELEs will capture any event in the student's browser, such as button clicks, slider drags, combo-box selections, and drags of plotting-panel objects, together with position and time data. This well-known approach is called clickstream analysis. The clickstream data will be sent to an external database, where different Learning Analytics (LA) indicators will be calculated.
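One way the browser-side collection could be organised is an event buffer that batches interactions before sending them to the analytics server. The class name, event fields and endpoint below are illustrative assumptions, not the actual EJSS API:

```javascript
// A minimal clickstream buffer: records UI events with timestamps and
// flushes them in batches to a collector, to avoid one HTTP request per click.
class ClickstreamBuffer {
  constructor(send, batchSize = 20) {
    this.send = send;          // function(events) that delivers one batch
    this.batchSize = batchSize;
    this.events = [];
  }
  // Record one interaction, e.g. { element: "slider_v", action: "drag", value: 2 }.
  record(event, now = Date.now()) {
    this.events.push({ ...event, t: now });
    if (this.events.length >= this.batchSize) this.flush();
  }
  // Send everything buffered so far (also called on page unload in practice).
  flush() {
    if (this.events.length === 0) return;
    this.send(this.events.splice(0, this.events.length));
  }
}

// In the browser, `send` could POST the batch to the analytics server:
//   const buf = new ClickstreamBuffer(evts =>
//     fetch("/analytics/events", { method: "POST",
//       headers: { "Content-Type": "application/json" },
//       body: JSON.stringify(evts) }));
//   document.querySelector("#playPauseButton")
//     .addEventListener("click", () => buf.record({ element: "playPauseButton", action: "click" }));
```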


Specific learning analytics indicators/dimensions and how they will help with teaching and learning 

We will focus on LA indicators (Table 2) that are useful for understanding the student's thought processes, behaviour and engagement.



LA indicators and the data collected:

  • Related to the student's thought process: number of intermediate results before the end result; number of simulation steps
  • Related to the student's behaviour: active/inactive time; changes to simulation parameters
  • Related to the student's engagement: simulation visits; mean visit time; type of engagement (occasional, hardworking, excessive)

Table 2: Specific learning analytics indicators/dimensions and how they will help with teaching and learning
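As an illustration of how the behaviour and engagement indicators in Table 2 could be derived from time-stamped clickstream events, the following sketch computes visits, active time and mean visit time. The event fields and the idle-gap threshold are assumptions for illustration:

```javascript
// Derive engagement indicators from time-stamped events ({ t: <ms>, action: ... }).
// A silence longer than idleGapMs is treated as the start of a new visit;
// shorter gaps between events count as active time.
function engagementIndicators(events, idleGapMs = 60000) {
  if (events.length === 0) return { visits: 0, activeMs: 0, meanVisitMs: 0 };
  const ts = events.map(e => e.t).sort((a, b) => a - b);
  let visits = 1, activeMs = 0;
  for (let i = 1; i < ts.length; i++) {
    const gap = ts[i] - ts[i - 1];
    if (gap > idleGapMs) visits++;   // long silence: new visit begins
    else activeMs += gap;            // short gap: student was actively working
  }
  return { visits, activeMs, meanVisitMs: activeMs / visits };
}
```

The idle-gap value would need tuning against real classroom data; a slider drag every few seconds and a return the next day should clearly land in different visits.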


For example, if the student takes unnecessary simulation steps to reach the desired state, this reveals the student's thought process (see Figures 4 to 8).


When such data-driven evidence is presented to teachers, they will be in a better position to design remediation strategies and provide personalised feedback. We therefore hope our Learning Analytics tools (Table 3) can provide teachers with valuable insights into student engagement and learning. The teacher can use those insights to make decisions, change learning processes and adapt learning content (Figure 3).


LA systems, the type of data they use and how they work:

  • Real-time feedback system: clickstream data is collected in real time and sent to an external database. The LA indicators are calculated and generated in real time as well; when new LA indicators are available, they are shown in the dashboard view.
  • Student behaviour analytics tool: generates the LA indicators related to the student's behaviour, using any new clickstream data to update the behaviour indicators in real time.
  • Student engagement classifier: similarly to the student behaviour analytics tool, this tool generates the student's engagement indicators.
  • Student performance predictor: uses the LA indicators, in particular the student's thought-process indicators, to predict the student's performance.

Table 3: Types of LA systems, the type of data they use and how they work
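The student engagement classifier could, for instance, threshold the engagement indicators from Table 2 into the three types named there. The thresholds below are placeholder assumptions to be tuned against real classroom data, not a validated rubric:

```javascript
// Map engagement indicators (visits, mean visit time) to the Table 2
// engagement types: occasional, hardworking, excessive.
function classifyEngagement({ visits, meanVisitMs }) {
  const meanMinutes = meanVisitMs / 60000;
  if (visits <= 1 && meanMinutes < 5) return "occasional";   // barely touched the ELE
  if (visits * meanMinutes > 60) return "excessive";         // over an hour of total use
  return "hardworking";                                      // steady, purposeful use
}
```

In practice these cut-offs would likely differ per simulation and per lesson design, and could eventually be learned from labelled data rather than hand-set.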

Figure 3: Screenshot of a possible real-time teacher dashboard, showing which students are online or offline and their last 3 actions, which can help the teacher conduct a live simulation demonstration, adapt the teaching content and get students to follow the steps as guided



Students are ultimately the recipients of all the enhanced strategies and feedback that can be harnessed from the data. They will benefit from a more data-driven ecosystem of SLS and LTI providers.

For example, in a kinematics mathematical modelling lesson, the student may choose to type in the model "X=0" as an equation that represents the motion of the simulated car when it is stationary (Figure 4).

Figure 4: Screenshot of a simulation of a car on the left and the plot of position versus time on the right. The simulated motion is a stationary car, and the student needs to describe the motion with an equation, the model X=0, via the combo box at the top right corner


If the motion is now uniform with a velocity of 2 m/s (Figure 5), the student may select the model "X = t" as an equation that could represent the motion of the simulated car. This first inaccurate selection is evidence of incorrect conceptual understanding.

Figure 5: Screenshot of simulation of a car on the left and the plot of position versus time on the right. The simulated motion is a uniform motion velocity of 2 m/s car and the student could have used the model, X=t to predict the motion inaccurately.


If the student continues to choose incorrect equations, say X = -t (Figure 6), it could mean that the student has not grasped kinematics equations and the meaning of the coefficient v, the velocity, in the equation X = v*t for simple constant-velocity motion.



Figure 6: Screenshot of simulation of a car on the left and the plot of position versus time on the right. The simulated motion is a uniform motion velocity of 2 m/s car and the student’s second attempt is the model, X=-t, to predict the motion inaccurately again


In addition, the choice of inaccurate equations, say X = sin(t) (Figure 7), or any progressively closer-fitting equations to model the simulated car's motion, will all be captured and analysed to provide teachers with indicators of the student's thought, behaviour and engagement.



Figure 7: Screenshot of simulation of a car on the left and the plot of position versus time on the right. The simulated motion is a uniform motion velocity of 2 m/s car and the student’s third attempt is the model, X=sin(t) to predict the motion inaccurately again


Ideally, when the student arrives at the correct answer of X = 2*t (Figure 8), it is a clear indication of the student's iteratively stronger understanding of mathematical models describing the uniform motion of a car at 2 m/s.



Figure 8: Screenshot of a range of curriculum learning scenarios. The simulated motion is a car in uniform motion at 2 m/s, and the student's ideal attempt, the model X=2*t, predicts the motion accurately. Here we are not looking for the correct model immediately, but rather the struggle and the repeatability of correctly using equations to predict motions in the curriculum



We expect these data to benefit students greatly when their teachers are provided with evidence of the student's progressive modelling with equations, since the captured sequence of attempts indicates the understanding and thinking behind those progressively better model equations.

In addition, these user interactions can be compared with expected interactions to classify students according to their activity performance, allowing teachers to 'predict' the performance of students.
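Such a comparison of student attempts against the expected model could be sketched as a simple scorer that samples the student's chosen equation and the simulated motion at a few times, here for the uniform-motion example x = 2t. The function names and sampling times are illustrative assumptions:

```javascript
// Root-mean-square error between a student's model x(t) and the true motion,
// sampled at a few times. Models are passed as functions of t.
function modelError(studentModel, trueModel, times) {
  const sumSq = times.reduce((s, t) => {
    const d = studentModel(t) - trueModel(t);
    return s + d * d;
  }, 0);
  return Math.sqrt(sumSq / times.length);
}

// Check whether a sequence of attempts is converging toward the true motion,
// i.e. each attempt fits at least as well as the previous one.
function isImproving(attempts, trueModel, times) {
  const errs = attempts.map(m => modelError(m, trueModel, times));
  return errs.every((e, i) => i === 0 || e <= errs[i - 1]);
}
```

For the lesson above, the attempts X=t, X=-t and X=sin(t) would all score a nonzero error against x = 2t, while X=2*t scores zero; the sequence of errors is exactly the "progressively closer fit" signal a dashboard could surface to the teacher.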

With all the data from a learning experience safely stored, aggregated and organised, it becomes possible to take full advantage of a learning analytics platform. Useful LA tools for teachers, such as a real-time feedback system, a student behaviour analytics tool, a student engagement classifier and a student performance predictor, provide teachers with insights and can address gaps in students' conceptual understanding.


Data collected for teachers' decision-making and student learning


  • Current ELE the student is on: a quick check that students are on the correct ELE; can assist the teacher in identifying students who are not on task.
  • Last 2 actions: for synchronous classroom coordination, when the teacher wants to guide students step by step through the ELE; can assist the teacher in identifying students who are not on task.
  • All events and action activity logged on the ELE with time stamps: allows the teacher to review students' actions, possibly via a video-player format or selectable filters such as "Element" and "Property" instead of a table of text.

Table 4: Kinds of data to be collected, the rationale for informing the teacher, and a possible layout of the display inside SLS and the web server hosting all the ELEs



Wider Open Education Community

MOE has expressed interest in the past in contributing to the open education community, and EJSS is already an internationally recognised platform for designing learning with ELEs. This LTI-compliant learning tool data analytics feature will enable EJSS to plug into any LTI-compliant LMS, benefiting all communities of EJSS users, and even accelerate research in clickstream and game-based assessment.


Is the solution scalable?

In the future, the LA dashboard could incorporate new functionalities, such as decision-making support and clustering, to make teaching tasks easier.

We consider our architecture scalable because it is based on well-known technologies (LTI, web technologies, etc.), and therefore further ML-based functionalities built over our system will be scalable as well. For example, other HTML5 games, such as those the SLS team surfaced from Institutes of Higher Learning (IHL) projects, would in principle need to incorporate the EJSS API (to be built in this proposal) in order to enjoy the same level of data collection for analytics in SLS.


Choice of ‘Trainer’

Should the SSTRF run into difficulties with the deliverables, the project team will naturally turn to Professor Felix for help, as he knows precisely what he has built over his 7 years maintaining and developing the EJSS authoring toolkit. It is unrealistic to expect someone else to be willing and able to build on Professor Felix's work without his advice and wisdom.


Trainer’s CV

Felix J. Garcia-Clemente is an Associate Professor at the Department of Computer Engineering of the University of Murcia (Spain). His research interests focus on Educational Technologies (edtech) and Learning Analytics (LA), and he specialises in applying data science to simulation-based online courses to enhance human knowledge of how we learn. He is co-author of over 100 scientific publications (including journal and conference papers) and an active member of various national and international research projects.


In relation to edtech, he participates in the following active projects:

Learning analytics and game based assessment (LAGA). Massachusetts Institute of Technology (MIT).

iLabs Digital Laboratory Twins. Stanford University.

Indra Cyber Range. Indra Sistemas (Spain).


His most recent publications focused on edtech are:

Luis de la Torre, Lars Thorben Neustock, George Herring, Jesus Chacon, Felix J. Garcia, Lambertus Hesselink, “Automatic Generation and Easy Deployment of Digitized Laboratories”, IEEE Transactions on Industrial Informatics, vol. 16(12), pp. 7328-7337, 2020.

Alberto Huertas Celdran, Felix J. Garcia Clemente, Jacobo Saenz, Luis de la Torre, Christophe Salzmann, Denis Gillet, “Self-Organized Laboratories for Smart Campus”, IEEE Transactions on Learning Technologies (TLT), vol. 13, no. 2, pp. 404-416, 2020. 

Alberto Huertas Celdrán, José A. Ruipérez-Valiente, Félix J. García Clemente, Maria Jesus Rodriguez Triana, Shashi Kant Shankar, Gregorio Martínez Pérez, “A Scalable Architecture for the Dynamic Deployment of Multimodal Learning Analytics Applications in Smart Classrooms”, Sensors, vol. 10, no. 10, art-no. 2923, 2020.


Felix is a co-author of Easy JavaScript Simulations (EjsS) and a collaborator in the Open Source Physics (OSP) project and Open Source Physics@Singapore. As a member of the OSP team, he received the Excellence in Physics Education Award from the American Physical Society (APS). Felix was a visiting scholar at the Coordination & Interaction Systems (REACT) research group of the EPFL School of Engineering (Switzerland), the National Institute of Education (NIE) of NTU (Singapore) and the Ginzton Lab of Stanford University (USA).


Financial Compliance

As Professor Felix is the main leader in the Open Source Physics community and created the authoring toolkit Easy JavaScript Simulations, there is no one else who knows how to modify the toolkit to achieve the goals of this SSTRF quickly and efficiently, including future add-ons. He is also a Data Analytics and Computer Science professor who can help develop the features and train the SSTRF team members in the skills necessary for the successful completion of this SSTRF. My conversation with the ETD SSTRF finance controller suggests this is in compliance with MOE financial requirements, as Professor Felix is the only professor capable of delivering these SSTRF goals.


Return on Investment

Even though $X0,000 is budgeted for Professor Felix's visit to Singapore to 'train' the project team members, we can easily scale the training to other interested SS who want to learn from Professor Felix, lowering the cost per head. The prerequisites are a strong interest in coding and a strong desire to create new content with the EJSS authoring toolkit using the training received. Both Lawrence WEE and Darren TAN are past trainees of Professor Felix's MOE-NIE eduLab-funded workshops and are most willing to continue training other SS, which will lower the cost of training manyfold again. The more important question, however, is whether the training is useful, rather than how low the cost per person is.

The project team's 900+ simulations receive 4,000 users per month from Singapore and 30,000 users per month worldwide, and we argue that return on investment needs to dig deeper into the impact of the funding, such as usage in SLS and around the world. With the proposed SSTRF deliverables, the number of users will likely trend upwards, as we intend to provide tight integration into SLS (this can only happen if the proposal is funded) and to equip the 900+ and growing number of simulations with data analytics, something not achieved by any other Ministry of Education. Professor Felix can also conduct training and sharing sessions for all who are interested, including Master Teachers and Specialists, on his work as well as the proposal's deliverables.

I recommend that highly interested SS contact Lawrence WEE to join the proposal formally as team members, to legitimise the training, build good working relationships and contribute more sustainably beyond the funding period of this SSTRF.


Contingency plans 

If the trainer is unable to come to Singapore, we will move to online training via Zoom (discussions and meetings) and Microsoft Remote Desktop (technical support).


Extension of the training to other SS/officers

We will extend the invitation to all SS and HQ officers, scheduling 2 half-days per week for clinic sessions (face-to-face, or via Zoom and Remote Desktop), where officers can get targeted training, advice and support on the data analytics or Easy JavaScript Simulations issues and problems they need help solving. This aims to ensure the trainer helps officers build their skills and solve real problems over a longer period (4 weeks under SSTRF2020 and 2 weeks under SSTRF2021 combined) through consultation and training.



My position is to continue with SSTRF2021, as I am an SS trying to do work that can legitimately benefit SLS through simulation data analytics. This is a new area that the SLS Assistant Product Owner (SLSAPO) spoke to me about pursuing under SS funds instead of an SLS build, due to the high build and maintenance costs if done by a vendor.


COVID-19 caused my SSTRF2020 (SLS e-assessment interactives) and SSTRF2021 (SLS interactive data analytics) to use the trainer funds in the same year, 2021. It is nonetheless not possible to conceptualise these two SSTRFs as one engagement, as SSTRF2020 arose from my involvement with the SEAB-MOE electronic school-based assessment (e-SBA) task force, while SSTRF2021 arose from ETD's X-Lab data analytics thrust.


I agree with the SLSAPO that using my expertise and community (Prof Felix in particular) will likely be more sustainable (open source physics professors will look into it and improve the technology over time) and cost-effective.


Greater financial prudence is already achieved by undertaking such a difficult data analytics project for $X0,000, against an estimated $X00,000 for an SLS vendor build plus high yearly maintenance. If Professor Felix can complete these two SSTRFs (2020 and 2021), then in 2022, if it is assessed that something requires Prof Felix's computer science skills, we will explore funds to pay for training, with a focus on levelling up colleagues in technical know-how (Artificial Intelligence, Data Science, Computer Modelling).
















  • Literature scan of LTI providers, discussions with SLS, syncing of information and charting the way forward

  • Develop a basic prototype LTI tool provider; assist in LTI tool provider scripting and OAuth of simulations to an LTI tool consumer (SLS)

  • Design the teacher dashboard interface to integrate into SLS using the Carbon design framework, and conduct research on user interface and experience (by RA2)

  • Design the ELE lessons with 5 SGLDC teachers, with survey and interview protocols developed

  • Improve the LTI tool provider and OAuth, with issues surfaced by the project team, and sync up with SLS

  • Prepare the second round of ELE lessons with teachers

  • Prepare the report and a journal paper for publication, to share answers/knowledge that inform learning and teaching in wider settings





  1. To design the simulations, assist in LTI tool provider scripting, develop the survey and interview questions, make sense of the data, hold discussions with experts and write reports.
  2. To design the teacher dashboard interface that integrates into SLS using the Carbon design framework, and to conduct research on user interface and experience. This is suggested by the SLS team, as the teacher dashboard ideally needs to be polished and built with the same framework, sitting inside SLS for scalability.



  1. To train the team members in the design and development of:
     • ELE design for authentic and meaningful teaching and learning
     • The LTI tool provider-consumer-OAuth framework
     • Customising the EJSS editor (Prof Felix is its creator) to work in the LTI tool framework and eventually communicate with SLS through LTI standards
