Reference : UMR5505-CHLBOU-037
Workplace : TOULOUSE
Date of publication : Thursday, June 3, 2021
Scientific Responsible name : Julien Broisin
Type of Contract : PhD Student contract / Thesis offer
Contract Period : 36 months
Start date of the thesis : 1 October 2021
Proportion of work : Full time
Remuneration : 2 135,00 € gross monthly
Description of the thesis topic
This PhD position is part of the second perspective of the Contract of Objectives and Performance established by the Centre National de la Recherche Scientifique (CNRS, 2019) to help reduce educational inequalities: it aims to study the impact of explainable intelligent learning systems on the tasks performed by learners and practitioners. Intelligent learning systems based on learning analytics offer a way to address inequalities in learners' ability to understand and master educational content. However, the results of the analysis, and the processes leading to these results, are often complex and difficult to interpret. This leads to a low level of trust in, and acceptance of, the system by the different stakeholders.
Over the past few years, the field of Explainable Artificial Intelligence (XAI) has attracted many researchers seeking to integrate data interpretability into intelligent systems (Gunning, 2016). Previous work on XAI has mainly attempted to explain models and algorithms to AI experts or analysts (Spinner et al., 2019; Zhang and Chen, 2020). Only a few initiatives have addressed making these models explainable for types of users beyond experts (Arya et al., 2019; Ribera and Lapedriza, 2019). Yet, in the field of education, differentiating between the various stakeholders is crucial. For example, the explanations provided to teachers should differ from those provided to learners. Likewise, the explanations given to a K-12 student should be less detailed and easier to understand than those provided to an undergraduate student. Finally, even among learners of the same age, explanations may vary according to their levels of knowledge, skills or motivation.
This project aims to study the impact of two types of personalised intervention integrating an explanatory model. First, a guidance system will have to propose a model capable of providing explanations about the learner's traced activities, as well as prompts (Wong et al., 2019) including explanations to guide the learner in her progress, while taking into account inter-individual differences at the cognitive level (knowledge, skills). Second, visualisations reflecting the results of the analysis will be personalised according to the user's role (e.g. learner, teacher). To improve the decision-making process, the corresponding interactive explanations will also have to be more or less detailed according to a user profile to be designed, focusing on the skills required to facilitate interpretation of the learning data.

Two tools will be used to evaluate the proposed solutions: Lab4CE, a platform for learning computer science used at the Paul Sabatier University (Broisin et al., 2017), and NoteMyProgress, a self-regulation system (Pérez-Álvarez et al., 2020) that will be deployed in the ANR LASER project.
The candidate will be co-supervised by two researchers.
Julien Broisin is Associate Professor in Computer Science at the Institut de Recherche en Informatique de Toulouse (IRIT). He coordinates a new team focusing on Technology-Enhanced Learning, composed of five permanent researchers and five PhD students. The IRIT lab, located in Toulouse, is one of the largest computer science research institutions in France.
Franck Amadieu is Full Professor in Cognitive and Educational Psychology. He heads the Cognition, Langues, Langage, Ergonomie (CLLE) lab (https://clle.univ-tlse2.fr/), a cognitive science laboratory. He conducts research on education with digital technologies.