Sign Language synthesis from "AZee" descriptions and blocks of decreasing granularity (M/F)

Application deadline : Monday, 24 May 2021

General information

Reference : UMR9015-MICFIL-003
Workplace : ORSAY
Date of publication : Monday, May 03, 2021
Scientific supervisor : Michael Filhol
Type of Contract : PhD Student contract / Thesis offer
Contract Period : 36 months
Start date of the thesis : 1 October 2021
Proportion of work : Full time
Remuneration : €2,135.00 gross monthly

Description of the thesis topic

This PhD aims at animating a virtual signer (a 3D character for Sign Language synthesis) from 'AZee' expressions formally representing Sign Language utterances, implementing the algorithm proposed by Filhol & McDonald [1] and using the 3D modelling software Blender.

The AZee model developed at our lab makes it possible to synthesise Sign Language with virtual signers (3D avatars). From a linguistically relevant semantic representation, it specifies the movements to produce, together with their synchronisation on a timeline such as those used in 3D modelling software. Blender is one such free software package, which we want to use to output the animations specified by AZee. A first effort was conducted in the recent past [4]; the candidate will build on that preliminary work.
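As a purely illustrative, hedged sketch (not the project's code) of what "synchronisation on a timeline" can look like in practice, the few lines below keyframe timed hand targets onto a Blender timeline through Blender's bpy Python API; the armature name "Signer" and the bone name "hand.R" are hypothetical placeholders, and the values are made up.

import bpy

# (time in seconds, target location of the right-hand controller bone)
timed_constraints = [
    (0.0, (0.20, 0.00, 1.20)),
    (0.5, (0.30, 0.10, 1.40)),
    (1.0, (0.20, 0.00, 1.20)),
]

fps = bpy.context.scene.render.fps          # convert seconds to frame numbers
armature = bpy.data.objects["Signer"]       # hypothetical signer armature
bone = armature.pose.bones["hand.R"]        # hypothetical right-hand control bone

for t, loc in timed_constraints:
    bone.location = loc
    bone.keyframe_insert(data_path="location", frame=round(t * fps))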

In collaboration with DePaul University (Chicago, USA), we have defined an algorithm for Sign Language animation by chunks of decreasing granularity from AZee expressions [1, 2]. AZee is a formal representation model for Sign Language utterances and discourse [3].
The goal of the PhD is to implement this algorithm, testing the various ways of transferring the linguistically specified constraints on body articulation onto the target software's timeline, Blender being the one considered as a starting point.
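By way of illustration only, and under assumptions that go beyond what is stated here, the sketch below shows one naive way to organise "blocks of decreasing granularity": coarse blocks are laid down first and finer blocks are appended later, so that they take precedence on the articulators they constrain when the tracks are eventually keyframed. Block contents and articulator names are invented for the example.

from dataclasses import dataclass

@dataclass
class Block:
    granularity: int   # lower = coarser, applied first
    start: float       # seconds
    end: float
    poses: dict        # articulator name -> target value (invented labels)

blocks = [
    Block(0, 0.0, 2.0, {"torso": "neutral", "head": "neutral"}),
    Block(1, 0.2, 1.8, {"hand.R": "flat", "hand.L": "flat"}),
    Block(2, 0.5, 1.0, {"head": "tilt_forward"}),  # finer block refines "head"
]

# One track per articulator: (start, end, value) segments appended in
# coarse-to-fine order, so later (finer) entries win when keyframes are
# finally written in list order.
tracks = {}
for block in sorted(blocks, key=lambda b: b.granularity):
    for articulator, value in block.poses.items():
        tracks.setdefault(articulator, []).append((block.start, block.end, value))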

References
[1] Michael Filhol, John McDonald, Rosalee Wolfe. Synthesizing Sign Language by connecting linguistically structured descriptions to a multi-track animation system. 11th International Conference on Universal Access in Human-Computer Interaction (UAHCI 2017), held as part of HCI International 2017, Jul 2017, Vancouver, Canada.
[2] Michael Filhol, John McDonald. Extending the AZee-Paula shortcuts to enable natural proform synthesis. Workshop on the Representation and Processing of Sign Languages, May 2018, Miyazaki, Japan.
[3] Michael Filhol, Mohamed Hadjadj, Annick Choisier. Non-manual features: the right to indifference. International Conference on Language Resources and Evaluation, 2014, Reykjavik, Iceland.
[4] Fabrizio Nunnari, Michael Filhol, Alexis Heloir. Animating AZee Descriptions Using Off-the-Shelf IK Solvers. Workshop on the Representation and Processing of Sign Languages, May 2018, Miyazaki, Japan.

Work Context

Office and workstation provided at LISN. The data and software needed to start the PhD are available.
Supervisor present on a daily basis.
Regular exchanges with DePaul University (Chicago, USA) and the company IVèS (Grenoble) are expected, the latter leading the project supporting the PhD.
