Kinesemiotics: a case for wearable sensor technology PhD
- Mechanical, Electrical and Manufacturing Engineering
- 3 years (full-time)
- 6 years (part-time)
- Start date: 01 October 2018
- Application deadline: 09 March 2018
Loughborough University is rated in the top ten in England for research intensity (REF 2014): 66% of the work of Loughborough's academic staff who were eligible to be submitted to the REF was judged 'world-leading' or 'internationally excellent', compared with a national average of 43%.
In choosing Loughborough for your research, you’ll work alongside academics who are leaders in their field. You will benefit from comprehensive support and guidance from our Doctoral College, including tailored careers advice, to help you succeed in your research and future career.
Kinesemiotics is the study of motivated movement informed by Linguistics, Engineering, and Computer Science. The foundation of Kinesemiotics relies on effective data collection of sensor information relating to dance performance, the development of a suitable linguistic Functional Grammar model for Dance (FGD) from which to understand dance discourse, and the development of software to analyse, store and visualise such data through that model.
The heritage of dance and ballet is carried and transmitted by both practitioners and audiences; however, for all parties we are restricted to two primary sources of data: paper-based notations (Labanotation, Benesh, etc.), accessible only to a small elite of trained notators; and video recordings, which suffer from problems of perspective, resolution, accuracy and interactivity, and lack any link to further semantic information about the performance in question.
An initial collaboration with the English National Ballet showed the potential benefits and applications of sensor-based notation systems, particularly for classical ballet. Such an approach also opens up novel ways of interacting with the data generated, including the use of virtual reality technology, and enables considerable advances in modelling a theory of movement-based communication with wide practical application.
The main objective of this project is to develop software that interfaces a motion-capture sensor suit with a commercial-grade gaming engine, so that the captured data can be manipulated and augmented with additional semantic information based on the FGD. The PhD student will work in our research labs and will have access to equipment and resources as appropriate.
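As a minimal sketch of the kind of data linkage described above, the fragment below pairs time-stamped motion-capture frames with a semantic annotation layer. All names here (MocapFrame, FGDAnnotation, frames_in_span, the "projection" label) are illustrative assumptions, not part of any published FGD toolchain or sensor-suit API.

```python
from dataclasses import dataclass

@dataclass
class MocapFrame:
    """One time-stamped sample from the sensor suit (hypothetical schema)."""
    timestamp: float   # seconds from the start of the capture
    joints: dict       # joint name -> (x, y, z) position

@dataclass
class FGDAnnotation:
    """A semantic label attached to a time span of the performance."""
    label: str         # e.g. a discourse-move label drawn from the FGD
    start: float       # span start time (s)
    end: float         # span end time (s)

def frames_in_span(frames, annotation):
    """Return the captured frames covered by an annotation's time span."""
    return [f for f in frames
            if annotation.start <= f.timestamp <= annotation.end]

# Usage: three captured frames, one annotated span covering the last two.
frames = [MocapFrame(t, {"wrist_r": (0.0, t, 0.0)}) for t in (0.0, 0.5, 1.0)]
ann = FGDAnnotation(label="projection", start=0.4, end=1.0)
print(len(frames_in_span(frames, ann)))   # -> 2
```

Separating raw sensor frames from the annotation layer in this way would let a gaming engine render the movement while the linguistic model queries and visualises the same data by semantic span.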
Primary supervisor: Prof. Massimiliano Zecca
Secondary supervisor: Dr. Arianna Maiorani / Dr. Russell Lock
Applicants should have, or expect to achieve, at least a 2:1 Honours degree (or equivalent) in robotics, mechatronics, biomedical engineering, or a related subject.
A relevant Master’s degree and/or experience in one or more of the following will be an advantage: computer science, human-robot interaction, linguistics or language analysis.
All students must also meet the minimum English Language requirements.
Fees and funding
Tuition fees cover the cost of your teaching, assessment and operating University facilities such as the library, IT equipment and other support services. University fees and charges can be paid in advance and there are several methods of payment, including online payments and payment by instalment. Special arrangements are made for payments by part-time students.
This is an open call for candidates who are sponsored or who have their own funding. If you do not have funding you may still apply; however, institutional funding is not guaranteed. Outstanding candidates (UK/EU/international) without funding will be considered for any funding opportunities that become available in the School.
How to apply
All applications should be made online. Under programme name, select Mechanical and Manufacturing Engineering. Please quote reference number MZUF2018.