New projects starting in Summer 2012


"BABEL: Bio-inspired Architecture for Brain Embodied Language" is a new EPSRC project on the computational neuroscience modelling of language and action learning in the humanoid robot iCub, and its implementation in the neuromorphic system SpiNNaker. Angelo Cangelosi is the PI and project coordinator, and is also responsible for the neuro-robotics research tasks. Thomas Wennekers and Sue Denham are co-investigators in Plymouth for computational neural network modelling. Steve Furber and David Lester lead the neuromorphic research at the University of Manchester, and Friedemann Pulvermüller (Free University of Berlin) is responsible for brain imaging studies and computational modelling. Industry advisors include representatives from ARM, Honda HRI and TMSUK Japan. The project is also co-funded by the BBSRC.

Ongoing research projects

POETICON++ (FP7 Challenge 2)

"Poeticon++: Robots Need Language" is a new project on the computational mechanisms for the generalisation and generation of new behaviours in robots, and is the follow-up to the successful Poeticon project. The Plymouth team consists of A. Cangelosi, T. Morse and M. Peniak. The project is coordinated by Katerina Pastra (Cognitive Science Institute, Athens), with other partners including Y. Aloimonos (U. Maryland, USA), L. Fadiga and G. Metta (IIT, Italy), and J. Santos-Victor (IST, Portugal).

ROBOT-ERA (FP7 Challenge 5)

The project "ROBOT-ERA: Advanced Robotic Systems and Intelligent Environments for the Ageing Population" will start in October 2011. Cangelosi and Belpaeme, in the Plymouth team, will work on human-robot interaction interfaces and on action/language integration research. The project is coordinated by Paolo Dario (SSSA, Italy), with other partners including A. Saffiotti (Örebro U., Sweden) and Zhang (Hamburg U., Germany), four industrial partners (Italy, Germany) and three user partners (Italy, Sweden).

ROBOTDOC (FP7 Marie Curie ITN)

The ROBOTDOC project is a Marie Curie Initial Training Network (ITN) led by the University of Plymouth. ROBOTDOC unites top European universities to jointly train early career researchers to study the development of cognition in robotics. Partners include the University of Zurich, the Italian Institute of Technology, the University of Skövde (SE), the University of Bielefeld (D), the University of Sunderland, Uppsala University, Yale University (USA), RIKEN (Japan) and the companies Telerobot and Honda. ROBOTDOC runs from 2009 to 2012 and receives €3.3M in funding.


"A neural network generating flexible locomotor behaviour in a simple vertebrate: studies on function and embryonic self assembly", EPSRC funded (project total £1,000,000, Plymouth receiving £344,900, 2009-2012).

ALIZ-E (FP7 Challenge 2)

ALIZ-E: "Adaptive Strategies for Sustainable Long-Term Social Interaction" is an integrated project under the 7th Framework Programme of the European Union. Tony Belpaeme and Angelo Cangelosi coordinate the €8.3M, 4.5-year project, of which the CRNS receives €1.4M. The ALIZ-E consortium consists of 7 academic partners (Plymouth as coordinator, Deutsches Forschungszentrum für Künstliche Intelligenz, Vrije Universiteit Brussel, Netherlands Organization for Applied Scientific Research, Imperial College, University of Hertfordshire, National Research Council Padova), one hospital (San Raffaele del Monte Tabor, Italy) and one SME (Gostai, Paris). ALIZ-E runs from 2010 to 2014.

Research projects completed


CONCEPT "Linguistic and direct transmission of concepts in human-robot networks" is an EPSRC funded project (£192,291, 2008-2011) led by Tony Belpaeme studying how robots can acquire concepts using language and how conceptual information can be transferred between robots.


COLAMN "A Novel Computing Architecture for Cognitive Systems based on the Laminar Microcircuitry of the Neocortex" is a £1,861,976 EPSRC funded project running from 2005 to 2010. The project lead is Thomas Wennekers.


CARMEN "Code analysis, repository, and modelling for e-Neuroscience" is an EPSRC funded project locally led by Roman Borisyuk (project total £4,037,770, Plymouth receiving £103,000, 2006-2010).

Distributed control in a swarm of UAVs (AFOSR/EOARD)

Project funded by the US Air Force Lab (EOARD) led by Angelo Cangelosi ($149,446, 2007-2010).

ITALK (FP7 Challenge 2)

The ITALK project "Integration and Transfer of Action and Language Knowledge in Robots" is an Integrated Project under the 7th Framework Programme, studying how language and action interact with cognition. This €6.25M project (of which Plymouth receives €1.68M) running from 2008 to 2012 will use a humanoid robot, the iCub, as a test platform and will help us understand how intelligence can be recreated. Plymouth, with Angelo Cangelosi and Tony Belpaeme, acts as coordinator of ITALK.


The Mars Rover research project, funded by the Advanced Concepts Team at the European Space Agency (Ariadna Programme), investigates the role of the island model in the optimisation of controllers for autonomous space robots. The project is led by Angelo Cangelosi, Davide Marocco and Martin Peniak, with the assistance of Barry Bentley.
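In the island model, a single evolving population is replaced by several subpopulations ("islands") that evolve independently and periodically exchange their best individuals, which helps preserve diversity and parallelises naturally. A minimal sketch of the idea on a toy objective (all function names and parameter values below are illustrative, not taken from the project):

```python
import random

def fitness(x):
    # Toy objective: maximise the negative squared distance from the origin
    return -sum(v * v for v in x)

def mutate(x, sigma=0.1):
    # Gaussian perturbation of every component
    return [v + random.gauss(0, sigma) for v in x]

def evolve_island(pop, n_offspring=20):
    # Simple (mu + lambda) step: generate mutants, keep the fittest
    offspring = [mutate(random.choice(pop)) for _ in range(n_offspring)]
    combined = sorted(pop + offspring, key=fitness, reverse=True)
    return combined[:len(pop)]

def island_model(n_islands=4, pop_size=10, dim=5,
                 generations=100, migration_interval=10):
    islands = [[[random.uniform(-5, 5) for _ in range(dim)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(generations):
        islands = [evolve_island(pop) for pop in islands]
        if (gen + 1) % migration_interval == 0:
            # Ring migration: best of island i replaces worst of island i+1
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop.sort(key=fitness)
                pop[0] = bests[(i - 1) % n_islands]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)

best = island_model()
```

The migration interval and topology (here a simple ring) are the key design choices: infrequent migration keeps the islands exploring different regions of the search space, which is the property exploited when evolving robot controllers in parallel.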

P-ARTS (Apple Inc.)

The P-ARTS project (Plymouth Advanced Robot Training Suite) is an Apple Research and Technology Support project funded in kind by Apple. Davide Marocco, Tony Belpaeme, Angelo Cangelosi and Rob Ellis will use Apple Xserve machines to support cognitive robotics research.

SCANDLE (FP7 Challenge 2)

The SCANDLE project is an exciting collaboration between five leading European institutions:
  • University of Plymouth: Dr Susan Denham, Dr Thomas Wennekers
  • Magyar Tudományos Akadémia Pszichológiai Kutatóintézet: Prof István Winkler
  • Institute of Neuroinformatics, University of Zurich: Dr Giacomo Indiveri
  • University of Cyprus: Prof Andreas Andreou, Dr Julio Georgiou
  • Carl von Ossietzky Universität: Prof Georg Klump
Over the next three years we will be developing a system that will be able to identify and distinguish living beings from inanimate objects, on the basis of sound alone: a cognitive acoustic scene analysis system. This system will be able to construct composite representations of living beings exclusively through the use of information derived from sounds. This will happen in two ways:
1. in a passive way, to detect sounds generated or caused by living beings;
2. in an active way, using a newly developed microsonar device. This device will emit sounds that bounce off objects. The system will learn to recognise patterns in the sounds that are returned.

Virtual Research Centre in Personal Robotics (EPSRC)

The Virtual Research Centre in Personal Robotics is an EPSRC funded project (£118,458, 2007-2010) led by Guido Bugmann to create a virtual UK-wide research centre for personal robotics.


VALUE "Vision, Action, Language Unified by Embodiment" is an EPSRC funded project (£809,132, of which Plymouth receives £507,722, 2008-2011) between the University of Dundee and Plymouth.

Current research themes

Neuro-Inspired (NIR) Robotics Group

The Neuro-Inspired Robotics (NIR) group focuses on applying brain-inspired modelling methods to the design of cognitive capabilities in humanoid robots. The group includes leading experts in brain modelling (Roman Borisyuk, Thomas Wennekers), cognitive systems (Tony Belpaeme, Angelo Cangelosi) and interactive robotics (Guido Bugmann). The projects COLAMN, ITALK, ROBOTDOC, VRCPS and ALIZ-E are the core active research programmes of the NIR group.

Fabric Manipulation

The overall aim is the development of robot skills required for the manipulation of fabrics in an unstructured environment, e.g. a home or laundry. This project focuses on sorting tasks, such as those conducted before placing clothes in a washing machine. The objective is the development of artificial vision and manipulation algorithms enabling a small humanoid robot to conduct fabric sorting tasks. Contact Peter Gibbons, Phil Culverhouse or Guido Bugmann.

Bio-mimetic robotic cognitive processing

This research theme has two main aims. (1) To track and interpret human eye gaze to improve our understanding of cognitive processes and (2) to build a robot which organises its decision making based on insights from human decision making gained through eye tracking. Contact Chris Ford, Guido Bugmann, Phil Culverhouse.

MIBL: Multimodal IBL. Teaching a personal robot how to play a card game

The overall aim is the development of human-robot interfaces allowing the instruction of robots by untrained users, using communication methods natural to humans. This project focuses on card game instructions, in a scenario where the user of a personal robot wishes to play a new card game with the robot, and needs to first explain the rules of the game. Game instructions are a good example of more general instructions to a personal robot, due to the range of instruction types they contain: sequences of actions to perform and rules to apply. The objective is to develop a robot-student able to understand instructions from the human teacher and integrate them in a way that supports game-playing behaviour. The project relies on recordings of a corpus of instructions between a human teacher and a human student. The corpus is then used to define the speech recognition grammar, analyse the format of rules conveyed in natural language, and list the skills required by the robot to execute the instructions. This approach is termed "corpus-based robotics".
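As an illustration of what a corpus-derived speech recognition grammar might look like, the fragment below encodes a few instruction patterns as production rules and matches utterances against them. Every rule name and phrasing here is invented for illustration, not taken from the project's actual corpus:

```python
import re

# Hypothetical grammar fragment: an instruction is an action applied to a
# card phrase, optionally followed by a condition clause.
GRAMMAR = {
    "ACTION": r"(draw|play|discard)",
    "CARD": r"(a card|the top card|an ace)",
    "CONDITION": r"(if you cannot play|when it is your turn)",
}
INSTRUCTION = re.compile(
    rf"^{GRAMMAR['ACTION']} {GRAMMAR['CARD']}( {GRAMMAR['CONDITION']})?$"
)

def parse_instruction(utterance):
    """Return (action, card, condition) for a recognised instruction, else None."""
    m = INSTRUCTION.match(utterance.lower())
    if not m:
        return None
    return m.group(1), m.group(2), m.group(4)

result = parse_instruction("Draw a card if you cannot play")
# result -> ("draw", "a card", "if you cannot play")
```

A real corpus-based grammar would be far larger and derived from the recorded teacher-student dialogues, but the principle is the same: the corpus constrains the recogniser to the phrasings users actually produce.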

Natural Object Categorization: Recognizing species of plankton

The aim of our research is to investigate visual object recognition in experts and apply that knowledge to machine recognition. In particular we are interested in expert perception of natural objects and scenes rather than 'novice' or normal perception. Expert perception is characterised by a period of training, which is required to ensure that perceptions meet the criteria for expert behaviour. Projects include expert plankton categorisation and cytological smear slide assessment. The work also extends our knowledge of visual perception in general.
Since 1989 the Natural Object Categorisation group at Plymouth University has been developing machine vision systems to categorise marine plankton. The group has focussed on the difficult task of discriminating microplankton, as it is a good model for investigating top-down influences of expert judgements on bottom-up processes. It has been particularly revealing to explore the issues of recognition in a target group of objects where natural morphological variation within species causes experts difficulties. An operational machine (known as DiCANN) has been constructed and extensively tested in the laboratory with field-collected specimens of a wide range of plankton species, from fish larvae and mesozooplankton to the dinoflagellates of the microplankton. DiCANN employs multi-resolution processing, 'what and where' coarse channel analysis, and support vector machine categorisation.
The HAB Buoy project concluded with the construction of four Harmful Algal Bloom monitoring systems that have been deployed to partner sites in Italy, Spain and Ireland. The systems possess digital microscopes and DiCANN recognition software. Using a precision pumped water system, they sample 375ml per hour and image to 1 micron resolution. The DiCANN software is capable of recognising specimens that are greater than 20 micron, both phytoplankton and zooplankton, for monitoring purposes.
Projects include expert plankton categorisation, motion analysis, texture processing and cytological smear slide assessment. Contact Phil Culverhouse.

Completed research projects

  • Brain-Inspired Neuronal Model of Attention and Memory, EPSRC, £149,656, led by Roman Borisyuk.
  • Instruction-based learning for mobile robots, EPSRC, 1999-2003, led by Guido Bugmann. This joint project between the University of Plymouth and the University of Edinburgh explored a still-untapped method of knowledge acquisition and learning by intelligent systems: the acquisition of knowledge from Natural Language (NL) instruction. This is very effective in human learning and will be essential for adapting future intelligent systems to the needs of naive users. The aim of the project was to investigate real-world Instruction-Based Learning (IBL) in a generic route instruction task. Users engaged in a dialogue with a mobile robot equipped with artificial vision, in order to teach it how to navigate a simplified maze-like environment. The experimental set-up limited perceptual and control problems and also reduced the complexity of NL processing. The research focused on the problem of how NL instructions can be used by an intelligent embodied agent to build a hierarchy of complex functions based on a limited set of low-level perceptual, motor and cognitive functions. We investigated how the internal representations required for robot sensing and navigation can support a usable speech-based interface. Given the use of artificial vision and voice input, such a system can contribute to assisting visually impaired people and wheelchair users.