This project will apply advanced visualization and analysis research to the massive quantities of data produced by internet-connected water sensors. Managing precious water reserves is a key challenge facing the world. The company we will work with is a world-leading developer of sensors for the water industry, producing products that measure water flow, pinpoint leaks, monitor flood risk from rivers, etc. All of these devices are connected to the internet, allowing real-time monitoring of the environment and of water distribution down to the customer level. The company is currently undertaking a major infrastructure project to install leak, pressure and flow sensors throughout the water distribution network.
Application deadline: Friday 25th November 2015. Start date: January 1st 2016. Further details and application.
In this work, we investigate whether it is possible to distinguish conversational interactions by observing human motion alone, in particular subject-specific gestures in 3D. We adopt Kinect sensors to obtain 3D displacement and velocity measurements, followed by wavelet decomposition to extract low-level temporal features. These features are then generalized to form a visual vocabulary, which is further generalized to a set of topics derived from the temporal distributions of the visual vocabulary. A subject-specific supervised learning approach based on Random Forests is used to classify the test sequences into seven different conversational scenarios. The conversational scenarios concerned in this work have rather subtle differences among them. Unlike typical action or event recognition, each interaction in our case contains many instances of primitive motions and actions, many of which are shared among different conversational scenarios. That is, the interactions we are concerned with are not micro or instantaneous events, such as hugging or a high-five, but rather interactions over a period of time that consist of rather similar individual motions, micro actions and interactions. We believe this is among the first works devoted to subject-specific conversational interaction classification using 3D pose features, and it shows that this task is indeed possible.
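To make the classification stage concrete, here is a minimal sketch of the bag-of-words step described above: each sequence of visual-word indices is turned into a normalised histogram, and a Random Forest is trained to map histograms to scenario labels. The helper name `bow_histogram`, the vocabulary size, and the toy data are all illustrative assumptions, not the paper's actual pipeline or code.

```python
# Sketch (assumed, not the paper's code): bag-of-visual-words histograms
# per sequence, classified with a Random Forest via scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def bow_histogram(word_ids, vocab_size):
    """Normalised histogram of visual-word occurrences for one sequence."""
    counts = np.bincount(word_ids, minlength=vocab_size).astype(float)
    return counts / max(counts.sum(), 1.0)

# Toy stand-in data: 70 sequences of 200 visual-word indices each,
# labelled with one of seven conversational scenarios (10 per scenario).
rng = np.random.default_rng(0)
vocab_size = 50
X = np.array([bow_histogram(rng.integers(0, vocab_size, size=200), vocab_size)
              for _ in range(70)])
y = np.repeat(np.arange(7), 10)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
predictions = clf.predict(X)
print(predictions.shape)
```

In the paper the histograms are built from learned wavelet-feature clusters rather than random indices, and the topic layer sits between the vocabulary and the classifier; this sketch only illustrates the final supervised step.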
J. Deng, X. Xie, and B. Daubney, A bag of words approach to subject specific 3D human pose interaction classification with random decision forests, Graphical Models, Volume 76, Issue 3, Pages 162–171, May 2014.
More details can be found at the Swansea Vision website.
Congratulations to Arron Lacey who successfully defended his MSc by Research thesis, titled “Supervised Machine Learning Techniques in Bioinformatics: Protein Classification”.
Xianghua Xie was the supervisor, the two external examiners were Reyer Zwiggelaar (Aberystwyth) and Yulia Hicks (Cardiff), and Parisa Eslambolchilar was the viva chair.
Congratulations to Farhan Mohamed, who today, successfully defended his PhD thesis: Metaphoric Visualisation.
Phil Grant was the supervisor, Mark W. Jones the internal examiner, Roy Ruddle (Leeds) the external and Ulrich Berger the viva chair.
Joel Dearden and Aris Tsitiridis start working here as RIVIC RAs with Mark W. Jones and Ben Mora today.
Dr. Xianghua Xie joined the BMVA executive committee. The BMVA provides a national forum for individuals and organisations involved in machine vision, image processing, and pattern recognition in the United Kingdom.
Dr. Xianghua Xie was appointed as an Associate Editor of the IET Computer Vision journal. The vision of the journal is to publish the highest-quality research work that is relevant and topical to the field, without forgetting those works that aim to introduce new horizons and set the agenda for future avenues of research in Computer Vision.
The BBC reports on the Welsh Rugby Union’s use of our MatchPad iPad visualization software.
New Scientist published an online article about our research helping biologists understand the complex animal-motion data they collect from accelerometers.
Congratulations to Ed Grundy on winning the Best Paper prize at EuroVis 2009. The paper, Visualization of Sensor Data from Animal Movement, explores how visualization can inform biologists about animal motion through data collected from accelerometry tags. We worked with Rory Wilson’s smart tag group here at Swansea.