Images have become ubiquitous in recent years, and they are now used in applications spanning everyday life, scientific research and industrial manufacturing.
Members of the Computer Vision group are active in a wide range of fundamental and applied research projects in this area.
We have extensive experience across computer vision and image and video processing. We have worked closely with the Schools of Psychology, Engineering, Dentistry, Medicine and Optometry, and have also collaborated with industrial and external partners including Renishaw, Airbus, British Aerospace, the Welsh Rugby Union, local police forces, councils and health services.
Aims
We aim to attract more high-quality PhD students and to secure further funding from UKRI and industry.
Research
Examples of prior projects include:
- machine and deep learning of imagery and video
- high-resolution surface reconstruction
- computational archaeology
- analysis of high-dimensional image feature spaces
- articulated human motion analysis
- sports video analysis
- video surveillance applications
- modelling crowd behaviour
- visual media quality assessment
- forgery detection
- visual attention modelling and applications
- non-photorealistic rendering
- generative image models
- reflection-aware visual SLAM
- speech-driven facial modelling and video synthesis
- image registration
Examples of applications of our research include:
- visual avatar for virtual delivery of healthcare
- segmentation of 3D OCT scans of retinas
- the perception of trustworthiness from smiles
- determining the effectiveness of surgery from facial morphology and temporal dynamics
- analysing the effects of alcohol on crowd dynamics and violence
- digital unrolling of fragile parchments from 3D X-ray scans
- analysis of birch bark manuscripts
Projects
Project name: AI Natural Language Interface for Directing Virtual Character Performances
Funded by: EPSRC XR Network+
Principal investigator: Dr Yipeng Qin
Project name: Uncovering the “Instincts” of Deep Generative Models for Fair and Unbiased Visual Content Creation
Funded by: EPSRC DTP
Principal investigator: Dr Yipeng Qin
Project name: Prototyping Smart Clothes for Population-level Motion Monitoring and Analysis
Funded by: Royal Society
Principal investigator: Dr Yipeng Qin
Project name: Charting New Frontiers: An Exploratory Expedition and Pilot Study on Chattable Virtual Avatars, Unveiling Ethical and Social Dimensions in Content Delivery
Funded by: GW4
Principal investigator: Dr Yipeng Qin
Project name: Revolutionizing Visitor Experiences of Cultural Heritage: Unleashing the Power of AI-Powered Chattable Avatars in Interactive Exhibitions
Funded by: AHRC
Principal investigator: Dr Yipeng Qin
Project name: Integration of video-realistic avatar capability into an artificial intelligence driven healthcare information platform
Funded by: KTP
Principal investigator: Professor David Marshall
Project name: A BioEngineering approach for the SAFE design and fitting of Respiratory Protective Equipment (BE-SAFE RPE)
Funded by: EPSRC
Co-principal investigators: Professor David Marshall and Professor Paul Rosin
Meet the team
Events
Members and visitors present seminars as part of the visual computing research seminar series.
Previous events
- Chattable Virtual Avatars for Museum & Archive Workshop (Dr Barbara Caddick of the University of Bristol and Dr Yipeng Qin, March 2024)
- Cardiff University and National Trust Workshop on AI Avatars (Dr Daniel Finnegan and Dr Yipeng Qin, March 2024)
- Past Meets Future: Can AI Personas Bring Historic Figures to Life? (Dr Yipeng Qin, March 2024)