
Our projects
-
Deep Earth Observation
Lead: Professor James Geach
We are innovating analysis techniques in Earth Observation (EO). Astrophysics is the ultimate remote sensing challenge. We figured that the problem of inferring the properties of distant galaxies from a limited set of (very) remote observations is no different from inferring the surface properties of the Earth using data taken from orbit. Complex data, weak signals and buried information are our bread and butter. So we are applying our experience in astrophysics-based hyperspectral image and time-series analysis to EO, bringing to bear some of the latest AI techniques. We are particularly interested in applications of Synthetic Aperture Radar (SAR) observations, which are insensitive to cloud cover.
Our first major project, ClearSky, predicts the full 400-2300 nm visible/infrared ground response from SAR imaging alone, allowing one to derive intelligence about ground conditions (such as the health or growth stage of crops) even in the presence of cloud. We are working closely with satellite communications innovator and space gateway Goonhilly Earth Station Ltd., a member of NVIDIA's Inception Programme. Goonhilly is a legendary satellite data hub with outstanding data connectivity. Recently, GES has established a powerful centre for AI-accelerated data processing: Goonhilly's new green Tier 3 data centre hosts an AI & Deep Learning optimised data platform, which includes the NVIDIA® DGX-1™ supercomputer. This is enabling us to perform truly deep and innovative Earth Observation analysis at lightning speed.
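The core idea of SAR-to-optical prediction can be shown in miniature. The sketch below is purely illustrative and is not the ClearSky model (which uses deep learning over the full spectral range): it fits a toy per-pixel linear map from made-up VV/VH radar backscatter values to a single optical band, just to demonstrate the regression setup.

```python
# Toy sketch of SAR-to-optical regression. NOT the actual ClearSky model;
# the VV/VH inputs, the single-band target, and all values are illustrative.

def predict(w, x):
    """Linear per-pixel model: band = w0 + w1*VV + w2*VH."""
    return w[0] + w[1] * x[0] + w[2] * x[1]

def fit(samples, targets, lr=0.5, epochs=2000):
    """Plain gradient descent on mean squared error."""
    w = [0.0, 0.0, 0.0]
    n = len(samples)
    for _ in range(epochs):
        grad = [0.0, 0.0, 0.0]
        for x, t in zip(samples, targets):
            err = predict(w, x) - t
            grad[0] += err
            grad[1] += err * x[0]
            grad[2] += err * x[1]
        for j in range(3):
            w[j] -= lr * grad[j] / n
    return w

# Synthetic "ground truth" relation: band = 0.2 + 0.5*VV - 0.3*VH
train_x = [(0.1, 0.2), (0.4, 0.1), (0.7, 0.5), (0.3, 0.9), (0.9, 0.3)]
train_y = [0.2 + 0.5 * vv - 0.3 * vh for vv, vh in train_x]
w = fit(train_x, train_y)
```

A deep model replaces the linear map with a learned non-linear one over whole image patches, but the input/output framing is the same.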
-
Machine learning in Chemical Space
Lead: Dr Michael Schmuker
Chemical information is omnipresent in our everyday lives: from food ingredients and aromas, through plant and animal signals, to pharmaceutical drugs. Yet “chemical space” has been notoriously difficult for data scientists to tackle. One great challenge lies in computational olfaction: to predict, in silico, whether a given molecule has a smell, and what that smell might be. In the DEEPFRAGRANCE project, we are applying cutting-edge machine learning methods to predict the “fragrance-likeness” of a vast number of volatile compounds and their potential smell. This method can support fragrance and aroma chemists in discovering new and sustainable compounds for food and leisure products. Our previous efforts in this area dealt with elucidating the “olfactory code”, that is, how the nose achieves its remarkable ability to rapidly detect and identify odorants with a precision that is still unmatched by any artificial approach.
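One classical baseline for a "likeness" score in chemical space is similarity to known examples in a binary substructure-fingerprint representation. The sketch below is a hedged illustration only: the fingerprints are made-up sets of on-bits, and DEEPFRAGRANCE itself uses modern machine learning rather than this nearest-neighbour rule.

```python
# Illustrative only: toy fingerprints (sets of on-bit indices) stand in for
# real molecular substructure fingerprints such as those computed by RDKit.

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two binary fingerprints."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def fragrance_likeness(query, known_odorants):
    """Score a candidate by its maximum similarity to any known fragrant molecule."""
    return max(tanimoto(query, ref) for ref in known_odorants)

# Hypothetical reference odorants and candidate molecule
known = [{1, 4, 7, 9}, {2, 4, 8}, {1, 3, 7}]
candidate = {1, 4, 7}
score = fragrance_likeness(candidate, known)  # closest match: {1, 4, 7, 9}
```

A learned model generalises beyond lookup-style similarity, but this framing (molecule in, fragrance-likeness score out) is the same.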
-
Neuromorphic Olfaction
Lead: Dr Michael Schmuker
Neuromorphic computing is an emerging technology that takes inspiration from the brain to achieve low-latency and low-power artificial intelligence. Key principles of neuromorphic computing are: a massively parallel architecture with lightweight compute units; communication via small, timed messages; and event-based signal representation and processing. Substantial gains in power efficiency and reductions in latency are expected from this technology compared to current methods in AI. The challenge is to develop novel algorithms that fully embrace the event-based sensing and processing paradigm, and thus unlock the full potential of event-based platforms such as the SpiNNaker neuromorphic hardware system. Dr Schmuker and his team have developed an event-based electronic olfaction system that achieves low-latency detection of odorants. They are pursuing applications of this new technology in gas-based robotic navigation and agriculture.
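The event-based paradigm can be sketched with a single leaky integrate-and-fire (LIF) neuron: state is updated only when an input event arrives, with the leak between events applied analytically rather than on a fixed clock. All parameters below are illustrative assumptions, not those of any SpiNNaker deployment or of the team's olfaction system.

```python
# Minimal event-driven LIF neuron sketch; tau, weight and threshold are
# illustrative values chosen for this example only.
import math

def lif_events(in_events, tau=10.0, weight=0.6, threshold=1.0):
    """Return output spike times for a list of input event times."""
    v, t_last, out = 0.0, 0.0, []
    for t in sorted(in_events):
        v *= math.exp(-(t - t_last) / tau)  # exact exponential leak since last event
        v += weight                          # integrate the incoming event
        t_last = t
        if v >= threshold:                   # fire and reset
            out.append(t)
            v = 0.0
    return out

# Two rapid bursts each cross threshold; isolated events decay away.
spikes = lif_events([1.0, 2.0, 30.0, 31.0, 32.0])  # -> [2.0, 31.0]
```

Note that no computation happens between events; this sparsity is where the power and latency advantages of event-based hardware come from.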
-
Night-time Sky/Cloud Classification
Lead: Professor Hugh Jones
As it stands, there is no reliable automated way of determining whether, at any given moment, the sky is clear enough for a telescope to begin observations, nor how good the conditions are when deciding whether to pursue photometric or spectroscopic observations. The project is to select a suitable training set of all-sky camera images, along with other data, and to train a machine-learning algorithm to do the job. The local Bayfordbury all-sky camera provides a long-running set of images and, importantly, has plenty of data under the full range of cloudy and clear skies, along with a wide range of ancillary weather data including LIDAR monitoring. The initial aim of the project is to determine whether a given image is completely cloudless (one part of the criteria for photometric skies), spectroscopic (i.e. good enough for the telescope to open), or too cloudy to be useful.
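The three target classes can be stated as a simple rule-based baseline, which a trained classifier would then replace. This is a sketch under stated assumptions: `cloud_fraction` stands in for an estimated fraction of cloudy pixels in an all-sky image, and the 0.5 threshold is an illustrative placeholder, not a value from the project.

```python
# Illustrative baseline only; the project trains a machine-learning model on
# Bayfordbury all-sky images rather than thresholding a single feature.

def classify_sky(cloud_fraction):
    """Map an estimated cloud-cover fraction to the three observing classes."""
    if cloud_fraction == 0.0:
        return "photometric"      # completely cloudless
    if cloud_fraction < 0.5:      # threshold is an assumed placeholder
        return "spectroscopic"    # good enough for the telescope to open
    return "too cloudy"

label = classify_sky(0.2)  # -> "spectroscopic"
```

In practice a learned model would also use the ancillary weather and LIDAR data, and would estimate cloud cover from raw pixels rather than receiving it as an input.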