From their early dawn chorus to their late-night conversations, the birds that call CERN home can be heard across campus at all hours. Working out how many species are singing is the goal of “Ambient”, a new artificial intelligence (AI) initiative to develop autonomous, solar-powered audio detectors that can identify birds by their song. Following the promising data gathered by the first 11 detectors, deployed in May, the project will roll out 40 more before the end of the year.
“Far from being a concrete jungle, CERN actually has huge areas of greenery,” said Felix Schachtschneider, a member of Ambient from CERN’s Site and Civil Engineering department. Out of CERN’s 625 hectares, 415 cover non-built environments, which host a variety of animal and plant species.
To help improve CERN’s biodiversity, the Organization first needs to assess how many species already call its campus home. Ambient, a collaboration between CERN’s Site and Civil Engineering (SCE) department, its Occupational Health, Safety and Environmental Protection (HSE) unit and the University of Côte d’Azur’s Laboratory for Electronics, Antennae and Telecommunications (LEAT), has been investigating how to efficiently monitor bird populations across the CERN sites.
Ambient’s wooden “birdhouses” are actually designed to keep birds out: they house the detectors instead. Powered by a small solar panel on the birdhouse roof, each detector contains a circuit board that autonomously identifies the birdsong picked up by its microphone. Traditional detectors, by contrast, send the entire audio recording over Wi-Fi to a data centre, a process that consumes far more energy than a small solar panel could provide; Wi-Fi’s short range also limits how far such energy-intensive detectors can be placed from the data centre.
“We decided to work with a low-energy-consuming AI which processes data directly on the sensor itself. Every time it detects a bird, it will only send us which bird it has detected, instead of the whole recording,” explained Schachtschneider. This onboard neural network was developed using LEAT’s expertise in EdgeAI, which combines AI with computing that takes place close to the data source.
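The article does not publish the detector firmware, but the pattern Schachtschneider describes, classify on the device and transmit only the result, can be sketched in a few lines. The sketch below is purely illustrative: every name (record_clip, classify, send_detection, the species list, the thresholds) is a hypothetical stand-in, not Ambient’s actual code.

```python
# Illustrative sketch of an edge-inference loop: classify on the device,
# send only the result. All names and values here are hypothetical
# stand-ins, not Ambient's firmware.
import random
import time

SPECIES = ["chiffchaff", "chaffinch", "blackbird"]  # example local species

def record_clip(seconds: float) -> list[float]:
    """Stand-in for sampling the microphone: returns fake audio samples."""
    return [random.uniform(-1.0, 1.0) for _ in range(int(seconds * 16000))]

def classify(clip: list[float]) -> tuple[int, float]:
    """Stand-in for the onboard neural network.
    Returns (species index, confidence); here it simply guesses."""
    return random.randrange(len(SPECIES)), random.random()

def send_detection(species_index: int, confidence: float) -> None:
    """Stand-in for the radio link: only a few bytes leave the device,
    never the audio itself."""
    print(f"sent: species={SPECIES[species_index]} confidence={confidence:.2f}")

while True:
    clip = record_clip(seconds=3.0)            # short audio window
    species_index, confidence = classify(clip)
    if confidence > 0.8:                       # report only confident detections
        send_detection(species_index, confidence)
    time.sleep(10)                             # idle between windows to save solar power
```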
Originally based on BirdNET, an open-source AI model that can detect more than 3000 bird species, Ambient’s network has been trained to specialise in 14 local species to reduce energy consumption. CERN and LEAT contributors used raw audio recordings from the first 11 detectors to train this compact AI model, which will be further improved with the roll-out of the next 40. The detectors can communicate with the data centre from many kilometres away via a Long Range Wide Area Network (LoRaWAN), making them far easier to deploy near wildlife.
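One reason this architecture suits LoRaWAN is payload size: LoRaWAN frames carry only on the order of tens of bytes, far too little for raw audio but ample for a species identifier. As a rough illustration of how small such a report can be, the byte layout below is an assumption for the sake of the example, not Ambient’s actual message format.

```python
# Illustrative only: one possible byte layout for a detection report,
# not Ambient's actual format.
import struct
import time

def encode_detection(species_index: int, confidence: float) -> bytes:
    """Pack a detection into 6 bytes:
    1 byte species index, 1 byte confidence (0-255), 4 bytes Unix time."""
    return struct.pack(
        ">BBI",
        species_index,
        int(confidence * 255),
        int(time.time()),
    )

payload = encode_detection(species_index=3, confidence=0.92)
print(len(payload), payload.hex())  # 6 bytes, e.g. '03ea...' - a tiny fraction of a LoRaWAN frame
```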
CERN’s work to protect and increase biodiversity “benefits not only the environment, but people’s well-being,” noted Schachtschneider. This AI technology also has a wide range of alternative applications, such as the detection of water system leaks, which could one day help reduce energy usage across industry.
Can you tell your chiffchaffs from your chaffinches? Below are five audio clips recorded by Ambient. Can you match the bird to its song?
(Videos: CERN)