Queensland researchers are programming drones to monitor coral bleaching on the Great Barrier Reef.

QUT’s remote sensing and unmanned aerial vehicle (UAV) experts have partnered with the Australian Institute of Marine Science (AIMS) to test whether small drones, machine learning and specialised hyperspectral cameras can monitor the Great Barrier Reef more quickly, more efficiently and in more detail than manned aircraft and satellite surveys.

A team led by QUT project leader Associate Professor Felipe Gonzalez has surveyed three reefs in the Great Barrier Reef Marine Park from 60 metres in the air, while AIMS divers recorded precise levels of coral bleaching from under the water.

“By taking readings from the air and verifying them against the AIMS data from below the surface, we are teaching the system how to see and classify bleaching levels,” said Professor Gonzalez, an aeronautical engineer.

“Flying 60 metres above the water gives us a spatial resolution of 9.2 centimetres per pixel, which we’ve found to be more than enough detail to detect and monitor individual corals and their level of bleaching.

“This is great news for us because low-altitude drones can cover far more area in a day than in-water surveys, and they’re not hampered by cloud cover the way manned aircraft and satellites are – a system like this has real potential to boost the frequency of monitoring activities in an economical way.

“The more data scientists have at their fingertips during a bleaching event, the better they can address it. We see small drones with hyperspectral cameras acting as a rapid response tool for threatened reefs during and after coral bleaching events.”
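That 9.2-centimetre figure follows from simple ground-sample-distance (GSD) arithmetic relating flight altitude, lens focal length and sensor pixel pitch. The article doesn’t list the camera’s optics, so the focal length and pixel pitch in this sketch are hypothetical values, chosen only to reproduce a resolution in the same ballpark:

```python
# Ground sample distance (GSD): the ground footprint of one pixel
# for a nadir-pointing camera. Camera parameters here are hypothetical;
# only the 60 m altitude and ~9.2 cm result come from the article.

def gsd_metres(altitude_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """GSD = altitude * pixel pitch / focal length."""
    return altitude_m * pixel_pitch_m / focal_length_m

altitude = 60.0         # flight altitude in metres
pixel_pitch = 12.3e-6   # sensor pixel pitch in metres (hypothetical)
focal_length = 8e-3     # lens focal length in metres (hypothetical)

gsd = gsd_metres(altitude, pixel_pitch, focal_length)
print(f"GSD at {altitude:.0f} m: {gsd * 100:.1f} cm/pixel")  # -> 9.2 cm/pixel
```

The same relationship explains the trade-off the researchers face: halving the altitude halves the GSD, but also shrinks the area covered per flight line.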

The Great Barrier Reef comprises around 3,000 individual reefs stretching along 2,300 kilometres, making it slow and costly to survey with traditional methods.

Central to the new aerial system are miniaturised hyperspectral cameras, which until recently were so large and expensive that only satellites and manned aircraft could carry them.

Standard cameras record images in three bands of the visible spectrum – red, green and blue – mixing those bands together to create colours as humans see them.

Professor Gonzalez said the hyperspectral camera, by comparison, captures 270 bands in the visible and near-infrared portions of the spectrum, providing far more detail than the human eye can see and at an ultra-high resolution.
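To make that difference concrete, here is a minimal sketch of the array shapes involved, with made-up frame dimensions (only the band counts – 3 versus 270 – come from the article):

```python
import numpy as np

# Made-up frame dimensions; only the band counts come from the article.
height, width = 1024, 1024

# A standard camera frame: one red, green and blue value per pixel.
rgb_frame = np.zeros((height, width, 3), dtype=np.uint8)

# A hyperspectral cube: 270 narrow visible/near-infrared bands per pixel.
hyperspectral_cube = np.zeros((height, width, 270), dtype=np.float32)

# Each pixel of the cube is a full spectrum rather than a single colour.
pixel_spectrum = hyperspectral_cube[0, 0, :]
print(pixel_spectrum.shape)  # -> (270,)
```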

“You can’t just watch hyperspectral footage in the same way we can watch a video from a standard camera – we must process all the data to extract meaning from it,” Professor Gonzalez said.

“We’re building an artificial intelligence system that processes the data by identifying and categorising the different ‘hyperspectral fingerprints’ for objects within the footage.

“Every object gives off a unique hyperspectral signature, like a fingerprint. The signature for sand is different to the signature for coral and, likewise, brain coral is different to soft coral.

“More importantly, an individual coral colony will give off different hyperspectral signatures as its bleaching level changes, so we can potentially track those changes in individual corals over time.

“The more fingerprints in our database, the more accurate and effective the system.”
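The article doesn’t detail QUT’s machine-learning pipeline, but a common baseline for matching a pixel’s spectrum against a library of reference “fingerprints” is the spectral angle mapper (SAM). The sketch below uses random placeholder spectra where a real system would use field-verified references, such as pixels checked against the AIMS dive data:

```python
import numpy as np

def spectral_angle(spectrum: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between two spectra; smaller means a closer match."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference)
    )
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical library of reference 'fingerprints' (270 bands each).
# Random values stand in for spectra verified against in-water surveys.
library = {
    "sand": np.random.rand(270),
    "healthy_coral": np.random.rand(270),
    "bleached_coral": np.random.rand(270),
}

def classify(pixel_spectrum: np.ndarray) -> str:
    """Label a pixel with the reference spectrum at the smallest angle."""
    return min(library, key=lambda name: spectral_angle(pixel_spectrum, library[name]))

print(classify(np.random.rand(270)))
```

Under this scheme, tracking an individual colony over time amounts to re-running the match on the same geolocated pixels across repeat flights and watching for the best-match label to shift from healthy towards bleached.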