AI 4 Corals: Developing our Autonomous Reef-Monitoring Catamaran
Development of automated and AI-based analytical tools for coral propagation projects
We are proud to announce the launch of our AI 4 CORALS fund-raising programme to help achieve our ambitious goals.
We are putting robots and artificial intelligence to work, to help save the coral reefs … and we need YOUR HELP!
Collecting consistent data on our large-scale coral propagation projects is a challenge that the Reefscapers team has been tackling for over ten years. Currently this is achieved through the diligence of our coral biologists, who manually photo-monitor each coral frame. We have amassed over 200,000 photographs in 15 years of operation. Unfortunately, this volume of data is difficult to analyse manually, and a wealth of information lies unexploited.
Computer Vision (CV), which extracts information from large sets of photographic data, combined with Artificial Intelligence (AI), is emerging as a possible replacement for the traditional bio-statistical framework, which proves mostly inconclusive in uncontrolled environments. It is therefore paramount to develop such tools for the future of coral propagation, allowing operations to be scaled up, which, if climate forecasts are to be trusted, may prove necessary in the years ahead.
Reefscapers is embarking on an exciting project along these avenues, using a two-pronged approach. An autonomous catamaran is being developed to make data collection more robust, as slight inconsistencies currently result in painstaking corrections. The solar-powered catamaran will use both a GPS autopilot, to move around the island, and CV techniques for close-range positioning. Once in position, the catamaran captures photographs and video with a camera that can be lowered to the appropriate depth. Retrieved data will be uploaded when in range and could potentially provide a live feed. In addition, the catamaran will constantly log parameters such as temperature, current and wave height.
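To give a feel for the GPS-autopilot leg of the navigation, here is a minimal sketch (not the actual flight code, and the function name is ours) of the waypoint geometry a boat like this relies on: given the current GPS fix and the next coral-frame waypoint, compute how far away it is and which compass heading to steer.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial compass bearing (degrees)
    from the current GPS fix (lat1, lon1) to a waypoint (lat2, lon2)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    # Haversine formula for the distance
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, measured clockwise from true north
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```

The autopilot would steer towards the returned bearing until the distance drops below a threshold, then hand over to the close-range CV positioning.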
Feature-matching for stereoscopic vision
Prototype design and construction
Good data collection is essential, but it is ineffective without the necessary data analysis. AI algorithms need to be applied to all aspects of coral reef propagation, from species identification to growth-rate calculations and correlation with environmental parameters. Many questions in coral research remain open, and some can only be answered through large-scale data analysis. For example: how important are various environmental factors to coral growth? How can we improve coral propagation methods? What makes individual colonies more resistant to bleaching events?
Thanks to our monitoring programme we have collected a huge amount of data: 47,000 monitoring datasets (4 pictures each) from around 6,200 coral frames, containing more than 300,000 individual coral fragments. This vast dataset warrants large-scale analysis to determine survival and growth rates, ultimately improving coral restoration techniques worldwide.
State-of-the-art artificial intelligence methods, including convolutional neural networks (CNNs), object classification and image segmentation, will be applied to extract valuable information from the collected raw data. Where is the coral fragment located in the picture? What is its size? What is next to it?
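Once a detector has returned bounding boxes, the three questions above become simple geometry. The sketch below is a hypothetical post-processing step (the function, labels and the 120 cm frame width are our own illustrative assumptions, not the project's actual code): it converts pixel boxes to centimetres using the known physical width of a coral frame, then finds each fragment's position, size and nearest neighbour.

```python
def summarise_detections(boxes, labels, frame_width_px, frame_width_cm=120.0):
    """Turn raw detector boxes (x1, y1, x2, y2 in pixels) into per-fragment
    position, size and nearest-neighbour information in centimetres."""
    # cm per pixel, assuming a roughly flat, fronto-parallel view of the frame
    scale = frame_width_cm / frame_width_px
    frags = []
    for (x1, y1, x2, y2), label in zip(boxes, labels):
        cx, cy = (x1 + x2) / 2 * scale, (y1 + y2) / 2 * scale
        frags.append({"label": label,
                      "centre_cm": (cx, cy),                      # where is it?
                      "size_cm": ((x2 - x1) * scale, (y2 - y1) * scale)})  # how big?
    for f in frags:  # nearest neighbour answers "what is next to it?"
        others = [g for g in frags if g is not f]
        if others:
            f["neighbour"] = min(
                others,
                key=lambda g: (g["centre_cm"][0] - f["centre_cm"][0]) ** 2
                            + (g["centre_cm"][1] - f["centre_cm"][1]) ** 2,
            )["label"]
    return frags
```

In practice the pixel scale would come from camera calibration rather than a single known width, but the same bookkeeping applies.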
Following on from the first prototype, which will validate the overall framework for the process, fine-tuning and additional development will be carried out on a second prototype, for which funding is required. The retractable camera will be replaced by an autonomous rover. This will be tethered to the catamaran and will be able to navigate to the different colonies/fragments and carry out basic maintenance tasks (e.g. cleaning the base of corals with water jets). More sensors will be installed to retrieve fundamental environmental parameters.
AI development will focus on a database of coral pictures used for automatic species identification. The program will provide species-specific, regularly updated growth-rate curves in near real time, together with summary information on mortality occurrences, to guide the coral biologists in their observations.
A. millepora polyps and Coral Bleaching 2016
The proposed project development will take place over a one year period, beginning April 2019. An overall costing of $100,000 USD is forecast. In addition to the expenditure for the technological and scientific components, these project costs include staff salaries, flights and lodging for two programmers. Programmers with the necessary coral reef and coral propagation training will be sourced and will join the existing team in the Maldives.
Deep learning framework for coral growth data
We have tens of thousands of coral frame photographs in our database, with the potential to analyse the visible coral growth. To do this, we will need to automatically identify coral fragments (and, we hope, their taxonomic family) and to calculate size/growth between consecutive monitoring dates. Conventional image processing software is unable to do this, so we will need to develop and use deep learning algorithms (artificial neural networks).
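The "growth between consecutive monitoring dates" step hides one subtlety: the camera sits at a slightly different distance on each visit, so each photograph has its own pixel scale. A minimal sketch (our own illustrative helper, assuming a per-photo cm-per-pixel scale is known from calibration) of the comparison:

```python
def fragment_growth(area_px_t0, area_px_t1, cm_per_px_t0, cm_per_px_t1):
    """Relative growth in projected area between two monitoring photographs.
    Each photo carries its own pixel scale, since camera distance varies
    between visits; areas are normalised to cm^2 before comparing."""
    a0 = area_px_t0 * cm_per_px_t0 ** 2   # area in cm^2 at the first date
    a1 = area_px_t1 * cm_per_px_t1 ** 2   # area in cm^2 at the second date
    return (a1 - a0) / a0                 # e.g. 0.6 means 60% larger
```

Without this normalisation, a photo taken 20 cm closer would register as spurious "growth" across every fragment on the frame.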
Development of our autonomous reef-monitoring catamaran to take our coral frame photographs
We have started to design and build a robotic camera system on a floating catamaran that, we hope, will be able to photograph all our coral frames automatically in 2019! This will enable us to scale up our operations, and to gather more data on coral growth.
For it to be autonomous (work independently and automatically), the system needs to know its position and be able to visually recognise frames. We are going to implement ‘Computer Vision’ algorithms that will enable the autopilot system to analyse the immediate environment, in real-time.
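One of the simplest Computer Vision building blocks for recognising a known object in a camera image is template matching by normalised cross-correlation. The brute-force sketch below (our own illustration of the general technique, not the catamaran's actual vision code, which would use faster, more robust methods) slides a reference view of a frame over the image and returns the best-matching position:

```python
import numpy as np

def locate_frame(image, template):
    """Brute-force normalised cross-correlation: find the (row, col) offset
    where the template (a reference view of a coral frame) best matches
    a grayscale camera image. Returns the offset and the match score in [-1, 1]."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    best, best_pos = -2.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = tn * np.sqrt((wc ** 2).sum())
            score = (wc * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

The offset of the match relative to the image centre tells the autopilot how far to nudge the catamaran to centre the frame in view.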
Construction of the catamaran is well under way, and we have now received the internal electronic components from overseas. We have also been working on our Artificial Intelligence system, developing software for identifying the coral fragments attached to our frames.
We use a pre-trained 'Faster R-CNN' model, with additional training on 170 manually annotated frame photographs to identify different objects (Pocillopora fragments, Acropora fragments, dead coral fragments). With the current model, processing one coral frame photograph takes about 10 seconds, with a promising success rate of 40-45% (mean Average Precision).
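For readers unfamiliar with the metric: mean Average Precision is built on "intersection over union" (IoU), which scores how well a predicted bounding box overlaps a hand-annotated one. The sketch below shows the core matching step only, not the full mAP computation (which also sweeps over confidence thresholds and averages across classes); the function names are ours.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def precision_at_iou(preds, truths, thresh=0.5):
    """Fraction of predicted boxes that match a previously unclaimed
    ground-truth box at or above the IoU threshold."""
    matched, tp = set(), 0
    for p in preds:
        for i, t in enumerate(truths):
            if i not in matched and iou(p, t) >= thresh:
                matched.add(i)
                tp += 1
                break
    return tp / len(preds) if preds else 0.0
```

A "success rate of 40-45%" thus means that, averaged over classes and thresholds, a little under half of the model's predictions line up well with the biologists' annotations.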
Once the performance has increased and the results are more reliable, we will be able to calculate survivability for all frames and link it to recorded parameters.
We have received delivery of our catamaran! We subsequently installed the solar panels and completed the electrical connections, and we are currently testing the battery before installing the remaining systems.
We are also currently training a more complex model for the identification of coral fragments in our frame monitoring pictures. After 14 days of training, the performance is slowly improving. In the meantime, we have configured a database and various data processing tools to handle the coral fragment pictures that have been detected by this algorithm. They will then be processed through a segmentation model, which will identify the fragment more precisely.
The new software model we have trained for detection of the coral fragments is now much more successful than the old one (mean Average Precision up from 45% to 70%). This means we now have an automated method of detecting and identifying coral species of any age, without human intervention.
The new model is a Faster R-CNN (Region-based Convolutional Neural Network) with a ResNet-101 backbone, pre-trained on the iNaturalist species dataset.
Work on the physical catamaran is continuing, as we continue to assemble and solder the electronic components.