All the Latest REEFSCAPERS News from our International Team of Marine Biologists
Reefscapers Partnership with Nausicaá Boulogne-Sur-Mer
NAUSICAÁ & REEFSCAPERS
We are proud of our 10-year partnership with Nausicaá aquarium, and grateful for their current “Operation Corail” Endowment Fund donations towards our Reefscapers coral restoration program (running until 3 March 2020).
Vacancy: Marine Environmental Communications Coordinator
Reefscapers is a marine consultancy company based in the Maldives, developing guest-oriented programmes for resorts through coral reef restoration, sea turtle rehabilitation, aquaculture and coastal protection.
Vacancy January 2020 - now closed
We are seeking a highly motivated and out-going Environmental Coordinator to start in a brand-new role, based at a 5-star island resort in the Maldives (details provided during interview). This position is an exciting opportunity for a great communicator with experience in Public Relations, a passion for the marine environment and a love for corals!
- Communicate conservation activities to resort guests & management, and the Reefscapers team.
- Write illustrated reports (in English) of activities, updates and targets.
- Develop marketing/communication strategies to promote Reefscapers' national conservation efforts.
- Run the Reefscapers social media accounts (FB, Instagram, LinkedIn) to engage audiences.
- Create content for press releases, magazine articles and public awareness.
- Conduct marine life presentations, guest-oriented snorkelling activities and other marine projects.
- Photograph and document marine life; collect data and perform data entry.
- Organise resort staff training in marine biology and conservation awareness.
- Manage and maintain all aspects of the resort’s Coral Propagation Programme.
- Manage the coral frame databases and photo updates for guest-sponsored frames.
- Conduct coral frame maintenance, photography & monitoring, and wider research studies.
- BSc/MSc in Marketing, Communications, Marine Biology or Environmental Science.
- Passionate about marine science and environmental conservation.
- Experienced in project management, photography, video editing, article/blog writing.
- Experienced with running promo campaigns across various social media platforms.
- Accurate computer skills and numeracy, including data collection and analysis.
- Experienced with Excel and databases (preferred); bonus: mapping software, programming.
- Previous experience of remote tropical island life in a customer-facing role (preferred).
- Outgoing personality, positive attitude, good team player, experience in giving awareness programs.
- Strong swimmer, qualified diver (preferred), able to work long hours in the lagoon under full sun.
- Physically fit non-smoker, able to carry 20kg weight.
- Good spoken & written English; multilingual preferred; organised and self-disciplined work ethic.
Marine Savers at Four Seasons
Marine Life from Lab to Ocean! – We refurbish our large marine aquarium, and take a microscopic look at the fish eggs from our breeding pairs of Maldivian Anemonefish. Catch up with REEFSCAPERS coral biologists, and learn about the ‘sea turtle stranding season’ in the Maldives.
Reefscapers Dynamic Duo! – Meet Martyna and Sorin, 2 dedicated interns working on our coral restoration program at Kuda Huraa. Join the dynamic duo as they start maintaining 2,000+ coral frames, collecting & transplanting coral fragments for 7 solid hours in the water each day! 💙
Updates From Our Marine Biology Teams (Sept 2019) – See the Maldivian anemonefish eggs, and watch the Linckia sea star regenerating itself from a single arm. Meet our apprentices and volunteers, helping out with our Reefscapers coral propagation projects.
Turtle Tracks, Jolly Jellies & Fishy Tales – Local school children learn about marine conservation, and come face to face with the Aurelia jellyfish in our Kreisel tank. Join us at the Baa Atoll Turtle Festival, and then follow Peggy’s oceanic journey on our interactive satellite map.
Meet Olivia, an English post-graduate student of Tropical Marine Biology. She joined us at Landaa to research our Reefscapers coral propagation programme as part of her Master’s degree.
Marine Savers May Musings – Read about the Moon jellyfish (Aurelia aurita) in our dedicated Kreisel tank. Our ‘Flying Turtles’ project has taken an interesting turn, and learn how you can contribute towards Maldivian turtle conservation by sending in your turtle photographs for ID.
Learn about Lotte’s new life at Landaa – helping with our Clownfish and jellies in the Fish Lab. There’s time to propagate some corals with our Reefscaping team, before caring for the injured turtle patients at our rescue centre and kick-starting the ‘Flying Turtles’ programme.
Updates From Landaa & Kuda Huraa (April 2019) – Take a front row seat at the Microplastics Workshop to learn about the adverse effects of plastic pollution on whale sharks in the Maldives. And then meet the thriving jellies in our new Kreisel jellyfish tank … the perfect home.
Chathu’s Story – Chathu spent ten weeks as a marine biology intern at Landaa Giraavaru, learning about the different marine biology programmes, including Turtle care, the Fish Lab and Coral propagation.
Monthly Marine Roundup (March 2019) – Fly over Kuda Huraa’s lagoon for a great aerial view of our REEFSCAPERS coral frames! Meet the juvenile flying fish down in our Fish Lab, and discover why March has been a record month for strandings of sea turtles around the Maldives.
Louise’s Life at Landaa – I am currently working on my Master’s thesis with the Reefscapers team. For my project, I will be assessing the growth rate of the coral frames that are located around the island as part of the reef restoration programme.
Meet Marine Intern Julia – There are large numbers of adult Olive Ridley turtles drifting to the Maldives, trapped in discarded (ghost) fishing gear, often wounded and dehydrated. We are busy turtle feeding and monitoring, and discussing our conservation efforts with resort guests.
Sea Turtles And Plastic – The effects of the 8 million tonnes of synthetic waste entering the oceans each year is a huge and growing concern. Plastic debris is being discovered across the entire marine ecosystem, from plankton and corals, to crabs, seabirds, turtles and whales.
AI 4 Corals: Developing our Autonomous Reef-Monitoring Catamaran
Development of automated and AI-based analytical tools for coral propagation projects
We are proud to announce the launch of our AI 4 CORALS fund-raising programme to help achieve our ambitious goals.
We are putting robots and artificial intelligence to work, to help save the coral reefs … and we need YOUR HELP!
Collecting consistent data on our large-scale coral propagation projects is a challenge that Reefscapers has been tackling for over ten years. Currently, this is achieved through the diligence of our coral biologists, who manually photo-monitor each coral frame. We have amassed over 200,000 photographs in 15 years of operation. Unfortunately, this quantity of data is difficult to analyse manually, and a wealth of information lies unexploited.
Computer Vision (CV), used to extract information from large photographic datasets, combined with Artificial Intelligence (AI), is emerging as a possible replacement for the traditional bio-statistical framework, which proves mostly inconclusive in uncontrolled environments. It is therefore paramount to develop such tools for the future of coral propagation, and to enable the scaling up of operations that, if climate forecasts are to be trusted, may prove necessary in the future.
Reefscapers is embarking on an exciting project along these avenues, using a two-pronged approach. An autonomous catamaran is being developed to make data collection more robust, as slight inconsistencies can require painstaking corrections. The solar-powered catamaran will use both a GPS autopilot, to move around the island, and CV techniques for close-range positioning. Once in position, the catamaran captures photographs and video with a camera that can be lowered to the appropriate depth. Retrieved data will be uploaded when in range, and could potentially provide a live feed. In addition, the catamaran will constantly log parameters such as temperature, current and wave height.
Feature-matching for stereoscopic vision
Prototype design and construction
Good data collection is essential, but it is ineffective without the necessary data analysis. AI algorithms need to be applied to all aspects of coral reef propagation, from species identification to growth-rate calculations and correlation with environmental parameters. Many open questions in coral research can only be answered through large-scale data analysis. For example: how important are the various environmental factors to coral growth? How can we improve coral propagation methods? What makes individual colonies more resistant to bleaching events?
Thanks to our monitoring program we have collected a huge amount of data: We have 47,000 monitoring datasets (4 pictures each), from around 6,200 coral frames, which contain more than 300,000 individual coral fragments. This vast dataset warrants large-scale analysis to determine survival and growth rates, ultimately improving coral restoration techniques worldwide.
State-of-the-art artificial intelligence methods, including convolutional neural networks (CNNs), object classification and image segmentation, will be applied to extract valuable information from the collected raw data: Where is the coral fragment located in the picture? What is its size? What is next to it?
Following on from the first prototype, which will validate the overall framework for the process, fine-tuning and additional development will be carried out on a second prototype, for which funding is required. The retractable camera will be replaced by an autonomous rover, tethered to the catamaran, which will be able to navigate to the different colonies/fragments and carry out basic maintenance tasks (e.g. cleaning the bases of corals with water jets). More sensors will be installed to retrieve fundamental environmental parameters.
AI development will focus on a database of coral pictures used for automatic species identification. The program will provide species-specific growth-rate curves, updated in near real-time, along with summary information on mortality occurrences, to guide the coral biologists in their observations.
A. millepora polyps and Coral Bleaching 2016
The proposed development will take place over a one-year period, beginning April 2019, with an overall forecast cost of US$100,000. In addition to expenditure on the technological and scientific components, the project costs include staff salaries, flights and lodging for two programmers. Programmers with the necessary coral reef and coral propagation training will be sourced to join the existing team in the Maldives.
Deep learning framework for coral growth data
We have tens of thousands of coral frame photographs in our database, with the potential to analyse the visible coral growth. To do this, we will need to automatically identify coral fragments (and hopefully their taxonomic family) and to calculate size/growth between consecutive monitoring dates. Conventional image processing software is unable to do this, so we will need to develop and use deep learning algorithms (artificial neural networks).
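To illustrate the kind of automated analysis this would enable, here is a minimal Python sketch of a growth-rate calculation between consecutive monitoring dates, assuming fragment sizes have already been extracted from the photographs. The data structure, fragment ID and numbers are hypothetical, not our production code.

```python
from datetime import date

def growth_rates(observations):
    """Compute per-fragment growth rates (cm/day) between consecutive
    monitoring dates. `observations` maps a fragment ID to a list of
    (date, size_cm) tuples -- a stand-in for the sizes a deep-learning
    model would extract from the frame photographs."""
    rates = {}
    for frag_id, series in observations.items():
        series = sorted(series)  # chronological order
        rates[frag_id] = [
            (d2, (s2 - s1) / (d2 - d1).days)
            for (d1, s1), (d2, s2) in zip(series, series[1:])
        ]
    return rates

# Example: one hypothetical fragment measured at three monitoring dates.
obs = {"LG1234-A1": [(date(2019, 1, 1), 4.0),
                     (date(2019, 3, 2), 7.0),
                     (date(2019, 5, 1), 10.0)]}
print(growth_rates(obs))  # steady growth of 0.05 cm/day
```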
Development of our autonomous reef-monitoring catamaran to take our coral frame photographs
We have started to design and build a robotic camera system on a floating catamaran which, we hope, will be able to photograph all our coral frames automatically in 2019! This will enable us to scale up our operations and to gather more data on coral growth.
For it to be autonomous (work independently and automatically), the system needs to know its position and be able to visually recognise frames. We are going to implement ‘Computer Vision’ algorithms that will enable the autopilot system to analyse the immediate environment, in real-time.
Construction of the catamaran is well under way, and we have now received the internal electronic components from overseas. We have also been working on our Artificial Intelligence system, developing software for identifying the coral fragments attached to our frames.
We use a pre-trained Faster R-CNN model, with extra training on 170 manually-annotated frame photographs, to identify different objects (Pocillopora fragments, Acropora fragments, dead coral fragments). With the current model, it takes about ten seconds to process one coral frame photograph, with a promising success rate of 40-45% (mean Average Precision).
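Mean Average Precision scores a detector by how well its predicted boxes overlap the manually-annotated ones, then averages precision over increasing recall. Here is a simplified sketch of both ingredients; real mAP tooling adds interpolation and per-class averaging, so this is illustrative only, not our evaluation code.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two boxes given as (x1, y1, x2, y2).
    A detection typically counts as correct when its IoU with an
    annotated box exceeds a threshold such as 0.5."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def average_precision(detections, num_ground_truth):
    """Average precision from a confidence-ranked list of
    (confidence, matched_annotation) detections: the area under
    the raw precision-recall curve."""
    detections = sorted(detections, key=lambda d: -d[0])
    tp = fp = 0
    ap = prev_recall = 0.0
    for _, is_match in detections:
        tp, fp = tp + is_match, fp + (not is_match)
        recall, precision = tp / num_ground_truth, tp / (tp + fp)
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# A predicted fragment box overlapping a ground-truth box:
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # ≈ 0.33
# Three ranked detections, two matching annotations (three annotated):
print(average_precision([(0.9, True), (0.8, False), (0.7, True)], 3))  # ≈ 0.56
```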
Once the performance has increased and the results are more reliable, we will be able to calculate survivability for all frames and link it to recorded parameters.
We have received delivery of our catamaran! We subsequently installed the solar panels and completed the electrical connections, and we are currently testing the battery before installing the remaining systems.
We are also currently training a more complex model for the identification of coral fragments in our frame monitoring pictures. After 14 days of training, the performance is slowly improving. In the meantime, we have configured a database and various data processing tools to handle the coral fragment pictures that have been detected by this algorithm. They will then be processed through a segmentation model, which will identify the fragment more precisely.
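Once the segmentation model has outlined a fragment, its size can be estimated from the resulting mask. A minimal sketch, assuming a binary mask and a known picture scale; the function name and scale value are illustrative, and in practice the scale would be derived from the known geometry of the frame bars.

```python
def fragment_area_cm2(mask, cm_per_pixel):
    """Estimate a coral fragment's area from a binary segmentation mask
    (rows of 0/1 values), given the picture's scale in cm per pixel."""
    pixels = sum(sum(row) for row in mask)  # count of fragment pixels
    return pixels * cm_per_pixel ** 2

# A tiny 4x3 mask with 8 fragment pixels, at 0.5 cm per pixel:
mask = [[0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 1, 1, 0]]
print(fragment_area_cm2(mask, 0.5))  # 8 pixels × 0.25 cm² = 2.0
```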
The new software model we have trained for detection of the coral fragments is now much more successful than the old one (up from 45% to 70%). This means we now have an automated method of detecting and identifying coral species of any age, without human intervention.
The new model is a Faster R-CNN (Region-based Convolutional Neural Network) with a ResNet-101 backbone, pre-trained on the iNaturalist species dataset.
Work on the physical catamaran is continuing, as we continue to assemble and solder the electronic components.
Our new crowd-funding campaign AI-4-Corals is now launched on Indiegogo!
All the essential parts have been set up on the catamaran (propellers, battery, charge controller, Pixhawk autopilot), and using a computer we are now able to remotely control the catamaran’s movements. We plan to upgrade the charge controller to provide more power, then install the cameras and a Raspberry Pi (an onboard computer to control trajectory).
Following installation of a new charge controller, the catamaran embarked on its longest runs to date, testing manoeuvrability and reliability. The next step is the introduction of the Raspberry Pi module.
– The catamaran’s on board micro-computer has been installed, allowing us to remotely control trajectory.
– An additional charge controller has been added to eliminate some early issues with power consumption.
– Adjustments were made to the Pixhawk autopilot in conjunction with the installation of the micro-computer.
– Initial trials of autonomous trajectory control have started.
To make our electronic connections sturdier and more resilient to rusting, we have replaced some wiring and glued many of the components to an acrylic plate. We added small fans to prevent overheating, and the stereo camera is assembled and ready to be attached to the catamaran.
Work continues on improving our coral detection and identification rates; corrections are fed back into the model as training pictures, giving faster and more accurate results. We are also working on a new Python process to detect the frame structure, which will position the fragments relative to the frame and to each other.
We have submitted an abstract on our AI project for the International Coral Reef Symposium in Bremen (ICRS-2020), with a view to presenting our work and raising awareness.
Meet Gaétan, Reefscapers’ engineer working on our latest and exciting project: AI for Corals. Gaétan graduated from @centralesupelec before starting to work at Reefscapers a year ago. He is designing an autonomous catamaran that will use specific AI software to monitor the coral frames around the island. After successfully running the first remotely controlled tests for navigation, he is now focusing on the “eyes” of the catamaran, the two cameras that will enable all the data acquisition. Gaétan is dividing his time between this project and helping our team of volunteers when they need an extra pair of hands for replanting, or a boat pilot for coral collection. Thanks for the precious help! 😉 For all the info about the AI project, check out our video: https://youtu.be/dbrBGW3xVR4
We have ordered some replacement propeller parts, prepared a designated parking area on land, and will be working on a custom-designed trailer for transport to the ocean.
Our deep-learning model has now been trained on almost 400 coral frame photographs, comprising more than 6,000 individual pieces of coral, and is now relatively accurate at detecting and identifying corals. Next, we are using successive image processing algorithms to detect the metal bars, to then extract the shape of the frame for precise positioning of the corals on the structure.
Our catamaran is currently on stand-by as we are having quality problems with the propellers. We are in the process of replacing them and building a custom trolley to move it from the garage to the beach.
We are making good progress with the artificial intelligence. The aim is to identify which coral species are growing on the frames, just from auto-analysis of the monitoring pictures. This way we can identify frames which have lost their tags, and most importantly analyse coral survival and growth. There are several parts that need to be achieved for this to work:
1. Coral detection: This is done by a deep-learning framework, a convolutional neural network (CNN) that learns what coral fragments look like from manually-annotated pictures. We have had good results with this, and we continue to provide the network with new training pictures to widen the range of situations it handles successfully. To date, we have analysed a total of 605 pictures with 9,442 annotations. The mean Average Precision is an encouraging 0.74-0.77, but this is highly dependent on the similarity between test pictures and training pictures.
2. Frame detection: To know the position of coral fragments on the frame, we need an approximation of the frame’s 3D position relative to the camera. We can detect the metal bars by merging the straight-line segments in the picture, giving the positions of a few bars (we don’t know which ones) plus some false positives (rubble or other frames in the background). We then use a probabilistic method that tries thousands of different camera angles and checks how well each resulting projection matches the bars we have detected. In the end, the algorithm proposes the most likely camera position it has found. Once we know the camera position, we can calculate the location of the whole frame in the picture. For now, we have an approximately 60% success rate.
3. Combined information: Once we know the position of the frame and the corals on the picture, we can link them together. For each coral fragment that we detect, we can know which bar it is on, and its position on the bar. We can then merge the data from the 4 monitoring pictures to have a global description of the frame, specifically which coral fragments are present, with size, living state, taxonomic family and bar position.
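A minimal sketch of the final merging step, assuming each monitoring picture has already been reduced to a list of fragment records keyed by bar and position. The record fields and confidence values are illustrative stand-ins, not our actual database schema.

```python
def merge_views(views):
    """Merge fragment detections from the four monitoring pictures into
    one global description of the frame. Each view is a list of records
    keyed by (bar, position); when the same fragment appears in several
    pictures, keep the highest-confidence detection."""
    frame = {}
    for view in views:
        for rec in view:
            key = (rec["bar"], rec["position"])
            if key not in frame or rec["confidence"] > frame[key]["confidence"]:
                frame[key] = rec
    return frame

# Two views seeing the same fragment on bar 3, plus one seen only in view B:
view_a = [{"bar": 3, "position": 0.25, "family": "Acropora",
           "size_cm": 6.1, "alive": True, "confidence": 0.8}]
view_b = [{"bar": 3, "position": 0.25, "family": "Acropora",
           "size_cm": 6.4, "alive": True, "confidence": 0.9},
          {"bar": 5, "position": 0.70, "family": "Pocillopora",
           "size_cm": 4.0, "alive": True, "confidence": 0.7}]
merged = merge_views([view_a, view_b])
print(len(merged))  # two distinct fragments in the global description
```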
In order to diversify the training photographs, we have been providing the AI software model with new monitoring sets from different environmental conditions. The current version of the artificial neural network is now trained on 605 monitoring pictures, a total of 9,442 individual fragments. To obtain these results, we had to train the model for more than 3 million seconds… which equates to approximately 35 days!
We have developed a method to detect the frame structure from the monitoring pictures:
- Convert the image to monochrome and increase the contrast to emphasise the shapes.
- Apply a ‘ridge detector’ to analyse the shapes and convert them into a network of simple curves.
- Apply a ‘line segment detector’ to check the curves for straight lines.
- Group these line segments into clusters to represent the frame’s bars; merge into longer/fewer segments.
- Apply a ‘probabilistic method’ that tries random camera angles and checks if the position of the frame for that angle matches the bars that we have detected. This gives us a best fit.
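The grouping-and-merging step above can be sketched in Python as follows. The tolerances, the normal-form offset test and the bounding-extremes merge are illustrative simplifications (reasonable for near-axis-aligned bars), not our production code.

```python
import math

def cluster_segments(segments, angle_tol=0.1, offset_tol=5.0):
    """Group line segments (x1, y1, x2, y2) that share a similar angle
    and perpendicular offset -- i.e. likely lie on the same bar -- and
    merge each cluster into a single longer segment."""
    def params(seg):
        x1, y1, x2, y2 = seg
        angle = math.atan2(y2 - y1, x2 - x1)
        # Signed distance of the segment's line from the origin (normal form).
        offset = x1 * math.sin(angle) - y1 * math.cos(angle)
        return angle, offset

    clusters = []  # list of (angle, offset, [segments])
    for seg in segments:
        a, o = params(seg)
        for cluster in clusters:
            if abs(cluster[0] - a) < angle_tol and abs(cluster[1] - o) < offset_tol:
                cluster[2].append(seg)
                break
        else:
            clusters.append((a, o, [seg]))

    # Crude merge: take the extreme endpoints of each cluster.
    merged = []
    for _, _, segs in clusters:
        xs = [p for s in segs for p in (s[0], s[2])]
        ys = [p for s in segs for p in (s[1], s[3])]
        merged.append((min(xs), min(ys), max(xs), max(ys)))
    return merged

# Two collinear pieces of one horizontal bar, plus one vertical bar:
print(cluster_segments([(0, 10, 40, 10), (60, 10, 100, 11), (50, 0, 50, 80)]))
```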
This method works well on 60% of our photographs, but is less successful at processing visually complex environments, for instance when the ground is covered with rubble. So we are developing a second deep-learning method using artificial intelligence (similar to fragment detection), and have started training an artificial neural network to recognise frame bars. This will replace steps 2 to 4 above, and the final step is now used to determine the camera position.
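The probabilistic final step, which both pipelines rely on to determine the camera position, can be illustrated with a deliberately simplified toy model in which the “camera angle” reduces to a single horizontal shift. The function, numbers and search range are hypothetical; the real system projects the full 3D frame model into the image.

```python
import random

def best_camera_angle(detected_xs, bar_offsets, trials=5000, seed=42):
    """Toy pose search: try random candidate 'angles' (here, horizontal
    shifts) and keep the one whose predicted bar positions best match
    the detected bars."""
    rng = random.Random(seed)

    def score(angle):
        # Sum, over predicted bars, of the distance to the nearest detected bar.
        predicted = [angle + off for off in bar_offsets]
        return sum(min(abs(p - d) for d in detected_xs) for p in predicted)

    return min((rng.uniform(-50, 50) for _ in range(trials)), key=score)

# Bars detected at image positions 12, 22 and 32; the frame model has
# bars spaced 10 apart, so the best shift should land close to 12.
print(best_camera_angle([12, 22, 32], [0, 10, 20]))
```

With enough random trials the best candidate lands arbitrarily close to the true position; the real search additionally copes with false positives and with bars that were not detected at all.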