All the Latest REEFSCAPERS News from our International Team of Marine Biologists

Reefscapers Vacancy for a Communications Intern


Please check our Marine Savers employment page for more vacancies (and internships), and be sure to sign up for our job alerts.



Vacancy: Reefscapers Communications & Marketing Internship 2020


We are seeking a motivated and creative candidate to start in a brand-new role, based at a 5-star island resort in the Maldives. This position is an exciting opportunity for a great communicator with experience of digital marketing and a love for the ocean!

  • Location: North Malé Atoll, Maldives
  • Duration: 6-month contract
  • Apply by: 1st November 2020
  • Start in: December 2020

Job Description

  • Develop communication strategies to promote Reefscapers conservation work and coral frame sponsorship.
  • Develop Reefscapers social media presence and increase engagement.
  • Develop new marketing materials and regularly publish content.
  • Take underwater photos and videos, then edit and annotate for social media and guest presentations.
  • Assist the onsite marine biologist with our coral propagation program and guest activities.

Ideal Candidate

  • BSc/MSc in Communications/Marketing, with experience in social media and campaign strategies.
  • Excellent computer skills, with experience of graphics, publishing and photo/video editing software.
  • A passion for the marine environment, with basic knowledge of coral reefs and marine life.
  • Confident swimmer & qualified diver (minimum Level One).
  • Physically fit non-smoker, with a good standard of spoken and written English; multilingual preferred.


To apply, please download our full job description and carefully follow the application instructions:

Job Description

Reefscapers Communications & Marketing Internship



Reefscapers Vacancy for a Marine Biologist



Vacancy: Reefscapers Marine Biologist [now closed]


We are seeking a highly motivated and out-going Marine Biologist to start in a brand-new role, based at a 5-star island resort in the Maldives (details provided during interview). This position is an exciting opportunity for a great communicator with a passion for marine conservation and a love for corals!

Job Description

Marine Biology

  • Manage and maintain all aspects of the resort’s Coral Propagation Program.
  • Conduct marine life presentations, guest-oriented snorkelling activities and marine projects.
  • Organise resort staff training in marine biology and conservation awareness.

Public Relations

  • Communicate conservation activities to resort guests & management, and the Reefscapers team.
  • Write illustrated reports (in English) of activities, updates and targets.
  • Contribute to the Reefscapers social media accounts.

Ideal Candidate

  • BSc/MSc in Marine Biology/Environmental Science, or Marketing/Communications.
  • Outgoing personality, passionate about marine sciences and environmental conservation.
  • Strong swimmer & qualified diver (essential) able to work long hours in the lagoon in all weathers.
  • Good spoken & written English; multilingual preferred.

Marine Savers at Four Seasons

Marine Life from Lab to Ocean! – We refurbish our large marine aquarium, and take a microscopic look at the fish eggs from our breeding pairs of Maldivian Anemonefish. Catch up with REEFSCAPERS coral biologists, and learn about the ‘sea turtle stranding season’ in the Maldives.

Reefscapers Dynamic Duo! – Meet Martyna and Sorin, 2 dedicated interns working on our coral restoration program at Kuda Huraa. Join the dynamic duo as they start maintaining 2,000+ coral frames, collecting & transplanting coral fragments for 7 solid hours in the water each day! 💙

Updates From Our Marine Biology Teams (Sept 2019) – See the Maldivian anemonefish eggs, and watch the Linckia sea star regenerating itself from a single arm. Meet our apprentices and volunteers, helping out with our Reefscapers coral propagation projects.

Turtle Tracks, Jolly Jellies & Fishy Tales – Local school children learn about marine conservation, and come face to face with the Aurelia jellyfish in our Kreisel tank. Join us at the Baa Atoll Turtle Festival, and then follow Peggy’s oceanic journey on our interactive satellite map.

Meet Olivia, an English post-graduate student of Tropical Marine Biology. She joined us at Landaa to research our Reefscapers coral propagation programme as part of her Master’s degree.

Marine Savers May Musings – Read about the Moon jellyfish (Aurelia aurita) in our dedicated Kreisel tank. Our ‘Flying Turtles’ project has taken an interesting turn, and learn how you can contribute towards Maldivian turtle conservation by sending in your turtle photographs for ID.

Learn about Lotte’s new life at Landaa – helping with our Clownfish and jellies in the Fish Lab. There’s time to propagate some corals with our Reefscaping team, before caring for the injured turtle patients at our rescue centre and kick-starting the ‘Flying Turtles’ programme.

Updates From Landaa & Kuda Huraa (April 2019) – Take a front row seat at the Microplastics Workshop to learn about the adverse effects of plastic pollution on whale sharks in the Maldives. And then meet the thriving jellies in our new Kreisel jellyfish tank … the perfect home.

Chathu’s Story – Chathu spent ten weeks as a marine biology intern at Landaa Giraavaru, learning about the different marine biology programmes, including Turtle care, the Fish Lab and Coral propagation.

Monthly Marine Roundup (March 2019) – Fly over Kuda Huraa’s lagoon for a great aerial view of our REEFSCAPERS coral frames! Meet the juvenile flying fish down in our Fish Lab, and discover why March has been a record month for strandings of sea turtles around the Maldives.

Louise’s Life at Landaa – I am currently working on my Master’s thesis with the Reefscapers team. For my project, I will be assessing the growth rate of the coral frames that are located around the island as part of the reef restoration programme.

Meet Marine Intern Julia – There are large numbers of adult Olive Ridley turtles drifting to the Maldives, trapped in discarded (ghost) fishing gear, often wounded and dehydrated. We are busy turtle feeding and monitoring, and discussing our conservation efforts with resort guests.

Sea Turtles And Plastic – The effects of the 8 million tonnes of synthetic waste entering the oceans each year is a huge and growing concern. Plastic debris is being discovered across the entire marine ecosystem, from plankton and corals, to crabs, seabirds, turtles and whales.

AI 4 Corals: Developing our Autonomous Reef-Monitoring Catamaran

Development of automated and AI-based analytical tools for coral propagation projects

We are proud to announce the launch of our AI 4 Corals fund-raising programme to help achieve our ambitious goals.
We are putting robots and artificial intelligence to work, to help save the coral reefs … and we need YOUR HELP!

Collecting consistent data on our large-scale coral propagation projects is a challenge that Reefscapers has been tackling for over ten years. Currently, this is achieved through the diligence of the coral biologists, who manually photo-monitor each coral frame. We have amassed over 200,000 photographs in 15 years of operation. Unfortunately, this amount of data is difficult to analyse manually, and a wealth of information lies unexploited.

Computer Vision (CV), used together with Artificial Intelligence (AI) to extract information from large photographic datasets, is emerging as a possible replacement for the traditional bio-statistical framework, which proves mostly inconclusive in uncontrolled environments. It is therefore paramount to develop such tools for the future of coral propagation, and to enable the scaling-up of such operations, which may prove necessary if climate forecasts are to be trusted.

Reefscapers is embarking on an exciting project in these areas, using a two-pronged approach. An autonomous catamaran is being developed to make data collection more robust, as slight inconsistencies can result in painstaking corrections. The solar-powered catamaran will use both a GPS autopilot, to move around the island, and CV techniques for close-range positioning. Once in position, the catamaran captures photographs and video with a camera that can be lowered to the appropriate depth. Retrieved data will be uploaded when in range, and could potentially provide a live feed. In addition, the catamaran will constantly log parameters such as temperature, current and wave height.

Feature-matching for stereoscopic vision

Prototype design and construction

Good data collection is essential, but it is ineffective without the necessary data analysis. AI algorithms need to be applied to all aspects of coral reef propagation, from species identification to growth-rate calculations and correlation with environmental parameters. In coral research, many questions remain, and some can only be answered using large-scale data analysis. For example: how important are various environmental factors to coral growth? How can we improve coral propagation methods? What makes individual colonies more resistant to bleaching events?

Thanks to our monitoring program we have collected a huge amount of data: we have 47,000 monitoring datasets (4 pictures each), from around 6,200 coral frames, which contain more than 300,000 individual coral fragments. This vast dataset warrants large-scale analysis to determine survival and growth rates, ultimately improving coral restoration techniques worldwide.

State of the art artificial intelligence methods including convolutional neural networks (CNN), object classification and image segmentation will be applied to extract valuable information from the collected raw data. Where is the coral fragment located in the picture? What is its size? What is next to it?

Planned Developments

Following on from the first prototype, which will validate the overall framework for the process, fine-tuning and additional development will be carried out on a second prototype, for which funding is required. The retractable camera will be replaced by an autonomous rover, tethered to the catamaran, which will be able to navigate to the different colonies/fragments and carry out basic maintenance tasks (e.g. cleaning the base of corals with water jets). More sensors will be installed to retrieve fundamental environmental parameters.

AI development will focus on a database of coral pictures used for automatic species identification. The program will provide a species-specific updated growth-rate curve, in almost real time, and summary information on mortality occurrences, to guide the coral biologists in their observations.

A. millepora polyps and Coral Bleaching 2016

The proposed project development will take place over a one-year period, beginning April 2019. An overall cost of US$100,000 is forecast. In addition to the expenditure for the technological and scientific components, these project costs include staff salaries, flights and lodging for two programmers. Programmers with the necessary coral reef and coral propagation training will be sourced and will join the existing team in the Maldives.


Deep learning framework for coral growth data

We have tens of thousands of coral frame photographs in our database, with the potential to analyse the visible coral growth. To do this, we will need to automatically identify coral fragments (and hopefully their taxonomic family) and to calculate size/growth between consecutive monitoring dates. Conventional image processing software is unable to do this, so we will need to develop and use deep learning algorithms (artificial neural networks).
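
The growth calculation itself is simple once fragments have been identified and measured between dates. As a minimal sketch (the data layout and function name here are our own illustration, not the actual Reefscapers code), growth rates between consecutive monitoring dates could be computed like this:

```python
from datetime import date

def growth_rates(observations):
    """Per-fragment growth (cm per day) between consecutive monitoring dates.

    observations: {fragment_id: [(date, size_cm), ...]} in any order.
    Returns {fragment_id: [rate, ...]}, one rate per consecutive pair.
    """
    rates = {}
    for frag, series in observations.items():
        series = sorted(series)  # tuples sort by date first
        rates[frag] = [
            (s2 - s1) / (d2 - d1).days
            for (d1, s1), (d2, s2) in zip(series, series[1:])
        ]
    return rates

# hypothetical example: one fragment measured a year apart
obs = {"P1": [(date(2017, 6, 1), 4.0), (date(2018, 6, 1), 11.3)]}
rates = growth_rates(obs)
```

In practice the sizes would come from the detection model rather than being typed in by hand.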

Development of our autonomous reef-monitoring catamaran to take our coral frame photographs

We have started to design and build a robotic camera system on a floating catamaran, which we hope will be able to photograph all our coral frames automatically in 2019! This will enable us to scale up our operations, and to gather more data on coral growth.

For it to be autonomous (work independently and automatically), the system needs to know its position and be able to visually recognise frames. We are going to implement ‘Computer Vision’ algorithms that will enable the autopilot system to analyse the immediate environment, in real-time.


Construction of the catamaran is well under way, and we have now received the internal electronic components from overseas. We have also been working on our Artificial Intelligence system, developing software for identifying the coral fragments attached to our frames.

We use a pre-trained ‘Faster R-CNN’ model, with extra training on 170 manually annotated frame photographs to identify different objects (Pocillopora fragments, Acropora fragments, dead coral fragments). With the current model, it takes about 10 seconds to process one coral frame photograph, with a promising success rate of 40-45% (mean Average Precision).
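
For context, the mean Average Precision quoted here is built on box overlap, known as Intersection over Union (IoU). A simplified sketch of the idea (our own illustration, not the project's evaluation code): predicted boxes are matched to manual annotations at an IoU threshold, and precision is the fraction of predictions that find a match.

```python
def iou(a, b):
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def precision_at_iou(predictions, annotations, threshold=0.5):
    """Fraction of predicted boxes that match an unused annotation
    with IoU at or above the threshold (a building block of mAP)."""
    matched, hits = set(), 0
    for p in predictions:
        for i, t in enumerate(annotations):
            if i not in matched and iou(p, t) >= threshold:
                matched.add(i)
                hits += 1
                break
    return hits / len(predictions) if predictions else 0.0
```

Full mAP additionally averages precision over confidence thresholds and object classes, but the matching step is the heart of it.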

Once the performance has increased and the results are more reliable, we will be able to calculate survivability for all frames and link it to recorded parameters.


We have received delivery of our catamaran! We subsequently installed the solar panels and completed the electrical connections, and we are currently testing the battery before installing the remaining systems.

We are also currently training a more complex model for the identification of coral fragments in our frame monitoring pictures. After 14 days of training, the performance is slowly improving. In the meantime, we have configured a database and various data processing tools to handle the coral fragment pictures that have been detected by this algorithm. They will then be processed through a segmentation model, which will identify the fragment more precisely.

MARCH 2019

The new software model we have trained for detection of the coral fragments is now much more successful than the old one (up from 45% to 70%). This means we now have an automated method of detecting and identifying coral species of any age, without human intervention.

The new model is a Faster R-CNN (Region-based Convolutional Neural Network) with a ResNet-101 backbone, pre-trained on the iNaturalist species dataset.

Work on the physical catamaran is continuing, as we assemble and solder the electronic components.

APRIL 2019

Our new crowd-funding campaign AI 4 Corals is now launched on Indiegogo!

All the essential parts have been set up on the catamaran (propellers, battery, charge controller, Pixhawk autopilot) and using a computer we are now able to remotely control the catamaran’s movements. We plan to upgrade the charge controller to provide more power, then install the cameras and a Raspberry Pi (an onboard computer to control trajectory).

MAY 2019

Following installation of a new charge controller, the catamaran embarked on its longest runs to date, testing manoeuvrability and reliability. The next step is the introduction of the Raspberry Pi module.
– The catamaran’s on board micro-computer has been installed, allowing us to remotely control trajectory.
– An additional charge controller has been added to eliminate some early issues with power consumption.

– Adjustments were made to the Pixhawk autopilot in conjunction with the installation of the micro-computer.
– Initial trials of autonomous trajectory control have started.


To make our electronic connections sturdier and more resilient to rusting, we have replaced some wiring and glued many of the components to an acrylic plate. We added small fans to prevent overheating, and the stereo camera is assembled and ready to be attached to the catamaran.

We continue to improve coral detection and identification rates; corrections are fed back into the model as training pictures, giving faster and more accurate results. We are also working on a new Python process to detect the frame structure, which will position the fragments relative to the frame and to each other.

We have submitted an abstract on our AI project for the International Coral Reef Symposium in Bremen (ICRS-2020), with a view to presenting our work and raising awareness.


We have ordered some replacement propeller parts, prepared a designated parking area on land, and will be working on a custom-designed trailer for transport to the ocean.

Our deep-learning model has now been trained on almost 400 coral frame photographs, comprising more than 6,000 individual pieces of coral, and is now relatively accurate at detecting and identifying corals. Next, we are using successive image processing algorithms to detect the metal bars, to then extract the shape of the frame for precise positioning of the corals on the structure.


Our catamaran is currently on stand-by as we are having quality problems with the propellers. We are in the process of replacing them and building a custom trolley to move it from the garage to the beach.

We are making good progress with the artificial intelligence. The aim is to identify which coral species are growing on the frames, just from auto-analysis of the monitoring pictures. This way we can identify frames which have lost their tags, and most importantly analyse coral survival and growth. There are several parts that need to be achieved for this to work:

1. Coral detection: This is done by a deep learning framework, a convolutional neural network (CNN) that learns what coral fragments look like, from pictures that are manually annotated. We have had good results with this, and we continue to provide the network with new training pictures to widen the scope of successful situations. To date, we have analysed a total of 605 pictures with 9,442 annotations. The mean Average Precision is an encouraging 0.74-0.77, but this is highly dependent on the similarity between test pictures and training pictures.

2. Frame detection: To know the position of coral fragments on the frame, we need an approximation of the 3D position of the frame, relative to the camera. We can detect the metal bars by merging the straight-line segments in the picture, to give the position of a few bars (we don’t know which ones) plus some false positives (rubble or other frames in the background). We then use a probabilistic method to try thousands of different camera angles and check how well the resulting pictures match the bars that we have detected. In the end, the algorithm proposes the most likely camera position it found. Once we know the camera position, we can calculate the proposed location of the whole frame on the picture. For now, we have an approximate 60% success rate.
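
The random-sampling pose search described above can be illustrated with a toy 2D version (the real problem is a 3D camera pose with a projection model; this sketch, with hypothetical names, only rotates a known point layout and scores how well it matches the detections):

```python
import math
import random

def best_rotation(model_points, detected_points, trials=2000, tol=0.1, seed=0):
    """Sample random rotation angles and keep the one whose rotated model
    points best match the detected points -- a toy 2D analogue of the
    probabilistic camera-pose search."""
    rng = random.Random(seed)

    def score(theta):
        c, s = math.cos(theta), math.sin(theta)
        hits = 0
        for x, y in model_points:
            rx, ry = c * x - s * y, s * x + c * y
            if any(abs(rx - dx) < tol and abs(ry - dy) < tol
                   for dx, dy in detected_points):
                hits += 1
        return hits

    # the candidate with the most matched points wins
    return max((rng.uniform(0.0, math.pi) for _ in range(trials)), key=score)
```

In the real system each candidate is a full camera position and angle, and the "score" compares the projected frame bars against the detected line segments, but the sample-and-score structure is the same.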

3. Combined information: Once we know the position of the frame and the corals on the picture, we can link them together. For each coral fragment that we detect, we can know which bar it is on, and its position on the bar. We can then merge the data from the 4 monitoring pictures to have a global description of the frame, specifically which coral fragments are present, with size, living state, taxonomic family and bar position.
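
The bar-assignment in step 3 amounts to projecting each detected fragment centre onto the nearest bar segment. A minimal sketch (the helper names are hypothetical, not the production code):

```python
import math

def project_onto_bar(bar, point):
    """Distance from a point to a bar segment, plus the relative
    position t (0..1) of the closest approach along the bar."""
    (x1, y1), (x2, y2) = bar
    dx, dy = x2 - x1, y2 - y1
    t = ((point[0] - x1) * dx + (point[1] - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))  # clamp to the segment's ends
    cx, cy = x1 + t * dx, y1 + t * dy
    return math.hypot(point[0] - cx, point[1] - cy), t

def assign_fragments(bars, fragments):
    """Map each fragment centre to (bar index, position along that bar)."""
    result = {}
    for frag_id, centre in fragments.items():
        i = min(range(len(bars)),
                key=lambda b: project_onto_bar(bars[b], centre)[0])
        result[frag_id] = (i, project_onto_bar(bars[i], centre)[1])
    return result
```

Running this per monitoring picture, then merging the four views, gives the global per-frame description of which fragment sits where.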


In order to diversify the training photographs, we have been providing the AI software model with new monitoring sets from different environmental conditions. The current version of the artificial neural network is now trained on 605 monitoring pictures, a total of 9,442 individual fragments. To obtain these results, we had to train the model for more than 3 million seconds… which equates to approximately 35 days!

We have developed a method to detect the frame structure from the monitoring pictures:

  1. Convert the image to monochrome and increase the contrast to emphasise the shapes.
  2. Apply a ‘ridge detector’ to analyse the shapes and convert them into a network of simple curves.
  3. Apply a ‘line segment detector’ to check the curves for straight lines.
  4. Group these line segments into clusters to represent the frame’s bars; merge into longer/fewer segments.
  5. Apply a ‘probabilistic method’ that tries random camera angles and checks if the position of the frame for that angle matches the bars that we have detected. This gives us a best fit.
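
Step 4 — grouping line segments into bars — can be sketched in plain Python by clustering segments with a similar direction and perpendicular offset, then keeping each cluster's extreme endpoints (an illustration of the idea, with invented function names, not the production code):

```python
import math

def _direction(seg):
    """Segment direction as an angle in [0, pi), ignoring orientation."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1) % math.pi

def merge_collinear(segments, angle_tol=0.05, offset_tol=1.0):
    """Cluster segments with similar direction and perpendicular offset,
    then replace each cluster by its two extreme endpoints."""
    clusters = []
    for seg in segments:
        a = _direction(seg)
        (x1, y1), (x2, y2) = seg
        mx, my = (x1 + x2) / 2, (y1 + y2) / 2
        # signed perpendicular distance of the midpoint from the origin
        off = -math.sin(a) * mx + math.cos(a) * my
        for c in clusters:
            if abs(c["angle"] - a) < angle_tol and abs(c["offset"] - off) < offset_tol:
                c["points"] += [seg[0], seg[1]]
                break
        else:
            clusters.append({"angle": a, "offset": off, "points": [seg[0], seg[1]]})
    merged = []
    for c in clusters:
        d = (math.cos(c["angle"]), math.sin(c["angle"]))
        # order endpoints along the cluster direction; keep the extremes
        pts = sorted(c["points"], key=lambda p: p[0] * d[0] + p[1] * d[1])
        merged.append((pts[0], pts[-1]))
    return merged
```

Two broken pieces of the same bar thus become one longer segment, while bars in different directions stay separate.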

This method works well on 60% of our photographs, but is less successful at processing visually complex environments, for instance when the ground is covered with rubble. So we are developing a second deep-learning method using artificial intelligence (similar to fragment detection), and have started training an artificial neural network to recognise frame bars. This will replace steps 2 to 4 above, and the final step is now used to determine the camera position.


We have installed stronger replacement propellers for the catamaran and started sea trials, and we are making good progress with our new AI software model, which can accurately detect the position of coral frames in 85% of monitoring pictures.


We have developed a new way to visualise our results, by displaying each frame bar in a different colour, with the attached fragments in the same colour. This enables us to assign each fragment to a position on a specific bar, to compare successive monitoring pictures of a specific frame (and also to match unknown frames by analysing their corals).

However, these models are currently not very reliable, with a cumulative error rate of 10-20%. We are continuing with additional training to enhance the code through trial and error feedback. The next steps will improve the overall accuracy, so we can start using these models in real world conditions. We will then be able to calculate survival rates and match frames that have lost their tag.

We received the new battery for our Catamaran and have resumed sea trials. We reached a new milestone during February… the cameras are now set up and fully functioning! We are, however, currently investigating a power-supply problem that is causing unexpected shutdowns, loss of remote connectivity and propeller unresponsiveness.


Frame Recognition – Through prolonged tuning of the parameters, we have greatly improved the results of the frame recognition algorithm. We can now successfully detect the frame structure’s position in 99.8% of monitoring photos (up from 85-90%). From the 516 test images, only a single photograph was not processed successfully.

Coral Detection – We have improved the performance of our deep-learning algorithm for detecting the coral fragments. By adding data augmentation (random rotations and flipping of the training pictures), we are now able to detect corals more accurately. Further training still needs to be done for some specific cases, such as newly transplanted fragments (which are small and difficult to detect).
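
One subtlety of this kind of augmentation is that the bounding-box annotations must be transformed together with the pixels. A minimal sketch of the two box transforms (hypothetical helper names, assuming (x1, y1, x2, y2) boxes in image coordinates):

```python
def hflip_box(image_w, box):
    """Mirror an (x1, y1, x2, y2) box horizontally: the annotation
    must move together with the flipped pixels."""
    x1, y1, x2, y2 = box
    return (image_w - x2, y1, image_w - x1, y2)

def rot90_box(image_w, box):
    """Rotate a box 90 degrees counter-clockwise; the rotated image
    has its width and height swapped."""
    x1, y1, x2, y2 = box
    return (y1, image_w - x2, y2, image_w - x1)
```

Deep-learning libraries apply the equivalent transforms to image and annotation in one step, but getting the box arithmetic wrong is a classic way to silently corrupt a training set.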

Frame analysis – It has taken a lot of work to develop and improve the algorithm that matches the detected coral fragments to the bars. There are many significant algorithmic challenges, including merging data from the four monitoring pictures. The algorithm is already much more accurate, but further improvements are essential to be able to track the growth of our corals at the colony level.

Catamaran – We have installed an additional battery for the propellers, and we are currently investigating an issue with the onboard cameras. We have noticed that nearby boats can sometimes generate radio interference, so we are researching whether a 4G modem would be more reliable than the radio antenna.


2019 February: First automatic detection of coral fragments
2019 April: First sea trials with the catamaran
2019 November: First automatic detection of coral frames
2020 January: First automatic 3D reconstitution of whole coral frames (structure + coral fragments)
2020 February: First underwater pictures taken by the catamaran
2020 March: 99.8% success on coral frame detection


After more than a year of work on artificial intelligence, we have reached an important milestone… our program successfully analysed several complete frames! The software automatically identified the taxonomic genus of the coral fragments and recorded their sizes, all without any human intervention.

This progress opens the door to ‘big data’ analyses by running the program on the whole database (20-30 days of continuous data processing). To do this, we need to make the deep learning models reliable in a wider variety of situations through additional training. We have already conducted one more batch for fragment detection and two more batches for frame segmentation (our two deep learning models).

Currently, we are using:

  • Frame segmentation – 193 manually annotated photographs (now very reliable).
  • Fragment detection – 12,286 manual annotations from 759 photographs (more training required, for specific situations).

We have also updated the code:

  • Parallel operations to use all CPU cores.
  • Added commenting to the Python files.
  • Added a command line interface for simple execution.
  • Full documentation for all the steps.
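
The parallelisation and command-line interface mentioned above might look roughly like this (a sketch with hypothetical function names, not the actual repository code): split the photo list into per-worker batches and expose the batch job via argparse.

```python
import argparse
import os
from multiprocessing import Pool

def chunk(items, n):
    """Split items into at most n roughly equal batches, one per worker."""
    k, m = divmod(len(items), n)
    out, i = [], 0
    for j in range(n):
        size = k + (1 if j < m else 0)
        out.append(items[i:i + size])
        i += size
    return [c for c in out if c]  # drop empty batches

def process_batch(paths):
    # placeholder for the real per-photograph analysis
    return [(p, "ok") for p in paths]

def main(argv=None):
    parser = argparse.ArgumentParser(
        description="Batch-analyse coral frame monitoring photographs")
    parser.add_argument("photos", nargs="+")
    parser.add_argument("--workers", type=int, default=os.cpu_count())
    args = parser.parse_args(argv)
    # one process per CPU core, each handling a batch of photographs
    with Pool(args.workers) as pool:
        for batch in pool.map(process_batch, chunk(args.photos, args.workers)):
            for path, status in batch:
                print(path, status)

if __name__ == "__main__":
    main()
```

A whole-database run (the 20-30 days of processing mentioned below) is then a single command over the photo directory rather than a hand-driven loop.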
AI analysis of coral colony growth (cm) on frame #LG3324, from 2017 to 2019

AI recognition of the coral frame

AI analysis of the coral frame

AI identification of coral colonies

October 2020: New Vacancy for a Communications Intern >> MORE DETAILS