Challenge KUL 06 : Turning Reef Footage into Ecological Insights

CHALLENGES 2025

8/23/2025 · 4 min read

In marine ecology, fish surveys are traditionally conducted using in situ methods such as Underwater Visual Census (UVC), where abundance is often estimated using the MaxN approach by counting the maximum number of individuals of each species in a single video frame. Although recent advances in AI have enabled automatic fish detection, most applications are still limited to research or development settings. Our goal is to combine marine biology expertise with AI technology to develop user-friendly software or an app that can automatically identify fish species and calculate MaxN values, providing rapid and reliable data for ecological research and conservation.

Challenge description:

The Urgency of Monitoring Coral Reefs
Coral reef ecosystems are among the most biodiverse and vulnerable habitats on Earth. As these ecosystems face accelerating threats from climate change, habitat degradation, and overfishing, effective monitoring of reef health has become more urgent than ever. One key aspect of this monitoring is the assessment of reef fish communities, which serve as important indicators of ecological change.

Limitations of Traditional Monitoring Methods 
However, traditional methods for studying fish populations, such as diver-based surveys or manual video analysis, are time-consuming, labor-intensive, and limited in their scalability. In an era that demands rapid, reliable, and large-scale ecological assessments, there is a critical need for innovative solutions that can enhance our capacity to monitor these vital ecosystems efficiently.

Project Overview: AI-Powered Fish Detection Tool
This project proposes the development of an artificial intelligence (AI)-powered tool designed to automatically identify and count fish in underwater videos using a standardized ecological method known as MaxN, which records the maximum number of individuals of each species visible in a single frame. The goal is to transform raw underwater footage into usable ecological data quickly and accurately, making the monitoring process not only faster but also more accessible to a broader range of users—including marine scientists, conservation practitioners, coastal managers, and citizen scientists.
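The MaxN metric described above is simple to state in code: for each species, take the largest number of individuals detected in any single frame. The sketch below assumes detections have already been grouped by frame; the function name and data layout are illustrative, not part of any existing tool.

```python
from collections import defaultdict

def max_n(detections):
    """Compute MaxN per species from per-frame detections.

    `detections` maps a frame index to the list of species labels
    detected in that frame; MaxN for a species is the largest count
    of that species observed in any single frame.
    """
    best = defaultdict(int)
    for frame, labels in detections.items():
        counts = defaultdict(int)
        for label in labels:
            counts[label] += 1
        for label, n in counts.items():
            best[label] = max(best[label], n)
    return dict(best)

# Example: detections from three frames of one survey video
frames = {
    0: ["A. percula", "A. percula", "Z. scopas"],
    1: ["A. percula"],
    2: ["Z. scopas", "Z. scopas", "Z. scopas"],
}
print(max_n(frames))  # {'A. percula': 2, 'Z. scopas': 3}
```

Because MaxN takes a per-frame maximum rather than a running total, it avoids double-counting the same individual across frames, which is why it is the standard abundance proxy for video surveys.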

Origin and Rationale
The idea originated from a marine biology graduate student with expertise in coral reef fish who uses statistical methods to understand the connections between fish and their habitats. Recognizing the bottleneck created by manual video analysis, the project seeks to merge ecological knowledge with cutting-edge AI technologies to streamline the entire workflow from data collection to species identification. While current AI applications in marine science are growing, few offer integrated, user-friendly platforms that can perform high-quality fish identification and data processing in an automated fashion.

Phase 1: Data Collection
The project is carried out in five main phases. The first involves collecting high-quality underwater video data using standard ecological survey techniques such as belt transects or stationary cameras. These videos must be accompanied by detailed metadata, including information on depth, habitat type, and geographic location. Marine ecologists play a key role in this phase, ensuring that data is ecologically representative and suitable for machine learning applications.
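The metadata requirements in Phase 1 (depth, habitat type, geographic location) can be enforced with a simple structured record attached to each clip. This is a hypothetical schema for illustration; field names and the survey-method vocabulary would be decided by the team's ecologists.

```python
from dataclasses import dataclass, asdict

@dataclass
class SurveyClip:
    """Minimal metadata record accompanying one survey video (hypothetical schema)."""
    video_path: str
    depth_m: float
    habitat: str       # e.g. "fore reef", "lagoon"
    latitude: float
    longitude: float
    method: str        # e.g. "belt transect", "stationary camera"

clip = SurveyClip("reef_site01.mp4", 8.5, "fore reef", -8.67, 115.45, "belt transect")
print(asdict(clip))
```

Storing metadata alongside each video in a fixed schema makes it straightforward to later stratify model evaluation by depth, habitat, or region.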

Phase 2: Data Annotation
Once the video data is collected, selected frames are extracted and annotated with bounding boxes around each fish, which are labeled by species. These annotations form the training dataset for the AI model and require careful attention from experts familiar with coral reef fish, particularly in species-rich regions like the Indo-Pacific. Annotation tools such as CVAT, Roboflow, and LabelImg are employed to streamline the process and ensure labeling consistency, which is critical for successful model training.
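Tools such as CVAT, Roboflow, and LabelImg can all export boxes in the plain-text YOLO label format, where each line holds a class id followed by a normalized box center and size. A small helper like the one below (an illustrative sketch, not part of any of those tools) converts such a line back to pixel coordinates for inspection or quality control.

```python
def yolo_to_pixels(line, img_w, img_h):
    """Convert one YOLO-format label line ("class cx cy w h", all normalized
    to [0, 1]) into (class_id, x_min, y_min, x_max, y_max) in pixels."""
    cls, cx, cy, w, h = line.split()
    cx, cy = float(cx) * img_w, float(cy) * img_h
    w, h = float(w) * img_w, float(h) * img_h
    return (int(cls), round(cx - w / 2), round(cy - h / 2),
            round(cx + w / 2), round(cy + h / 2))

# A 1920x1080 frame with one box centered mid-frame, half the frame wide and tall
print(yolo_to_pixels("3 0.5 0.5 0.5 0.5", 1920, 1080))  # (3, 480, 270, 1440, 810)
```

Round-tripping annotations like this during review is a cheap way to catch the labeling inconsistencies the text warns about before they contaminate the training set.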

Phase 3: AI Model Development
In the third phase, deep learning models such as YOLOv8 or EfficientDet are used to train the AI system. These models learn to detect and classify fish based on the annotated images and are optimized to perform under a variety of real-world conditions, including poor visibility, overlapping individuals, and varied lighting. The robustness of the model is tested across different reef environments and camera types to ensure wide applicability. A machine learning specialist leads this stage, fine-tuning the model and validating its performance using separate test datasets.
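Validating a detector against a held-out test set typically rests on intersection-over-union (IoU), the overlap score used to decide whether a predicted box matches a ground-truth box. A minimal stdlib implementation, shown for corner-format boxes:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

# Two 10x10 boxes offset by 5 pixels: overlap 25, union 175
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

A common convention is to count a detection as correct when IoU exceeds 0.5; sweeping that threshold is how metrics such as mean average precision are built up.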

Phase 4: Software Development
The fourth phase focuses on developing a software application that integrates the trained AI model. This tool must be intuitive, requiring no specialized technical background to use. Users can upload videos, initiate the analysis, and receive clear outputs—including species names, counts, and timestamps. The software is built using platforms such as Streamlit or Dash, and features such as drag-and-drop uploads, visual overlays of fish detections, and exportable data summaries are included to enhance usability. Importantly, the application is designed to produce scientifically useful metrics, including biodiversity indices such as Shannon diversity, Simpson’s index, and Pielou’s evenness, allowing researchers to conduct ecological analyses directly from the outputs.
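The biodiversity indices named above all derive from the same species-count table the MaxN pipeline produces. The sketch below uses the standard formulas: Shannon H' = -Σ p·ln p, the Gini-Simpson form 1 - Σ p², and Pielou's J = H'/ln S; the function itself is illustrative, not the app's actual API.

```python
import math

def diversity_indices(counts):
    """Shannon H', Simpson's index (1 - D), and Pielou's evenness J
    from a mapping of species name to abundance."""
    total = sum(counts.values())
    props = [n / total for n in counts.values() if n > 0]
    shannon = -sum(p * math.log(p) for p in props)
    simpson = 1.0 - sum(p * p for p in props)  # Gini-Simpson form
    pielou = shannon / math.log(len(props)) if len(props) > 1 else 0.0
    return shannon, simpson, pielou

# Two equally abundant species: maximal evenness
h, d, j = diversity_indices({"A. percula": 10, "Z. scopas": 10})
print(round(h, 4), round(d, 4), round(j, 4))  # 0.6931 0.5 1.0
```

Exporting these alongside raw MaxN values is what lets users run ecological analyses directly from the app's output, as the text describes.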

Phase 5: Real-World Testing and Validation
The final phase involves rigorous testing of the software in real-world conditions. Newly collected underwater videos from diverse reef sites are processed through the system, and the AI-generated results are compared against manually verified data to assess accuracy and reliability. Feedback from marine ecologists, students, and conservation practitioners is gathered to further refine the tool's functionality and user experience. Field validation ensures the system is capable of generalizing across different reef types, geographic regions, and environmental conditions.
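Comparing AI-generated results against manually verified data reduces, at its simplest, to a per-species error on the MaxN values. A minimal sketch of such a comparison, with a hypothetical function name:

```python
def maxn_error(ai, manual):
    """Per-species absolute MaxN error between AI output and manual counts.
    A species missed entirely by either source counts as zero on that side."""
    species = set(ai) | set(manual)
    return {s: abs(ai.get(s, 0) - manual.get(s, 0)) for s in species}

errs = maxn_error({"A. percula": 4, "Z. scopas": 2}, {"A. percula": 5})
print(sorted(errs.items()))  # [('A. percula', 1), ('Z. scopas', 2)]
```

Aggregating these errors by reef type or camera model, using the Phase 1 metadata, would show exactly where the model fails to generalize.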

Collaborative Approach and Interdisciplinary Team
The success of this initiative depends on interdisciplinary collaboration. Marine biologists provide ecological insight and data curation; AI specialists develop the detection algorithms; software engineers design the interface; and field researchers validate the system in practice. Each discipline contributes critical expertise, ensuring the final product is scientifically robust, technically sound, and practically useful.

Broader Impact and Conclusion
The project’s broader impact lies in its potential to democratize access to reef monitoring tools. By reducing the time, expertise, and effort required to process underwater video data, this tool can greatly expand the scope and frequency of reef monitoring initiatives. It enables not only more timely ecological assessments but also supports educational initiatives, community science efforts, and data-driven conservation strategies.

In conclusion, this project represents a powerful integration of marine science and artificial intelligence, offering a transformative approach to coral reef monitoring. By automating the identification and counting of reef fish, we aim to significantly reduce the barriers to ecological data collection and empower a wide range of stakeholders with the tools needed to better understand and protect coral reef ecosystems. As climate change and human pressures continue to threaten marine biodiversity, innovations like this are essential to supporting informed, science-based conservation efforts.

Development Goals for the 48-Hour Hackathon:
To develop software or an app that automatically calculates the relative abundance (MaxN) of each species directly from uploaded underwater videos.

Essential team skills:

  • Coral Reef Ecologist

  • Data Scientist

  • UI Designer

  • Software Engineer

  • Data Analyst

  • IT Developer