How can we leverage machine learning, computer vision, and other rapidly advancing technologies to answer ecological questions?
As technology advances, the ability to collect massive amounts of video footage for behavioral research presents a new problem: how can we process and extract usable data from these videos in a reasonable amount of time? To address this, we are developing and training a series of state-of-the-art computer vision models to detect, identify, and track roving herbivores (i.e., surgeonfishes and parrotfishes) in both lateral and top-down video footage of Caribbean coral reefs. Our dataset includes crepuscular footage and visually complex backgrounds, conditions that are traditionally extremely challenging for computer vision models. These models will allow us to automatically identify and track roving herbivores across extensive reef landscapes over full diel cycles, a task that would otherwise be prohibitively costly in time and effort. In doing so, we will be able to characterize large-scale movement and space-use patterns of roving herbivores and understand how these patterns drive the spatial distribution of resources in a coral reef ecosystem.
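To illustrate the tracking step described above, the sketch below shows one common way per-frame detections can be linked into individual tracks: greedy association by bounding-box overlap (intersection-over-union). This is a hypothetical, simplified illustration, not the project's actual pipeline; the function names (`iou`, `update_tracks`) and the IoU threshold are assumptions, and a real system would consume detections from a trained model rather than hand-written boxes.

```python
def iou(a, b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) form."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    if inter == 0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def update_tracks(tracks, detections, next_id, threshold=0.3):
    """Greedily match this frame's detections to existing tracks by IoU.

    tracks: dict mapping track id -> last known box.
    detections: list of boxes from the current frame.
    Unmatched detections start new tracks; returns the updated next_id.
    """
    assigned = set()
    for tid in list(tracks):
        best, best_iou = None, threshold
        for i, det in enumerate(detections):
            if i in assigned:
                continue
            score = iou(tracks[tid], det)
            if score > best_iou:
                best, best_iou = i, score
        if best is not None:
            tracks[tid] = detections[best]  # track continues with new box
            assigned.add(best)
    for i, det in enumerate(detections):
        if i not in assigned:
            tracks[next_id] = det  # a new fish enters the frame
            next_id += 1
    return next_id
```

Run over successive frames, this yields per-individual trajectories, which is the raw material for the movement and space-use analyses the project aims to produce at reef-landscape scale.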