Automated analysis of marine video with limited data

By: Levy D., Belfer Y., Osherov E., Bigal E., Scheinin A.P., Nativ H., Tchernov D., Treibitz T.
Published in: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
SDGs: SDG 14 | Time: 2018
Description: Monitoring the marine environment requires large amounts of data, simply due to its vast size. Underwater autonomous vehicles and drones are therefore increasingly deployed to acquire numerous photographs. However, ecological conclusions lag behind, because the data requires expert annotation and realistically cannot be processed manually. This calls for developing automatic classification algorithms dedicated to this type of data. Current out-of-the-box solutions struggle to provide optimal results in these scenarios because marine data is very different from everyday data. Images taken underwater display low contrast and a reduced visibility range, making objects harder to localize and classify. Scale varies dramatically because of the complex three-dimensionality of the scenes. In addition, the scarcity of labeled marine data prevents training dedicated networks from scratch. In this work, we demonstrate how transfer learning can be utilized to achieve high-quality results for both detection and classification in the marine environment. We also demonstrate tracking in videos, which enables counting and measuring the organisms. We demonstrate the suggested method on two very different marine datasets: an aerial one and an underwater one. © 2018 IEEE.
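The core idea described above, transfer learning when labeled data is scarce, can be sketched conceptually: a pretrained backbone is kept frozen as a feature extractor, and only a small classification head is trained on the few labeled samples available. The sketch below is a minimal, self-contained illustration in NumPy; it is not the authors' code, and the fixed random projection standing in for a pretrained backbone, the synthetic dataset, and the logistic-regression head are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained backbone: a fixed projection whose
# weights are never updated during training (hypothetical, for illustration).
W_backbone = rng.normal(size=(64, 16))

def extract_features(x):
    """Frozen feature extractor with a ReLU nonlinearity."""
    return np.maximum(x @ W_backbone, 0.0)

# Tiny synthetic "labeled marine" dataset: 40 samples, 2 classes.
X = rng.normal(size=(40, 64))
y = (X[:, 0] > 0).astype(float)  # synthetic labels for illustration

F = extract_features(X)

# Train only the head: logistic regression fitted by gradient descent.
w = np.zeros(F.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # predicted probabilities
    w -= 0.5 * (F.T @ (p - y)) / len(y)     # gradient step on head weights
    b -= 0.5 * np.mean(p - y)               # gradient step on head bias

# Training accuracy of the small head on the frozen features.
acc = np.mean(((1.0 / (1.0 + np.exp(-(F @ w + b)))) > 0.5) == (y > 0.5))
```

Because only the small head is optimized, a handful of labeled samples suffices, which is the motivation for transfer learning in this data-scarce setting; in practice the frozen backbone would be a network pretrained on a large everyday-image dataset rather than a random projection.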