Installation of a visual monitoring system for seahorses
This is a pilot installation, still under testing, for the visual tracking of seahorses and other organisms. It provides permanent image and video recording with an adjustable sampling frequency.
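As a minimal sketch of adjustable-frequency sampling, the loop below captures simulated frames at a configurable rate. `AdjustableSampler` and `capture_frame` are hypothetical names; the actual camera driver call is not described in the text.

```python
import time


class AdjustableSampler:
    """Capture frames at a configurable sampling frequency (Hz).

    Hypothetical sketch: `capture_frame` stands in for the real
    camera read, which is not specified in the source text.
    """

    def __init__(self, frequency_hz: float):
        self.set_frequency(frequency_hz)
        self.frames = []

    def set_frequency(self, frequency_hz: float) -> None:
        # The sampling frequency can be changed at any time.
        if frequency_hz <= 0:
            raise ValueError("sampling frequency must be positive")
        self.period = 1.0 / frequency_hz

    def capture_frame(self) -> bytes:
        # Placeholder for the actual camera capture call.
        return b"frame"

    def run(self, duration_s: float) -> int:
        """Sample for `duration_s` seconds; return the frame count."""
        deadline = time.monotonic() + duration_s
        next_shot = time.monotonic()
        while next_shot < deadline:
            self.frames.append(self.capture_frame())
            # Advance by a fixed period so the rate stays stable
            # even if an individual capture runs slightly long.
            next_shot += self.period
            time.sleep(max(0.0, next_shot - time.monotonic()))
        return len(self.frames)
```

Advancing `next_shot` by a fixed period, rather than sleeping a fixed interval after each capture, keeps the long-run rate accurate even when individual captures take variable time.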
Just before the end of 2020, NOUS was installed and, for the first time in the world, live-streamed the seahorse colony at Stratoni, Chalkidiki, Northern Greece, powered by an autonomous solar system. With real-time video we are able to watch the seahorses on artificial ropes, mounted on a special construction in their unique environment.
The main idea has two lines of development. First, we establish a continuous real-time monitoring system that provides an information pipeline from the sensors. Then a platform based on neural networks with a customised structure, trained on local data using machine learning, furnishes efficient, automated, advanced monitoring of the particular species as well as of other critical environmental variations.
The system consists of prototype submarine housings equipped with cameras (day and night IR), lights, a temperature sensor and windshield wipers for the camera lenses. The system can be extended so that a hydrophone connected to a housing provides acoustic data. The whole operation is controlled by a multitasking computing unit. The network of underwater cameras terminates at a submarine hub, which is powered either by cable from the shore or by a surface buoy (the latter drawing power from solar panels). System capacity is estimated at up to 10 connected cameras grouped in two hubs. In addition, the system can be equipped with a hydrological current meter for flow monitoring. Data are transferred over an internet connection, via fibre optics and a GSM link, to a server and the cloud. The deployed system can also collect complete weather data and, in the case of buoys, solar power data, together with statistics for both.
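The wire format for the fibre-optic / GSM uplink is not specified in the text. As an illustration only, a camera node's periodic report could be serialised as a small JSON packet like the hypothetical one below; all field names are assumptions.

```python
import datetime
import json


def build_telemetry_packet(node_id: str, temperature_c: float,
                           frame_count: int) -> str:
    """Serialise one monitoring report for upload to the server.

    Hypothetical sketch: field names and the use of JSON are
    assumptions, not the project's documented wire format.
    """
    packet = {
        "node": node_id,
        "timestamp": datetime.datetime.now(
            datetime.timezone.utc).isoformat(),
        "temperature_c": temperature_c,
        "frames_recorded": frame_count,
    }
    return json.dumps(packet)
```

A self-describing text format like this is easy to extend with optional fields (hydrophone status, current-meter readings, buoy solar output) without breaking older consumers on the server side.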
The complete system will be deployed to observe artificial habitats (e.g. seahorse hotels and ropes). A combination of two cameras will provide close-up and wide-angle distant views, to be compared with the corresponding information from the natural habitats (e.g. seagrass and tube worms). Processed information from the hydrophone will provide early alerts of acoustic disturbance at the sea bottom due to wave energy. Similarly, through flow monitoring we will automatically record the pattern of local circulation of the microorganisms that constitute the main food source of the seahorses.
The proposed system is equipped with Artificial Intelligence capabilities in order to distinguish, classify, associate and perceive significant changes in measurable parameters that are of scientific interest for the aquatic environment, as described above. Machine learning and deep neural networks will implement these algorithms for intelligent processing of information from images, and from spectrograms for sound recognition.
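As an illustration of the sound-recognition path, a magnitude spectrogram can be computed from a hydrophone signal with a short-time FFT before being fed to a classifier. The sketch below uses plain NumPy with generic frame/hop parameters; it is not the project's actual preprocessing, which the text does not detail.

```python
import numpy as np


def spectrogram(signal: np.ndarray, frame_len: int = 256,
                hop: int = 128) -> np.ndarray:
    """Magnitude spectrogram of a 1-D signal via a short-time FFT.

    Generic sketch of the preprocessing step that would precede a
    neural-network sound classifier; frame_len and hop are assumed
    defaults, not project parameters.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    # Slice the signal into overlapping windowed frames.
    frames = np.stack([
        signal[i * hop: i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    # rfft keeps only the non-negative frequency bins
    # (frame_len // 2 + 1 of them per frame).
    return np.abs(np.fft.rfft(frames, axis=1))
```

The resulting 2-D array (time frames by frequency bins) can be treated like an image, which is why convolutional networks trained on images transfer naturally to spectrogram-based sound recognition.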