Whale Safe is a tool that displays both visual and acoustic whale detections in the Santa Barbara Channel and the San Francisco Region. It also includes a blue whale habitat model that predicts the likelihood of blue whale presence.
The tool ranks vessels and shipping companies according to their rates of cooperation with NOAA’s voluntary speed restrictions.
Whale Safe draws on three data streams: an acoustic monitoring buoy listens for the songs of blue, fin, and humpback whales, identifies them with an algorithm, and beams its findings to a satellite; a mathematical model informed by current and historical oceanographic and biological data predicts where blue whales are most likely to be; and trained observers and citizen scientists report whale sightings through an app called Whale Alert.
Whale Safe’s platform integrates these data sources and alerts ships to their likelihood of encountering whales that day.
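To make the integration step concrete, here is a minimal sketch of how three such streams might be blended into one daily rating. Everything in it is an illustrative assumption, not Whale Safe's actual algorithm: the weights, thresholds, normalization scales, and the function name `presence_rating` are all made up for the example.

```python
# Hypothetical sketch: blending three whale-detection data streams into a
# single daily presence rating. Weights, thresholds, and scales are
# illustrative assumptions, not Whale Safe's published method.

def presence_rating(acoustic_detections: int,
                    model_probability: float,
                    sightings: int) -> str:
    """Map the three inputs to a coarse rating string."""
    # Normalize each stream to a 0-1 score (assumed scales).
    acoustic_score = min(acoustic_detections / 5, 1.0)   # buoy detections today
    sighting_score = min(sightings / 3, 1.0)             # reported sightings
    # Weighted blend; the weights are invented for illustration.
    score = (0.4 * acoustic_score
             + 0.35 * model_probability
             + 0.25 * sighting_score)
    if score >= 0.75:
        return "very high"
    if score >= 0.5:
        return "high"
    if score >= 0.25:
        return "medium"
    return "low"

print(presence_rating(acoustic_detections=4, model_probability=0.6, sightings=1))
```

The point of a blend like this is that no single stream has to be right on its own: a quiet buoy on a day the habitat model is confident still produces a meaningful rating.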
This tool certainly opens a door for using AI to distinguish subtle differences in sound and surface trends from them. One of AI's clear strengths is detecting patterns in data sets that are too large for a human to analyze efficiently, yet too abstract for a simple rule-based program to recognize.
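The core idea of finding a known sound buried in noise can be illustrated without any machine learning at all: slide a known template (a stand-in for a whale-call signature) across a noisy recording and look for a correlation peak. A real detector would work on spectrograms with a trained model; this sketch, with an invented synthetic signal, just shows why the pattern is recoverable even when no human could spot it in the raw trace.

```python
# Minimal illustration of pattern detection in a signal too noisy to eyeball:
# matched filtering via cross-correlation. The "call" and recording are
# synthetic stand-ins, not real whale audio.
import numpy as np

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 8 * np.pi, 200))   # stand-in "whale call"
recording = rng.normal(0, 1.0, 5000)                # loud background noise
recording[3000:3200] += template                    # call hidden at sample 3000

# Correlate the template against every offset; the peak marks the call.
corr = (np.correlate(recording, template, mode="valid")
        / np.linalg.norm(template) ** 2)
best = int(np.argmax(corr))
print("best match near sample", best)
```

Per-sample, the call is no louder than the noise, yet correlating over the whole template makes the match stand out cleanly, which is the same leverage a learned acoustic classifier exploits at much larger scale.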
I'm immediately reminded of a device called the Raspberry Shake, which lets individuals listen in on seismic waves. I wonder whether the same principle used here, picking up sound waves underwater and sorting them into recognizable patterns with AI, could be applied on land to seismic waves. Maybe there's a way to save people in crisis situations: to locate those trapped in rubble, or to warn people of an emergency before it's too late to evacuate.
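For seismic data there is already a classic first step along these lines: the short-term-average / long-term-average (STA/LTA) trigger, which flags moments when recent signal energy jumps relative to the background. The window lengths, threshold, and synthetic trace below are illustrative choices, not tuned values for any real sensor.

```python
# Sketch of a classic STA/LTA event trigger, the standard first pass for
# flagging events in a seismic stream such as a Raspberry Shake's.
# Window lengths and the threshold are illustrative, not tuned values.
import numpy as np

def sta_lta(signal: np.ndarray, sta_len: int, lta_len: int) -> np.ndarray:
    """Ratio of short-term to long-term moving average of signal energy."""
    energy = signal ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return sta / np.maximum(lta, 1e-12)  # guard against division by zero

# Synthetic example: quiet noise with a short burst (the "event") in the middle.
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.1, 2000)
trace[1000:1100] += rng.normal(0, 1.0, 100)  # the burst

ratio = sta_lta(trace, sta_len=50, lta_len=500)
print("event detected:", bool(ratio.max() > 3.0))
```

An AI layer would sit on top of a trigger like this, classifying what kind of event was flagged rather than just that one occurred.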
The system also gives so-called "citizen scientists" a chance to participate: ordinary people submit their own observations, which feed into and improve the data behind a given map. Think of services like Waze! This kind of practice can produce more accurate data simply by tapping one of the most abundant sensing resources spread across the planet: humans.
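The basic mechanics of that crowdsourcing are simple to sketch: bin user-submitted sightings into lat/lon grid cells and count reports per cell, so many independent reports reinforce one another into a hotspot. The coordinates, grid size, and helper name `grid_cell` below are all made up for illustration; this is not Whale Alert's actual pipeline.

```python
# Toy sketch of crowd-sourced sighting aggregation: snap each report to a
# lat/lon grid cell and count reports per cell. Coordinates and cell size
# are invented for illustration.
from collections import Counter

def grid_cell(lat: float, lon: float, cell_deg: float = 0.1) -> tuple:
    """Snap a coordinate to the corner of its grid cell."""
    return (round(lat // cell_deg * cell_deg, 4),
            round(lon // cell_deg * cell_deg, 4))

# Hypothetical reports: (latitude, longitude) pairs near the
# Santa Barbara Channel.
reports = [(34.25, -119.84), (34.26, -119.81), (34.21, -119.95), (34.26, -119.83)]

counts = Counter(grid_cell(lat, lon) for lat, lon in reports)
hotspot, n = counts.most_common(1)[0]
print(f"hotspot cell {hotspot} with {n} reports")
```

A single report can be mistaken, but three reports landing in the same cell are much harder to dismiss, which is exactly the redundancy that makes human sensors valuable.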