Google built an AI model for understanding dolphins


What if artificial intelligence could finally help us chat with dolphins? Thanks to a new collaboration between Google, the Georgia Institute of Technology, and the Wild Dolphin Project (WDP), researchers are taking a major step toward that goal.
By blending decades of marine biology with machine learning, Google has developed an AI that talks to dolphins, or at least one that starts to understand and mimic their remarkably complex vocalizations. Google calls the new model DolphinGemma, and it is built on Gemma, the lightweight model family that shares technology with Google's Gemini AI.
But instead of parsing human text, DolphinGemma is trained to decode and generate dolphin sounds like whistles, clicks, and burst pulses. These aren't just noise, of course; they're the signals dolphins use to identify each other, coordinate, play, and even squabble.
Over 40 years of WDP research in the Bahamas has produced one of the richest audio datasets of wild dolphin communication in the world, making it the perfect foundation for training an AI that talks to dolphins.
The system works in tandem with a companion tool called CHAT, short for Cetacean Hearing Augmentation Telemetry. CHAT runs on modified Google Pixel phones and is designed to emit synthetic dolphin-like sounds. At the moment, the AI isn't really meant to translate full dolphin sentences; the aim is to create a shared vocabulary so we can talk to dolphins more easily.
Researchers are currently teaching dolphins new synthetic whistles that represent familiar objects, like seagrass or scarves. The goal is to get dolphins to recognize and even “ask” for these objects using their new codewords. This should allow us to better understand dolphins and what they want or need.
DolphinGemma processes incoming dolphin vocalizations and then predicts which sounds are likely to follow, much like predictive text works on our smartphones, only tailored to cetacean communication. By building this kind of back-and-forth, researchers hope to open up basic interspecies dialogue, a feat that's long been out of reach.
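To make the predictive-text comparison concrete, here is a deliberately tiny sketch in Python. It is not DolphinGemma's code (the real model is a large audio model trained on WDP's recordings); the sound labels and the simple frequency counting below are invented purely to illustrate the idea of predicting which sound is likely to come next in a sequence.

```python
from collections import Counter, defaultdict
from typing import Optional

# Toy illustration only: a tiny next-sound predictor over made-up dolphin
# "sound tokens". The labels and logic are hypothetical, not DolphinGemma's.

# A pretend recording, already segmented into discrete sound units.
recording = ["signature_whistle_A", "click_train", "burst_pulse",
             "signature_whistle_B", "click_train", "burst_pulse",
             "signature_whistle_A", "click_train", "whistle_rise"]

# Count which sound tends to follow which (a simple bigram frequency model).
following = defaultdict(Counter)
for current, nxt in zip(recording, recording[1:]):
    following[current][nxt] += 1

def predict_next(sound: str) -> Optional[str]:
    """Return the most frequently observed follow-up to a given sound, if any."""
    counts = following.get(sound)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("click_train"))          # -> 'burst_pulse'
print(predict_next("signature_whistle_A"))  # -> 'click_train'
```

A real model does this with a neural network over huge amounts of audio rather than simple counts, but the basic job is the same: given what was just heard, guess what comes next.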
Sure, it might be years before we're holding full conversations with our aquatic friends. But this is the most promising attempt yet at creating AI that talks to dolphins. It's a remarkable blend of biology and technology that could deepen our understanding of intelligent life on Earth and reshape the future of communication.