Some 400 years after the original Mayflower sailed across the Atlantic Ocean, its unmanned robotic descendant has completed the first transatlantic crossing guided solely by its own decision-making.
After seven years of planning and 40 days at sea, the Mayflower Autonomous Ship (MAS400) finally pulled into Halifax, Nova Scotia, on June 5 after a 3,500-mile journey from Plymouth, UK. Originally headed to Washington, DC, the ship — which is propelled by a solar-driven hybrid electric motor and a backup diesel generator, and guided by artificial intelligence, cloud, and edge computing technologies — diverted to Canada last week so the team could fix a faulty generator starter. Later this month, it will continue to Plymouth, Massachusetts, where the first Mayflower landed in 1620, before arriving in DC in July.
“I’m both relieved and elated that we have her in Halifax. It’s not the port we intended to make, but ‘any port in a storm,’ as the saying goes,” says MAS400 managing director Brett Phaneuf, who also serves as president of the Submergence Group, a UK firm that designs and manufactures manned and unmanned submersibles. “The journey she made across was arduous and has taught us a great deal about designing, building, and operating a ship of this nature and the future of the maritime enterprise — made bearable by the fantastic team that came together to realize this goal.”
The venture, whose cost is undisclosed, is a collaboration between ProMare, the ocean research nonprofit Phaneuf co-founded in 2001, and IBM Research. It encompasses a multicultural team spanning 10 countries and three continents, along with four dozen business and academic partners. The 10,000-lb., 50 x 20-foot vessel goes beyond established automated, remote-controlled, and preprogrammed missions by making real-time decisions at sea with no human intervention (though humans can override in emergencies). The boat avoids hazards, assesses vehicle performance, replans routes, and copes with other novel situations entirely on its own.
Beyond achieving this once-seemingly impossible feat, the Mayflower also conducted an array of environmental science experiments in remote parts of the ocean. Its findings will help scientists gauge the impact of global warming and pollution on marine life, including water acidification, microplastics, and marine mammal populations. And its success may pave the way for flexible, cost-efficient fleets with low carbon footprints gathering ocean data, while its software could be applied to manned ships to reduce risks and human error. Indirectly, MAS findings could aid the development of autonomous AI systems and augmented intelligence for humans across other industries such as shipping, oil and gas, telecommunications, security and defense, finance, and aquaculture.
While the Mayflower tackled the Atlantic Ocean, a number of autonomous long-haul experiments involving research, commercial, and military ships recently succeeded in the Pacific. Among them, Leidos’ Sea Hunter completed a 5,000-mile round trip between San Diego and Hawaii in 2019 as part of a US Navy project; the Saildrone Surveyor research vessel last year finished a 2,250-mile journey from San Francisco to Hawaii; and just last week, the Hyundai Heavy Industries merchant vessel Prism Courage completed a 6,200-mile trip from the US to South Korea, using autonomous navigation for half its voyage.
“The ocean is merciless, which is part of the reason why we want to go to AI systems,” says Phaneuf. “We want to send these things for very long periods to disparate parts of the ocean, and not have to worry if someone gets hurt, bored, tired, lost, or if the ship sinks.” A similarly outfitted manned vessel might need quadruple the space, “mostly for stuff to keep humans alive, which consumes a lot of energy.”
An electric tongue and talking whales
The main challenge with the Mayflower design was configuring the technology to provide the continuous stream of data the ship needs to react immediately on its own. “It’s loaded for bear,” Phaneuf says with a laugh, alluding to the suite of instruments. The Mayflower sports six AI-powered cameras and more than 30 sensors covering three weather stations, technology for science experiments, and a visualization system to recognize obstacles like standup paddleboarders, other ships, and icebergs. They include radar, sonar, LIDAR, GPS accurate to within centimeters, stabilized 360-degree day and night cameras, thermal imagers, and gauges for motion, fuel, wind, wave height and pattern, and aquatic chemistry. That information feeds into the AI Captain, which uses IBM’s Operational Decision Manager decision-making software to guide navigation and analysis, amounting to a grand experiment in machine learning.
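To give a flavor of how a rule-driven captain turns fused sensor readings into navigation actions, here is a deliberately simplified sketch. The data fields, thresholds, and action names are illustrative assumptions for this article, not MAS400’s actual software or IBM’s Operational Decision Manager rules:

```python
from dataclasses import dataclass

# Hypothetical snapshot of fused sensor data; fields and units are
# illustrative, not the real MAS400 telemetry schema.
@dataclass
class SensorReading:
    obstacle_range_m: float  # nearest obstacle from radar/LIDAR/cameras
    wave_height_m: float     # from the wave-height gauge
    fuel_pct: float          # from the fuel gauge

def decide(reading: SensorReading) -> str:
    """Pick a navigation action from the latest sensor snapshot.
    Rules fire in priority order; thresholds are made up for illustration."""
    if reading.obstacle_range_m < 500:
        return "evade"            # hazard avoidance takes priority
    if reading.wave_height_m > 6.0:
        return "replan_route"     # reroute around heavy weather
    if reading.fuel_pct < 10.0:
        return "divert_to_port"   # e.g., an unplanned stop like Halifax
    return "hold_course"

print(decide(SensorReading(obstacle_range_m=300, wave_height_m=2.0, fuel_pct=80)))
# prints "evade"
```

A real system evaluates many more inputs concurrently and must also handle conflicting rules and sensor failures, but the basic loop — sense, evaluate rules, act, repeat — is the same shape.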
“How does the system cope with new data?” Phaneuf says. “And whether or not it does so successfully, can it cope with those situations? Can it learn?”
One of the more novel instruments, Hypertaste, is a kind of “electronic tongue” that collects chemical, biological, and environmental DNA information. This tool — which IBM adapted from the food and drink industry — autonomously analyzed water quality along the route to determine how changing environments affected the growth of plankton, the microscopic organisms that form the base of the marine food web. Hypertaste measured pH and iron concentrations to assess nutrients, salinity, and chemical compositions.
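A minimal sketch of what interpreting such water-chemistry readings might look like downstream. The function name, thresholds, and flags here are assumptions for illustration only, not IBM’s Hypertaste implementation:

```python
# Typical surface-seawater pH range; values below it can signal acidification.
OPEN_OCEAN_PH_RANGE = (7.8, 8.3)

def classify_sample(ph: float, iron_nmol_per_l: float) -> dict:
    """Flag conditions relevant to plankton growth in one water sample.
    Thresholds are illustrative assumptions, not calibrated science."""
    low_ph, _high_ph = OPEN_OCEAN_PH_RANGE
    return {
        "acidified": ph < low_ph,               # possible acidification signal
        "iron_limited": iron_nmol_per_l < 0.2,  # iron scarcity can limit plankton
    }

print(classify_sample(ph=7.6, iron_nmol_per_l=0.1))
# prints {'acidified': True, 'iron_limited': True}
```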
“This mission has been about showcasing what’s possible in difficult-to-reach locations,” says Rosie Lickorish, a UK-based IBM lead research scientist who helped design Hypertaste. “Not a huge amount is known about the interior of the ocean. Some of these areas are very difficult to access with traditional research cruises.”
The ship also includes a holographic microscope to count and image particles in water, as a means of detecting microplastics and plankton. A hydrophone captured and recorded whale and dolphin sounds to gauge their population distributions. “They’re very difficult to study at the best of times,” adds Lickorish. “You need really good mechanisms to be able to detect and identify the vocalizations.”
“Hold my beer”
The genesis of the MAS400 occurred in 2016 when Phaneuf attended a meeting of like-minded technologists to discuss ways to recognize the 400th anniversary of the 1620 Mayflower voyage. Unimpressed with simply retracing the route in a replica ship — a feat he said had already been tackled in 1954 — Phaneuf suggested a more futuristic approach with an autonomous version.
“People just rolled their eyes,” he says. The response only ramped up his determination to see it through. “It was sort of a ‘hold my beer’ kind of thing.”
Phaneuf hired a naval architect to draft concept renderings, corralled some interested people, and did a little crowdfunding, but the project still crept along at a glacial pace. “I think a lot of it was that people were in disbelief, like, ‘It can’t be done,’” he says. “To be honest, we weren’t sure it could be done.”
The Mayflower’s guardian angel came in the form of IBM Systems strategist Eric Aquaronne, a France-based engineer who was an enthusiastic participant from the beginning and crucial in getting IBM signed on as a technical partner in 2020. He also tapped colleagues worldwide to help develop more robust data-processing models and repurpose IBM software for this project.
But the road to innovation is never smooth. The Mayflower had a false start last summer when a broken piece of hardware forced the vessel to turn back 100 miles into its journey, and pandemic-related supply chain issues slowed the part’s replacement. Rather than brave hurricane season, the team waited another nine months to depart. This year, it set off on April 27, only to stop in Portugal two weeks later to refuel and fix a generator switch. On May 30, the team rerouted it to its current berth in Canada for more repairs.
Still, these are minor glitches for a project of this complexity. Its success has emboldened the team to consider future voyages that step up the ship’s machine learning capabilities, eventually run entirely on renewable energy, and possibly share data with NASA’s Earth science monitoring programs. Meanwhile, IBM will start deciphering the Mayflower’s brain with an eye toward augmented intelligence in other areas, such as increasing transparency in financial services transactions or preventing supply chain disruptions.
“The completion of this first transatlantic voyage means we can start analyzing data from the ship’s journey, dig into the AI Captain’s performance, and understand why it made the decisions it did,” says Rob High, an IBM Fellow who serves as CTO of IBM Software’s Networking and Edge Computing division.
Yet the mission’s far-flung technological applications should not overshadow its communal aspect — the shared spirit of adventure that comes from a global effort pushing the boundaries of exploration. To that end, as Phaneuf looks to the future, he’s mindful of its namesake’s contentious past. Heightened sensitivities surrounding the settlers’ treatment of indigenous tribes prompted MAS400 to partner with Mayflower400, an educational nonprofit dedicated to a more inclusive commemoration of its journey and controversial legacy of colonization.
“Our inspiration is not the Pilgrim voyage or what happened after they got there,” says Phaneuf. “It’s this idea that they looked out at the ocean with these rickety old ships and thought, ‘Let’s go for it!’ They jumped off into this unknown with very low chances of survival and took that leap regardless of the outcome. That’s what I find aspirational.”