There are four ways drones typically navigate: they use GPS or other beacons, they accept guidance instructions from a computer, they navigate off a stored map, or they are flown by an expert pilot.
What do you do when none of the four is possible?
You put AI on the drone and it flies itself with no outside source of data, no built-in mapping, and no operator in control.
At least, that’s what Exyn Technologies says it’s doing with ExynAI: allowing drones to function with no GPS, no radio communications, and no stored map. The goal is to enable drones to work where humans can’t, the company says, including underground in active mines.
The claim: Exyn has the first industrial drone that flies itself anywhere.
“It’s having robots do some of the work that folks are doing underground right now, which puts them potentially in risky situations,” Exyn CEO Nader Elm told me recently on the TechFirst podcast. “Not only that, but also now we can actually provide a lot of insight into what’s happening in mining operations generally, kind of building maps and giving them really high-fidelity real-time data on what the geometry of the environment looks like.”
Exyn recently signed a deal with Sandvik Group to provide autonomous drones for mining companies. The idea is to send a drone into a just-blasted area: part of an active, working mine, but also a brand-new section of it, which makes it potentially hazardous for humans, with cave-ins, settling, or other dangers imminent. And because it's brand-new, it's also unmapped.
The drone flies in, potentially hundreds of meters below ground, without access to either GPS or radio communications, and maps the mine while also returning data on safety and accessibility conditions.
“These are often very dangerous environments, these are environments where they’ve typically just drilled, packed explosives, and set off an explosion,” Elm says. “So it’s inherently dangerous. And that’s why we send in these robots to figure out what the geometry of that space looks like.”
Exyn builds the intelligence that forms the drone’s brain, but the drones it deploys also need sensors. Because there’s no GPS, beacon, or external guidance, the drones use lidar, sonar, radar, and other sensing technologies, Elm says, to sense a three-dimensional space, build a map, understand their own position in that space, and then navigate to additional locations to fully explore the space.
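Exyn hasn’t published how its mapping works, but the general “sense ranges, build a map” idea can be illustrated with a toy two-dimensional occupancy grid. The `mark_scan` helper, the cell size, and the `'free'`/`'occupied'` labels below are illustrative assumptions, not Exyn’s implementation; a real system would also estimate the drone’s pose jointly with the map rather than assume it is known.

```python
import math

def mark_scan(grid, pose, scans, cell=1.0):
    """Update a toy 2D occupancy grid from one lidar scan at a known pose.

    grid  : dict mapping (ix, iy) cell indices -> 'free' or 'occupied'
    pose  : (x, y) drone position (assumed known here; real SLAM systems
            estimate pose and map together)
    scans : list of (angle_in_radians, measured_range) returns
    """
    px, py = pose
    for angle, rng in scans:
        # The beam endpoint is an obstacle surface (e.g. a mine wall).
        hx = px + rng * math.cos(angle)
        hy = py + rng * math.sin(angle)
        # Cells the beam passed through on the way are free space.
        steps = max(1, int(rng / (cell * 0.5)))
        for s in range(steps):
            t = s / steps * rng
            fx = px + t * math.cos(angle)
            fy = py + t * math.sin(angle)
            grid.setdefault((int(fx // cell), int(fy // cell)), 'free')
        grid[(int(hx // cell), int(hy // cell))] = 'occupied'
    return grid

grid = {}
# Drone at the origin, one beam straight ahead hitting a wall 4 m away.
mark_scan(grid, (0.0, 0.0), [(0.0, 4.0)])
print(grid[(4, 0)])  # occupied
print(grid[(1, 0)])  # free
```

Each scan both extends the map (new occupied cells) and confirms traversable space (free cells), which is what lets the drone plan where it can safely fly next.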
Listen to the interview for this story on the TechFirst podcast.
The intelligence part is called ExynAI. It offers sensor fusion, combining data from all the sensors, as well as the ability to break an operator-driven mission objective down into a series of tasks. The system then uses the data obtained from mapping its environment to plan and execute its next steps, constantly recalculating where it is and where it needs to go. All the data is saved (about three gigabytes per minute of flight time) and can be downloaded when the mission is over.
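The “plan the next step from the map so far” loop Elm describes resembles frontier-based exploration, a standard robotics technique: repeatedly fly toward known free space that borders unexplored space until no such frontier remains. A minimal sketch, using a hypothetical `next_waypoint` helper over a toy grid of `'free'`/`'occupied'` cells; this is not ExynAI’s actual planner:

```python
def next_waypoint(grid, pose):
    """Pick the nearest frontier cell: free space adjacent to unknown space.

    grid : dict mapping (ix, iy) -> 'free' or 'occupied'; cells absent
           from the dict are unexplored
    pose : (x, y) current drone position
    Returns the nearest frontier cell, or None when exploration is done.
    """
    frontiers = []
    for (ix, iy), state in grid.items():
        if state != 'free':
            continue
        neighbors = [(ix + 1, iy), (ix - 1, iy), (ix, iy + 1), (ix, iy - 1)]
        if any(n not in grid for n in neighbors):  # borders unknown space
            frontiers.append((ix, iy))
    if not frontiers:
        return None  # fully explored: mission complete
    px, py = pose
    return min(frontiers, key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)

# A small explored patch around the origin, with open space to the east.
grid = {(0, 0): 'free', (1, 0): 'free', (0, 1): 'free',
        (-1, 0): 'free', (0, -1): 'free', (2, 0): 'free'}
print(next_waypoint(grid, (0.0, 0.0)))  # (1, 0)
```

Recomputing the frontier after every scan is what makes the behavior adaptive: as the map grows, the set of worthwhile destinations shrinks, and the mission ends on its own when nothing unexplored remains reachable.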
Exyn isn’t the only company working in the space.
Others working on, using, and in some cases releasing similar technology include Near Earth Autonomy, NTrobotics, Emesent, and Flyability. In some cases these companies focus on mining or other dangerous industrial activities, in others they provide drones for inspection services. In all cases they’re working on enabling drones to work in remote and communications-restricted environments.
It’s challenging to determine who was first with what, since “first” claims often hinge on very specific circumstances, niche use cases, or proprietary technologies, but the clear implication is that the market for industrial-grade autonomous drones is growing fast.
And not just industrial uses.
Safety, search and rescue, and civic uses are developing.
“Think of a first responder situation,” Elm says. “There is a building which is too dangerous to go into and search and rescue is there … how do we enable them to be more effective in what they do while being safe? And that’s where robotics really lends itself well … in a search and rescue exercise, the search part happens to be the most dangerous part, where you’ve got teams of people going into a building which is structurally unsafe, and they’re doing nothing but trying to find people and survivors. And then once [the drones] find them, then you can send in other teams to go and extract them and do whatever triage is necessary.”
Next year, Exyn says, it will release more technology that will enable multiple robots and drones to communicate among themselves, and with humans. That gets complicated, of course, but it enables valuable new capabilities: quicker mapping, more intelligent search, collaboration, and more.
The hardest part is probably figuring out how to keep humans in the loop.
Elm says that autonomous robotics is, in a way, in a state very similar to the first iPhone right now. The hardware is there, the platform is available, but the software is still lacking. The App Store hasn’t yet been invented.
That, he says, is where the innovation will really be needed in coming years.
“We’ve got to start thinking about the software that goes onto this and the applications we need to apply it to … I think that’s really limited by our own imagination right now,” he told me. “We’ve got to think about the interface between the user and human and the systems, because it’s difficult enough right now — us interacting with one robot — now you just imagine you’ve got to try and interact with what, a dozen to a hundred robots.”
That sounds like a call for developers, developers, developers.
One thing they’ll need from the industry: a stable platform to develop on. In mobile, they have two of them in iOS and Android. In autonomous drones, it’s not clear what that platform will be.
See a full transcript of our conversation here.