In the near future, an Air Force pilot’s wingman could be flown by artificial intelligence.
The two might fly side-by-side in a highly contested war zone, where the AI aircraft not only takes the lead but begins making choices. Does it fire missiles or drop bombs before its human counterpart? Probably, because a human's response lags behind that of a machine, which can detect and react immediately if necessary.
It sounds very much like the Air Force’s proposed Loyal Wingman program. But the service wants to take the concept even further, with an AI plane that trains with its pilot, acting as a sidekick. In short, it’s R2-D2 from “Star Wars” in an aircraft of its very own. The Air Force foresees training generations of AIs and pilots together.
Welcome to Skyborg, an Air Force Research Lab program aimed at pairing AI with a human in the cockpit so the machine can learn how to fly, according to Dr. Will Roper, assistant secretary of the Air Force for Acquisition, Technology and Logistics.
Skyborg for now is all about developing the AI itself. But Roper said the lab plans to experiment with putting it in BQM target drone variations; the XQ-58A Valkyrie, which the service just flew as a long-range, high subsonic unmanned aerial vehicle; and QF-16 converted unmanned fighters.
Someday, the AI could graduate into a manned aircraft, he said. That could include the pilot talking to it, much like Amazon's Alexa or Apple's Siri.
“I might eventually decide, ‘I want that AI in my own cockpit,'” Roper said during a gaggle with reporters Wednesday at a conference sponsored by McAleese and Associates and Credit Suisse in Washington, D.C. “So if something happened immediately, [the AI] could take hold, make choices in a way that [a pilot would] know because [a pilot has] trained with it.”
The Air Force plans to have “real operational demonstrations” of Skyborg in the next few years. The money has already been allocated from past budgets for experimental testing, Roper said.
The service is also relying on partnerships with universities to develop the AI.
Roper downplayed the prospect that the AI might ever become fully autonomous, bypassing the human completely. It will simply study how a human senses, responds, thinks and decides in order to learn, he said.
“I’m not a proponent of that future where everything is turned over to AI because machine learning right now is just a bunch of data connected and the ability to process through those ‘neuronets’ quickly enough to map an event that you’re seeing in the historical database,” he said.
Once operational in the battlespace, the AI could collect hours of data — exposing it to new information and opportunities to learn.
Roper explained this is how the AI establishes a learning curve to get ahead of an adversary, and it will pass down its “thinking” mechanisms to future fleets of aircraft.
“Imagine having trained person after person, generation after generation … what if, once you get on the curve, what if it’s exponential? And whoever gets on it first has an advantage forever?” he said. “I don’t want China on that curve, I want us on that curve, and us accelerating ahead of the pack.”
The Air Force is also exploring how the AI shares knowledge. For example, if the service puts the AI into a QF-16, can it operate only on that model?
“Learning is taking place on one platform to be shared across a fleet of those platforms … [but] you may want to share that across the entire Air Force,” Roper said. “I would love to have an Air Force that did that.”
He said the technology may even make its way into Air Education and Training Command’s Pilot Training Next program, a first-of-its-kind study testing the ability of students to learn faster and absorb more through cutting-edge technology aids and simulations.
“Young pilots don’t view this as competing [with] a human pilot; they view this as something that will make [being] a human pilot a more exciting job to have,” Roper said. “And our young pilots in the Pilot Training Next program are super excited about training an AI pilot.”
He added, “I’d like to get it in a simulator so that it can fly with our pilots and against our pilots.”
Roper does question how much autonomy is too much.
Based on the AI’s response, that’s “how we’ll know how much we trust, and how much [responsibility] we’re willing to turn over to it,” he said. “That technology moves so quickly … and the technology will be changing faster than the [ethical] policy will ever be able to keep up.”
And people may be skeptical. A “smart” robot scrubbing paint off an old airplane is one thing. But putting AI into a combat operational airplane is another, Roper said.
“We’re finding it tougher to get into new programs, so what we’re doing is [putting] AI into existing programs and hoping this will force us to determine, ‘How do we use this technology?'”
Roper said the system could go on expendable aircraft, a program the Air Force is studying for future use.
“It starts by getting AI in the operators’ hands,” he said. “Believe me, this will be cool and awesome, and I hope I’m in this job to see it happen.
“You can’t spell Air Force without A [and] I,” Roper added. “It’s going to make being a pilot more exciting.”
— Oriana Pawlyk can be reached at email@example.com. Follow her on Twitter at @Oriana0214.
© Copyright 2020 Military.com. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.