Thursday, January 18, 2018

Helicopter - Automatic transport of supplies - AW&ST


I repeat what I have written before: This is far from new. The K-Max operated for the US Marines for a couple of years in Afghanistan with great success. (Ed.)

The flight demonstrations on a bright, cold December day at an urban training range in Virginia make it look easy. The unmanned helicopter ferries supplies between landing sites at U.S. Marine Corps Base Quantico as serenely as a duck crossing a pond. But an array of screens in a tent near the simulated village shows how hard the autonomy system is paddling to make the flights look effortless.
A Marine on a rise overlooking the landing zone enters a resupply request on a simple tablet interface. A few miles away, the aircraft—with no prior map of the area and knowing only its takeoff and landing coordinates, a departure heading and any no-fly zones entered by the operator—autonomously plans its delivery route.
  • ONR’s five-year AACUS autonomy program culminates in cargo-resupply flight demo
  • Aurora Flight Sciences developed an autonomy kit portable between rotary-wing types
  • Services planning operational experiments to mature technology for future missions
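AACUS's internal data formats are not published in the article, but a minimal sketch of the pre-launch planning request described above (nothing more than coordinates, a departure heading and operator-entered keep-out zones, with no prior map) might look like the following; all field names and values here are hypothetical:

from dataclasses import dataclass, field

@dataclass
class NoFlyZone:
    """Circular keep-out area entered by the ground-station operator (hypothetical format)."""
    lat: float
    lon: float
    radius_m: float

@dataclass
class ResupplyRequest:
    """Everything the route planner is given before launch; there is no prior map of the area."""
    takeoff_lat: float
    takeoff_lon: float
    landing_lat: float
    landing_lon: float
    departure_heading_deg: float                 # initial climb-out direction
    no_fly_zones: list = field(default_factory=list)

# An illustrative request of the kind the operator reviews and approves before launch.
request = ResupplyRequest(
    takeoff_lat=38.522, takeoff_lon=-77.305,     # notional coordinates near Quantico
    landing_lat=38.531, landing_lon=-77.318,
    departure_heading_deg=270.0,
    no_fly_zones=[NoFlyZone(lat=38.527, lon=-77.310, radius_m=500.0)],
)
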
The ground-station operator reviews and approves the route and launches the helicopter. As it lifts off and climbs out, laser sensors scan for terrain, trees, wires and other obstacles. Detecting a tree line in its path, the helicopter climbs to a higher, safer altitude before turning toward its destination.
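As a rough illustration of the climb-out behavior just described (not the actual AACUS planner, and with an invented clearance margin), the vertical-avoidance decision can be reduced to a few lines:

def adjust_cruise_altitude(planned_alt_m, sensed_obstacle_tops_m, clearance_m=30.0):
    """Raise the planned altitude if any sensed obstacle, such as a tree line,
    intrudes on the required vertical clearance along the route ahead."""
    if not sensed_obstacle_tops_m:
        return planned_alt_m
    required_alt_m = max(sensed_obstacle_tops_m) + clearance_m
    return max(planned_alt_m, required_alt_m)

# A tree line sensed at 22 m forces a climb from a planned 40 m to 52 m.
print(adjust_cruise_altitude(40.0, [15.0, 22.0]))   # -> 52.0
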
En route and scanning continuously, the aircraft makes small adjustments to its planned flight to stay clear of detected obstacles. At the landing zone, the waiting Marine spots a hazard, perhaps a person crossing the cleared area, and hits the wave-off button on his tablet. The helicopter autonomously executes a 360-deg. turn, returns to its planned route, and receives the all-clear to proceed.
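The wave-off interaction is a good example of supervisory control: the user sends only high-level commands, and the aircraft decides how to fly them. A toy state machine for that exchange (the command names and states are invented for illustration) could be:

from enum import Enum, auto

class ApproachState(Enum):
    EN_ROUTE = auto()
    ORBITING = auto()        # flying a 360-deg. turn after a wave-off
    CLEARED = auto()

def handle_user_command(state, command):
    """Supervisory logic only: the vehicle, not the user, plans the actual maneuver."""
    if command == "wave_off" and state is ApproachState.EN_ROUTE:
        return ApproachState.ORBITING            # break off, orbit, rejoin the planned route
    if command == "all_clear" and state is ApproachState.ORBITING:
        return ApproachState.CLEARED             # resume the approach
    return state

state = ApproachState.EN_ROUTE
state = handle_user_command(state, "wave_off")   # the Marine spots a hazard in the zone
state = handle_user_command(state, "all_clear")  # the zone is clear again
print(state)                                     # -> ApproachState.CLEARED
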
Approaching the village, the forward laser sensor focuses on the landing zone, mapping walls and other potential obstacles, assessing the slope and roughness of the terrain, and then the system autonomously adjusts the touchdown point for a safe landing. The Marine sees and approves the adjustment on his tablet.
On final approach, the system detects a small obstacle in the landing zone—a Pelican case—and moves the helicopter sideways so the object is safely outside of the rotor disk as it touches down. Someone runs out to unload the aircraft and then, when the zone is clear, the Marine on the rise hits a button and sends it home.
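The sideways shift on final can be pictured as simple geometry: if a detected object falls inside the rotor disk at the planned touchdown point, slide the point directly away from the object until the disk plus a margin is clear. The sketch below assumes a roughly 7.3 m UH-1H rotor radius and an invented 2 m margin:

import math

ROTOR_RADIUS_M = 7.3        # approximate UH-1H main-rotor radius
SAFETY_MARGIN_M = 2.0       # assumed extra clearance

def shift_touchdown(touchdown_xy, obstacle_xy,
                    rotor_radius=ROTOR_RADIUS_M, margin=SAFETY_MARGIN_M):
    """Move the touchdown point away from a detected object, such as a Pelican case,
    so the object ends up safely outside the rotor disk."""
    dx = touchdown_xy[0] - obstacle_xy[0]
    dy = touchdown_xy[1] - obstacle_xy[1]
    dist = math.hypot(dx, dy)
    required = rotor_radius + margin
    if dist >= required:
        return touchdown_xy                      # already clear, no adjustment needed
    if dist == 0.0:
        dx, dy, dist = 1.0, 0.0, 1.0             # object dead center: pick a direction
    scale = required / dist
    return (obstacle_xy[0] + dx * scale, obstacle_xy[1] + dy * scale)

# An object 3 m from the planned touchdown point forces a 9.3 m standoff.
print(shift_touchdown((0.0, 0.0), (3.0, 0.0)))   # -> approximately (-6.3, 0.0)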

Using the AACUS perception and processing kit, the AEH-1 autonomously lands between buildings in a simulated village. Credit: U.S. Navy/John F. Williams

The demonstrations at Quantico in December were the graduation exercise for the Office of Naval Research’s (ONR) Autonomous Aerial Cargo/Utility System (AACUS) program. The five-year science and technology effort has been the largest autonomy program in the Defense Department, says ONR program manager Dennis Baker, involving an industry team led by Aurora Flight Sciences, now part of Boeing.
AACUS was conceived in response to a Marine Corps urgent operational need to deliver cargo to unprepared landing zones, with no digital maps and no GPS, but with the ability to avoid obstacles en route and select safe landing points in confined spaces. AACUS was designed as an autonomy kit that could be fitted to any rotary-wing aircraft to turn it into an unmanned resupply or medevac platform.
“We have developed the technology ahead of the requirements. It is targeted at Marine Corps needs, but it is not unique to the Marine Corps,” says ONR Executive Director Walter Jones. “We have developed a great capability,” adds Lt. Gen. Robert Walsh, commander of Marine Corps Combat Development Command. “Now it is up to us to figure out how to use it.”
That determination is planned to involve a series of operational experiments over the next two years that will help define autonomy requirements for the Marine Corps’ MUX, a large multirole expeditionary unmanned aircraft that could emerge in the early 2020s as a joint program with the U.S. Army.
Beginning this spring with the Marine Corps Warfighting Laboratory’s Sea Dragon 2025 Phase 2 integrated training exercise at Twentynine Palms, California, these experiments will use the AACUS testbed helicopter, a Bell UH-1H modified by Aurora into the “autonomy-enabled” AEH-1.
Focused on testing logistics technology, the Sea Dragon exercise is to be followed by Naval Air Systems Command (Navair) demonstrations in August of sling-load operations and in October of shipboard landing operations. An Army/Navair demo planned for March 2019 will involve autonomous medical resupply and casualty evacuations—roles that could be performed by the future MUX.

The primary lidar sensor mounted in the nose scans for obstacles in flight and in the landing zone and continuously builds a 3D digital map. Credit: U.S. Navy/John F. Williams

The Marine Corps deployed two Lockheed Martin/Kaman K-Max unmanned cargo helicopters to Afghanistan in 2011-14 to resupply forward bases, learning valuable lessons about controlling such aircraft from the ground. “This is completely different,” says Walsh of AACUS. “This is where we want to go with autonomy.”
As a testbed for AACUS, the AEH-1 has been modified with a digital flight-control system. This uses a Rockwell Collins flight control computer to drive servo actuators attached to the helicopter’s existing mechanical controls under the cabin floor. Fitting AACUS to the elderly UH-1, after flying it in Boeing’s UH-6 Unmanned Little Bird and the Bell 206, showed the system is portable as planned.
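The article only describes the retrofit architecture: a flight-control computer driving servo actuators clamped onto the existing mechanical controls. A toy version of that command path, with invented gains and limits, gives a sense of the division of labor between the autonomy kit and the flight-control computer:

def cyclic_servo_deflection(commanded_roll_deg, measured_roll_deg,
                            gain=0.8, max_deflection_deg=10.0):
    """Toy proportional loop: the autonomy kit commands an attitude, and the
    flight-control computer turns the error into a bounded servo deflection
    on the existing mechanical linkage. Gain and limit values are invented."""
    deflection = gain * (commanded_roll_deg - measured_roll_deg)
    return max(-max_deflection_deg, min(max_deflection_deg, deflection))

print(cyclic_servo_deflection(5.0, 1.5))   # -> 2.8 deg of lateral-cyclic servo travel
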
The helicopter has lidar sensors on the nose, belly and tail boom to map the world around the aircraft. Near Earth Autonomy has worked with Aurora on the perception system. This includes cameras that use different wavelengths to characterize the landing zone, so the system knows whether it is solid ground or a lily pond. The sensors also can be used for visual odometry, a technique that uses camera images for navigating without GPS.
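Visual odometry itself is a well-established technique; a minimal frame-to-frame sketch using OpenCV (not AACUS code, and with invented camera intrinsics) shows the idea of recovering ego-motion from camera images when GPS is unavailable:

import cv2
import numpy as np

K = np.array([[700.0, 0.0, 320.0],               # hypothetical camera intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_pose(prev_gray, curr_gray):
    """Estimate rotation R and (unit-scale) translation t between two camera frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t    # translation direction only; absolute scale needs another sensor
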
The nose-mounted Velodyne lidar is the primary sensor, rotating in flight to scan through 270 deg. The data populates a 3D evidence grid made up of blocks marked as empty or occupied, based on lidar returns. Depending on the phase of flight, these blocks are bigger or smaller based on the obstacle resolution required. Approaching the destination, the forward sensor focuses more narrowly on the landing zone, to detect small objects. This enables the autonomy system to select a safe touchdown point as the helicopter approaches, without having to first fly over the landing zone.
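In code terms, the evidence grid described here behaves like a sparse voxel map whose cell size depends on the phase of flight. The sketch below is an illustration of that idea only; the cell sizes and the log-odds update values are assumptions, not AACUS parameters:

from collections import defaultdict

CELL_SIZE_M = {"cruise": 4.0, "approach": 1.0, "landing": 0.25}   # assumed resolutions

class EvidenceGrid:
    """Sparse 3D grid whose cells accumulate evidence of being occupied."""
    def __init__(self, phase):
        self.cell = CELL_SIZE_M[phase]
        self.log_odds = defaultdict(float)       # unvisited cells stay at 0 (unknown)

    def _index(self, xyz):
        return tuple(int(c // self.cell) for c in xyz)

    def add_return(self, xyz, hit=True):
        """Accumulate evidence from a lidar return (hit) or traversed free space (miss)."""
        self.log_odds[self._index(xyz)] += 0.9 if hit else -0.4

    def occupied(self, xyz, threshold=2.0):
        return self.log_odds[self._index(xyz)] > threshold

# Coarse cells en route, fine cells on the approach so small objects resolve.
grid = EvidenceGrid(phase="landing")
for _ in range(3):
    grid.add_return((12.3, 40.1, 0.2))           # repeated returns off a small object
print(grid.occupied((12.3, 40.1, 0.2)))          # -> True
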
The AEH-1 is optionally piloted, and flies in autonomous mode with a safety pilot on board. This has allowed rapid system-envelope expansion in a safe and efficient way, explains Fritz Langford, Aurora’s AACUS chief engineer. Optional piloting allowed the prototype to be focused on autonomy, not vehicle management, with the pilot operating the engine and able to take control of the helicopter “if the autonomy system doesn’t get it together,” he says.
A NASA-developed “highway in the sky” display in the cockpit tells the pilot what the autonomy system plans to do before it does it and why, Langford says. Equivalent situational-awareness displays on the ground station and tablet, which was developed with Kutta Technologies, keep the operator and user informed.
AACUS is designed to fly the helicopter “like a pilot,” says Baker. “A user can be trained in less than half a day to use an app on a tablet to get the aircraft to come to them, and it will fly to the landing zone like a pilot would so we don’t mess up their [way of operating] and put more burden [on the ground forces].” 
AACUS allows an autonomous helicopter to fly into a landing zone without any ground infrastructure, under the supervision of a user without any aviation knowledge, according to Baker.
Aurora’s approach to AACUS was to develop a platform-agnostic, portable mission kit with an open architecture that is scalable up or down to different aircraft sizes and speeds. “We made each piece of the system generic to enable reuse,” says Langford. The trajectory and route-planning algorithms, developed with Carnegie Mellon University, are portable between aircraft types. Sensors are modular, and the vehicle-performance model can be tailored to different aircraft by importing the flight manual.
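A small sketch of what "generic pieces" and a flight-manual-derived performance model can mean in practice: the planning code only sees an abstract set of performance numbers, and each airframe supplies its own. All values below are approximate or invented, purely for illustration:

from dataclasses import dataclass

@dataclass
class VehiclePerformanceModel:
    """The numbers the planner needs, imported per airframe from its flight manual."""
    cruise_speed_mps: float
    max_climb_rate_mps: float
    rotor_radius_m: float

def plan_leg_time_s(distance_m, climb_m, perf):
    """Generic planning code: it never knows which helicopter it is flying."""
    return distance_m / perf.cruise_speed_mps + max(climb_m, 0.0) / perf.max_climb_rate_mps

# The same planner fed two different airframe models (approximate figures).
uh1h = VehiclePerformanceModel(cruise_speed_mps=57.0, max_climb_rate_mps=8.0, rotor_radius_m=7.3)
bell206 = VehiclePerformanceModel(cruise_speed_mps=58.0, max_climb_rate_mps=6.5, rotor_radius_m=5.1)
print(plan_leg_time_s(5000.0, 60.0, uh1h))
print(plan_leg_time_s(5000.0, 60.0, bell206))
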
While one transition target for AACUS is the Marine Corps’ proposed MUX, the autonomy technology could be used as a pilot aid in any fly-by-wire helicopter, notes Baker, making operations in degraded visual environments safer. “It could automate flight control in any manned or unmanned aircraft,” he says. The Marine Corps already is working with the Army to test the perception system on a small Malloy Aeronautics cargo UAV to see how infantry units could use the capability organically.
The Marines’ interest in autonomous resupply is driven by its move to distributed operations and the need to support small units spread over a wide geographic area. “We need the ability to distribute quickly, and move logistics forward,” says Walsh. The autonomy capability developed under AACUS also could be used for other missions in support of small squads, including electronic warfare. “We’ve got to keep pushing and moving this technology forward,” he adds.
