Lanzisera JPL & SPAWAR Visit

JPL and SPAWAR Trip Report 13-14 January 2009 Steven Lanzisera

Summary

We spent one day at JPL and one day at SPAWAR (a naval robotics research center). At JPL we met with Larry Matthies (primarily a computer vision guy), and he's interested in making small UAVs land on particular targets after being launched from larger UAVs. He doesn't want to build a UAV if he doesn't have to, but he would love to work with our platforms to solve problems like that. At SPAWAR, we saw autonomous surface vessels, autonomous ATV-type vehicles, autonomous packbots, and autonomous helicopter-type things. They use a combination of LIDAR and stereo vision to do mapping, navigation, and collision avoidance. Autonomy there is well studied and at a high level of capability. What they really need are small sensors, improved low-power communication, and vehicles to augment their current capabilities. Surveillance is the primary area where they see our stuff being useful, although other areas may be great as well. SPAWAR is open to people coming down for summers (paid by them) if we're interested in doing that.

JPL – 13 January 2009

Anita, Ankur, and I met with a group of people from JPL to discuss the work we had been doing on small air vehicles and our vision for what we should do for MAST. I got to JPL a little early and got a brief tour from Hannah Goldberg of some of the activities at JPL she's worked on recently. Here's a list of things I saw on her tour:

• An inertial measurement unit testing facility that included a 2-axis rate table capable of sweeping the device under test across temperature while testing angular rate measurement. This facility can be aligned to the stars for an absolute reference using a variety of precision star alignment systems. The test structures are isolated from the building foundation to ensure ambient vibrations don’t impact measurement.

• A mockup of the Mars Science Lab (MSL) used for radar testing. MSL is the new Mars rover being developed. It's a full-size mockup made to have a radar signature like the MSL's in order to test the landing radar. Just neat to see how big it is.

• A microsatellite under development to do inspection of spacecraft. The device uses a structured-light system to map the surface it's looking at in 3D. A laser shines on a grating, which produces a 20x20 grid of points. By looking at the position of each point, the distance to that point can be calculated, generating a 20x20-pixel 3D map of the surface. Wide and narrow field-of-view cameras can also be used to visually observe features.
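As an aside, the depth calculation described above is standard structured-light triangulation: each projected dot's apparent shift in the camera image is inversely proportional to its range. The snippet below is a minimal, generic sketch of the idea; the baseline, focal length, and disparity values are made-up numbers, not the actual parameters of the JPL system.

```python
import numpy as np

# Minimal structured-light triangulation sketch (assumed geometry, not JPL's).
# A laser + grating projects a 20x20 grid of dots; a camera offset from the
# projector by a known baseline observes them. The horizontal shift (disparity)
# of each dot relative to its position at a reference depth gives range via
# similar triangles: depth = focal_length_px * baseline / disparity_px.

BASELINE_M = 0.10        # assumed projector-to-camera separation (meters)
FOCAL_PX = 800.0         # assumed camera focal length in pixels

def depth_map_from_disparity(disparity_px):
    """Convert a 20x20 array of dot disparities (pixels) to depths (meters)."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    # Guard against zero disparity (dot effectively at infinity / not detected).
    with np.errstate(divide="ignore"):
        depth = FOCAL_PX * BASELINE_M / disparity_px
    depth[~np.isfinite(depth)] = np.nan
    return depth

# Example: dots that shifted 40 px are twice as far away as dots that shifted 80 px.
disparities = np.full((20, 20), 40.0)
disparities[5:10, 5:10] = 80.0   # a closer patch on the inspected surface
print(depth_map_from_disparity(disparities))
```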

Then we met with Larry Matthies and the group of people he gathered. Anita started off with a summary of work done under the nano air vehicle program (the little 10 g flapper we worked on before starting MAST). The simulation group (Bob and Yoshi) was interested in getting some of the models developed for the aerodynamics, mechanics, and other components. They're also doing some work on tubular fuel cells that can be wrapped around a structure, giving a more conformal package and better use of space in a vehicle. Right now the diameter of an individual cell is ~1 cm, and the cells run at room temperature with simpler structures than standard cells. The power density is not great (as with most fuel cells), but the energy density is great.

We then talked about scenarios interesting to Larry. He hasn't thought much about indoor scenarios, but he is interested in making a small vehicle land on the corner of a building after being launched from a larger UAV. He's a computer vision guy, and he thinks that a decent vehicle with vision could be controlled to do that; we should follow up with him to figure out what he would need to do something like that. I suspect this is a tough problem, but it could be really cool. It would be possible to put the intense computation load on the "mother ship" and have much less on the vehicle. Anyway, Larry is interested in using computer vision in real systems to solve real problems, so he could be a good partner.

SPAWAR – 14 January 2009

The next day we went down to San Diego to visit the SPAWAR (Space and Naval Warfare Systems Command) robotics group, where we were treated to a full-day tour by the lab director, Bart Everett. SPAWAR is a technology integrator and demonstrator rather than a basic research center. They try to take cutting-edge technology and demonstrate advanced robots using a combination of commercial technology, custom hardware and software, and SPAWAR tinkering and elbow grease.

First we saw the autonomous surface vessels (tricked-out speed boats). These have full 3D LIDAR, stereo vision, radar, and a bunch of computers. They can map the scene in front of them in real time to do collision avoidance as well as track targets.

Next we saw the ground vehicles, including the packbots they work on to add autonomy, extra sensors, etc. One thing they've done is add a module that can eject radio relay nodes to extend the range of the robot from the base station. They've also done sensor fusion where robots can make decisions based on off-board sensor readings, like vibration sensors. We saw an autonomous packbot that was told to map a building: it came in, used its LIDAR to build a map, and when it was done it navigated its way back out and parked exactly where it started, using just the map it had built rather than GPS or other means (see the sketch at the end of this section).

We talked a lot about small robots, and durability was a primary issue. The problem is that soldiers are not interested in carefully loading their trucks, because they would prefer not to get shot. As a result, a robot that can't handle getting thrown into the truck with stuff piled on top of it isn't going to work. Additionally, extra packaging to make it durable won't be acceptable, because that space could be filled with ammo, armor, etc. Launching robots from the green zone is desirable to avoid this problem.

Bart and Co. were very interested in improved, smaller sensors (vibration, cameras, etc.) and improved wireless mesh networking of the sensors to improve lifetime and data throughput. Essentially, anything that pushes the sensor technology forward (along with small, robust platforms) is of interest. They're less interested in research into autonomy than they are in platforms and sensors. One application they were very excited about was sending a robot through a building to take photos of the entire building; they have software that can stitch the images together into a 3D map of the building (based on photostitch work out of U. Washington and Microsoft). They're interested in robot hierarchy (one robot launches another that launches a sensor), and they would like to see work in that area as well. There are videos on the wiki that show some of the demos they've done and the highly advanced capabilities of the robots developed at SPAWAR.
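The packbot's map-and-return demo rests on the kind of LIDAR-based occupancy-grid mapping that is standard in this space. The sketch below is a generic, heavily simplified illustration of that idea and is not SPAWAR's software; the grid size, cell resolution, scan format, and the decision to skip the free-space ray update are all simplifying assumptions on my part.

```python
import math
import numpy as np

# Generic occupancy-grid update from a 2D LIDAR scan (not SPAWAR's code).
# Cells hold log-odds of being occupied; each range return marks the beam's
# endpoint cell as occupied and (for brevity) skips the free-space ray update
# a real mapper would also perform.

RESOLUTION_M = 0.05                 # assumed 5 cm cells
GRID = np.zeros((400, 400))         # assumed 20 m x 20 m map, robot starts at center
L_OCC = 0.85                        # log-odds increment for an occupied cell

def integrate_scan(grid, pose, ranges, angles, max_range=10.0):
    """Fold one LIDAR scan into the grid. pose = (x, y, heading) in meters/radians."""
    x, y, theta = pose
    for r, a in zip(ranges, angles):
        if r >= max_range:          # no return: nothing to mark
            continue
        # Endpoint of the beam in world coordinates.
        ex = x + r * math.cos(theta + a)
        ey = y + r * math.sin(theta + a)
        # World coordinates -> grid indices, with the robot's start at the grid center.
        i = int(ey / RESOLUTION_M) + grid.shape[0] // 2
        j = int(ex / RESOLUTION_M) + grid.shape[1] // 2
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] += L_OCC
    return grid

# Example: a scan that sees a flat wall 2 m ahead across a 90-degree field of view.
angles = np.linspace(-math.pi / 4, math.pi / 4, 181)
ranges = 2.0 / np.cos(angles)       # wall perpendicular to the robot's heading
integrate_scan(GRID, pose=(0.0, 0.0, 0.0), ranges=ranges, angles=angles)
```

With free-space updates and a path planner added on top, a grid like this is enough for a robot to find its way back to where it started without GPS, which matches what we saw in the demo.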
