Ingenuity Mars helicopter

Mars helicopter Ingenuity aborted latest flight attempt because of anomaly

NASA's Mars helicopter Ingenuity didn't get off the ground as planned earlier this month.

Ingenuity was scheduled to make its 14th Red Planet sortie on Sept. 18, a relatively short and simple hop that would have demonstrated the little chopper's ability to fly with slightly higher rotor speeds — 2,700 revolutions per minute (RPM) rather than the usual 2,537 RPM.

The mission team is making this adjustment to deal with the Martian atmosphere, which is thinning out slightly as the seasons change on the floor of the Red Planet's Jezero Crater, Jaakko Karras, Ingenuity deputy operations lead at NASA's Jet Propulsion Laboratory in Southern California, wrote in an update Tuesday (Sept. 28).

Ingenuity performed a high-speed rotation test on Sept. 15, spinning its blades at 2,800 RPM for a spell while it remained on the ground. Everything went well, paving the way for the Sept. 18 flight. But the 4-pound (1.8 kilograms) chopper did not end up taking off that day.

"Here's what happened: Ingenuity detected an anomaly in two of the small flight-control servo motors (or simply 'servos') during its automatic pre-flight checkout and did exactly what it was supposed to do: It canceled the flight," Karras wrote.

Ingenuity has six servos, three for each of its two rotors. The little motors adjust the pitch of the rotors, allowing the chopper to control its orientation and position during flight.

"The servo motors are much smaller than the motors that spin the rotors, but they do a tremendous amount of work and are critical to stable, controlled flight," Karras wrote.

Analysis of the Sept. 18 preflight test has shown that two of Ingenuity's servos oscillated slightly during the "servo wiggle" checkout. The team is still trying to determine the cause, but it may be due to increasing wear in the servo gearboxes and linkages, Karras wrote. (Ingenuity is a technology demonstrator that was originally supposed to make just five flights on the Red Planet.)

Ingenuity passed two additional servo wiggle tests on Sept. 21 and Sept. 23, however, "so the issue isn’t entirely repeatable," Karras wrote. "We have a number of tools available for working through the anomaly, and we're optimistic that we'll get past it and back to flying again soon."

But orbital dynamics will keep Ingenuity grounded for a couple more weeks at least. Mars is now in "solar conjunction," meaning it's on the other side of the sun from Earth. Our star can corrupt and otherwise interfere with communications sent between the two planets, so NASA has stopped sending commands to Ingenuity and its other Red Planet robots — including Ingenuity's much larger partner, the Perseverance rover — until mid-October, when Mars will come more clearly into view.

"Ingenuity will not be completely idle during this time, however; Ingenuity and Perseverance will be configured to keep each other company by communicating roughly once a week, with Ingenuity sending basic system health information to its base station on Perseverance," Karras wrote. "We will receive this data on Earth once we come out of conjunction, and will learn how Ingenuity performs over an extended period of relative inactivity on Mars. See you on the other side of conjunction!"

Source: https://www.space.com/mars-helicopter-ingenuity-flight-14-abort

JPL's Plan for the Next Mars Helicopter

The ability to make decisions autonomously is not just what makes robots useful, it's what makes robots robots. We value robots for their ability to sense what's going on around them, make decisions based on that information, and then take useful actions without our input. In the past, robotic decision making followed highly structured rules—if you sense this, then do that. In structured environments like factories, this works well enough. But in chaotic, unfamiliar, or poorly defined settings, reliance on rules makes robots notoriously bad at dealing with anything that could not be precisely predicted and planned for in advance.

RoMan, along with many other robots including home vacuums, drones, and autonomous cars, handles the challenges of semistructured environments through artificial neural networks—a computing approach that loosely mimics the structure of neurons in biological brains. About a decade ago, artificial neural networks began to be applied to a wide variety of semistructured data that had previously been very difficult for computers running rules-based programming (generally referred to as symbolic reasoning) to interpret. Rather than recognizing specific data structures, an artificial neural network is able to recognize data patterns, identifying novel data that are similar (but not identical) to data that the network has encountered before. Indeed, part of the appeal of artificial neural networks is that they are trained by example, by letting the network ingest annotated data and learn its own system of pattern recognition. For neural networks with multiple layers of abstraction, this technique is called deep learning.
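To make the "training by example" idea concrete, the toy sketch below fits a tiny two-layer neural network to labeled 2-D points using plain NumPy and full-batch gradient descent. It is purely illustrative and bears no relation to the software running on RoMan or any ARL system; every value in it is an assumption chosen for the demo.

```python
# A toy illustration of "training by example": a tiny two-layer neural network
# learns to separate two clusters of labeled 2-D points using plain NumPy and
# full-batch gradient descent. Purely illustrative; all values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Annotated training data: two Gaussian blobs labeled 0 and 1.
X = np.vstack([rng.normal(-1.0, 0.5, (100, 2)), rng.normal(1.0, 0.5, (100, 2))])
y = np.hstack([np.zeros(100), np.ones(100)])

# One hidden layer of 8 tanh units followed by a sigmoid output unit.
W1, b1 = rng.normal(0.0, 0.5, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0.0, 0.5, (8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass (gradients of the mean cross-entropy loss).
    d_out = (p - y)[:, None] / len(y)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"training accuracy: {((p > 0.5) == y).mean():.2f}")  # near 1.0 on this toy data
```

The network never sees a rule for separating the two classes; it only sees annotated examples and adjusts its own weights until its predictions match the labels, which is the essence of the approach described above.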

Even though humans are typically involved in the training process, and even though artificial neural networks were inspired by the neural networks in human brains, the kind of pattern recognition a deep learning system does is fundamentally different from the way humans see the world. It's often nearly impossible to understand the relationship between the data input into the system and the interpretation of the data that the system outputs. And that difference—the "black box" opacity of deep learning—poses a potential problem for robots like RoMan and for the Army Research Lab.

This opacity means that robots that rely on deep learning have to be used carefully. A deep-learning system is good at recognizing patterns, but lacks the world understanding that a human typically uses to make decisions, which is why such systems do best when their applications are well defined and narrow in scope. "When you have well-structured inputs and outputs, and you can encapsulate your problem in that kind of relationship, I think deep learning does very well," says Tom Howard, who directs the University of Rochester's Robotics and Artificial Intelligence Laboratory and has developed natural-language interaction algorithms for RoMan and other ground robots. "The question when programming an intelligent robot is, at what practical size do those deep-learning building blocks exist?" Howard explains that when you apply deep learning to higher-level problems, the number of possible inputs becomes very large, and solving problems at that scale can be challenging. And the potential consequences of unexpected or unexplainable behavior are much more significant when that behavior is manifested through a 170-kilogram two-armed military robot.

After a couple of minutes, RoMan hasn't moved—it's still sitting there, pondering the tree branch, arms poised like a praying mantis. For the last 10 years, the Army Research Lab's Robotics Collaborative Technology Alliance (RCTA) has been working with roboticists from Carnegie Mellon University, Florida State University, General Dynamics Land Systems, JPL, MIT, QinetiQ North America, University of Central Florida, the University of Pennsylvania, and other top research institutions to develop robot autonomy for use in future ground-combat vehicles. RoMan is one part of that process.

The "go clear a path" task that RoMan is slowly thinking through is difficult for a robot because the task is so abstract. RoMan needs to identify objects that might be blocking the path, reason about the physical properties of those objects, figure out how to grasp them and what kind of manipulation technique might be best to apply (like pushing, pulling, or lifting), and then make it happen. That's a lot of steps and a lot of unknowns for a robot with a limited understanding of the world.

This limited understanding is where the ARL robots begin to differ from other robots that rely on deep learning, says Ethan Stump, chief scientist of the AI for Maneuver and Mobility program at ARL. "The Army can be called upon to operate basically anywhere in the world. We do not have a mechanism for collecting data in all the different domains in which we might be operating. We may be deployed to some unknown forest on the other side of the world, but we'll be expected to perform just as well as we would in our own backyard," he says. Most deep-learning systems function reliably only within the domains and environments in which they've been trained. Even if the domain is something like "every drivable road in San Francisco," the robot will do fine, because that's a data set that has already been collected. But, Stump says, that's not an option for the military. If an Army deep-learning system doesn't perform well, they can't simply solve the problem by collecting more data.

ARL's robots also need to have a broad awareness of what they're doing. "In a standard operations order for a mission, you have goals, constraints, a paragraph on the commander's intent—basically a narrative of the purpose of the mission—which provides contextual info that humans can interpret and gives them the structure for when they need to make decisions and when they need to improvise," Stump explains. In other words, RoMan may need to clear a path quickly, or it may need to clear a path quietly, depending on the mission's broader objectives. That's a big ask for even the most advanced robot. "I can't think of a deep-learning approach that can deal with this kind of information," Stump says.

While I watch, RoMan is reset for a second try at branch removal. ARL's approach to autonomy is modular, where deep learning is combined with other techniques, and the robot is helping ARL figure out which tasks are appropriate for which techniques. At the moment, RoMan is testing two different ways of identifying objects from 3D sensor data: UPenn's approach is deep-learning-based, while Carnegie Mellon is using a method called perception through search, which relies on a more traditional database of 3D models. Perception through search works only if you know exactly which objects you're looking for in advance, but training is much faster since you need only a single model per object. It can also be more accurate when perception of the object is difficult—if the object is partially hidden or upside-down, for example. ARL is testing these strategies to determine which is the most versatile and effective, letting them run simultaneously and compete against each other.

Perception is one of the things that deep learning tends to excel at. "The computer vision community has made crazy progress using deep learning for this stuff," says Maggie Wigness, a computer scientist at ARL. "We've had good success with some of these models that were trained in one environment generalizing to a new environment, and we intend to keep using deep learning for these sorts of tasks, because it's the state of the art."

ARL's modular approach might combine several techniques in ways that leverage their particular strengths. For example, a perception system that uses deep-learning-based vision to classify terrain could work alongside an autonomous driving system based on an approach called inverse reinforcement learning, where the model can rapidly be created or refined by observations from human soldiers. Traditional reinforcement learning optimizes a solution based on established reward functions, and is often applied when you're not necessarily sure what optimal behavior looks like. This is less of a concern for the Army, which can generally assume that well-trained humans will be nearby to show a robot the right way to do things. "When we deploy these robots, things can change very quickly," Wigness says. "So we wanted a technique where we could have a soldier intervene, and with just a few examples from a user in the field, we can update the system if we need a new behavior." A deep-learning technique would require "a lot more data and time," she says.

It's not just data-sparse problems and fast adaptation that deep learning struggles with. There are also questions of robustness, explainability, and safety. "These questions aren't unique to the military," says Stump, "but it's especially important when we're talking about systems that may incorporate lethality." To be clear, ARL is not currently working on lethal autonomous weapons systems, but the lab is helping to lay the groundwork for autonomous systems in the U.S. military more broadly, which means considering ways in which such systems may be used in the future.

Safety is an obvious priority, and yet there isn't a clear way of making a deep-learning system verifiably safe, according to Stump. "Doing deep learning with safety constraints is a major research effort. It's hard to add those constraints into the system, because you don't know where the constraints already in the system came from. So when the mission changes, or the context changes, it's hard to deal with that. It's not even a data question; it's an architecture question." ARL's modular architecture, whether it's a perception module that uses deep learning or an autonomous driving module that uses inverse reinforcement learning or something else, can form parts of a broader autonomous system that incorporates the kinds of safety and adaptability that the military requires. Other modules in the system can operate at a higher level, using different techniques that are more verifiable or explainable and that can step in to protect the overall system from adverse unpredictable behaviors. "If other information comes in and changes what we need to do, there's a hierarchy there," Stump says. "It all happens in a rational way."

Nicholas Roy, who leads the Robust Robotics Group at MIT and describes himself as "somewhat of a rabble-rouser" due to his skepticism of some of the claims made about the power of deep learning, agrees with the ARL roboticists that deep-learning approaches often can't handle the kinds of challenges that the Army has to be prepared for. "The Army is always entering new environments, and the adversary is always going to be trying to change the environment so that the training process the robots went through simply won't match what they're seeing," Roy says. "So the requirements of a deep network are to a large extent misaligned with the requirements of an Army mission, and that's a problem."

Roy, who has worked on abstract reasoning for ground robots as part of the RCTA, emphasizes that deep learning is a useful technology when applied to problems with clear functional relationships, but when you start looking at abstract concepts, it's not clear whether deep learning is a viable approach. "I'm very interested in finding how neural networks and deep learning could be assembled in a way that supports higher-level reasoning," Roy says. "I think it comes down to the notion of combining multiple low-level neural networks to express higher level concepts, and I do not believe that we understand how to do that yet." Roy gives the example of using two separate neural networks, one to detect objects that are cars and the other to detect objects that are red. It's harder to combine those two networks into one larger network that detects red cars than it would be if you were using a symbolic reasoning system based on structured rules with logical relationships. "Lots of people are working on this, but I haven't seen a real success that drives abstract reasoning of this kind."

For the foreseeable future, ARL is making sure that its autonomous systems are safe and robust by keeping humans around for both higher-level reasoning and occasional low-level advice. Humans might not be directly in the loop at all times, but the idea is that humans and robots are more effective when working together as a team. When the most recent phase of the Robotics Collaborative Technology Alliance program began in 2009, Stump says, "we'd already had many years of being in Iraq and Afghanistan, where robots were often used as tools. We've been trying to figure out what we can do to transition robots from tools to acting more as teammates within the squad."

RoMan gets a little bit of help when a human supervisor points out a region of the branch where grasping might be most effective. The robot doesn't have any fundamental knowledge about what a tree branch actually is, and this lack of world knowledge (what we think of as common sense) is a fundamental problem with autonomous systems of all kinds. Having a human leverage our vast experience into a small amount of guidance can make RoMan's job much easier. And indeed, this time RoMan manages to successfully grasp the branch and noisily haul it across the room.

Turning a robot into a good teammate can be difficult, because it can be tricky to find the right amount of autonomy. Too little and it would take most or all of the focus of one human to manage one robot, which may be appropriate in special situations like explosive-ordnance disposal but is otherwise not efficient. Too much autonomy and you'd start to have issues with trust, safety, and explainability.

"I think the level that we're looking for here is for robots to operate on the level of working dogs," explains Stump. "They understand exactly what we need them to do in limited circumstances, they have a small amount of flexibility and creativity if they are faced with novel circumstances, but we don't expect them to do creative problem-solving. And if they need help, they fall back on us."

RoMan is not likely to find itself out in the field on a mission anytime soon, even as part of a team with humans. It's very much a research platform. But the software being developed for RoMan and other robots at ARL, called Adaptive Planner Parameter Learning (APPL), will likely be used first in autonomous driving, and later in more complex robotic systems that could include mobile manipulators like RoMan. APPL combines different machine-learning techniques (including inverse reinforcement learning and deep learning) arranged hierarchically underneath classical autonomous navigation systems. That allows high-level goals and constraints to be applied on top of lower-level programming. Humans can use teleoperated demonstrations, corrective interventions, and evaluative feedback to help robots adjust to new environments, while the robots can use unsupervised reinforcement learning to adjust their behavior parameters on the fly. The result is an autonomy system that can enjoy many of the benefits of machine learning, while also providing the kind of safety and explainability that the Army needs. With APPL, a learning-based system like RoMan can operate in predictable ways even under uncertainty, falling back on human tuning or human demonstration if it ends up in an environment that's too different from what it trained on.

It's tempting to look at the rapid progress of commercial and industrial autonomous systems (autonomous cars being just one example) and wonder why the Army seems to be somewhat behind the state of the art. But as Stump finds himself having to explain to Army generals, when it comes to autonomous systems, "there are lots of hard problems, but industry's hard problems are different from the Army's hard problems." The Army doesn't have the luxury of operating its robots in structured environments with lots of data, which is why ARL has put so much effort into APPL, and into maintaining a place for humans. Going forward, humans are likely to remain a key part of the autonomous framework that ARL is developing. "That's what we're trying to build with our robotics systems," Stump says. "That's our bumper sticker: 'From tools to teammates.' "

This article appears in the October 2021 print issue as "Deep Learning Goes to Boot Camp."

Source: https://spectrum.ieee.org/the-next-mars-helicopter

NASA's Ingenuity Mars helicopter spots Perseverance rover from above - but can you?

NASA's Ingenuity Mars Helicopter has completed its 11th flight on the planet, capturing dozens of images including one in which its mothership - the Perseverance rover - is almost impossible to spot.

The helicopter took photographs of boulders, sand dunes and rocky outcrops across the South Seitah region of the Jezero Crater, the location of an ancient river delta where NASA hopes it may find the remnants of microbial life.

"Ingenuity's aerial images are awesome - but even better when you get to play 'Where's Perseverance?' with them," said Robert Hogg, a senior systems engineer at NASA's Jet Propulsion Lab.

He said that "once you find our rover and zoom in, you can make out some details, like the wheels, remote sensing mast, and the MMRTG" - the Multi-Mission Radioisotope Thermoelectric Generator, its power source - "on the aft end".

The rover is a bright white speck from about 1,600 feet (500m) away and 39 feet (12m) up.

Ingenuity captured the Perseverance rover in an image taken during its 11th flight at Mars on Aug. 4. Credits: NASA/JPL-Caltech

"The laws of physics may say it's near impossible to fly on Mars, but actually flying a heavier-than-air vehicle on the red planet is much harder than that," the space agency had quipped about the Ingenuity mission.

The little helicopter's 11th flight since its maiden voyage back in April was designed to keep it ahead of the rover, flying about 11 mph (five metres per second) to capture images of intriguing geologic features.

Ingenuity works autonomously and cannot be controlled by NASA due to the distance between Earth and Mars.

It takes more than 11 minutes to transmit a radio signal 287 million kilometres (178 million miles) back to Earth - while the most recent flight took only 130 seconds.
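As a rough check of that delay, the snippet below computes the one-way light-travel time for the Earth-Mars distance quoted above. The true distance between the planets varies continuously, so treat this as an illustration rather than a mission figure.

```python
# Back-of-the-envelope check of the one-way radio delay, using the Earth-Mars
# distance quoted in the article above. The separation changes constantly, so
# this is illustrative only.
SPEED_OF_LIGHT_KM_S = 299_792.458      # km/s, exact by definition
distance_km = 287e6                    # Earth-Mars distance quoted above, in km

one_way_delay_s = distance_km / SPEED_OF_LIGHT_KM_S
print(f"one-way signal delay: {one_way_delay_s / 60:.1f} minutes")
# At this separation the delay is roughly a quarter of an hour each way - far
# longer than the 130-second flight itself, which is why Ingenuity must fly
# autonomously rather than being flown in real time from Earth.
```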

It follows Perseverance failing in its first attempt to collect a rock sample from Mars as part of the search for signs of ancient life on the planet.

Nasa said on Wednesday that this was because the rock was unusually soft, and so not strong enough to produce a sample.

This image taken by NASA's Perseverance rover on Aug. 6 shows that sample collection tube No. 233 is empty. It is one of the pieces of data sent to Earth by Perseverance showing that the rover did not collect any Martian rock during its first attempt to core a sample. Credits: NASA/JPL-Caltech.

The rover is equipped with a two-metre-long robot arm which has a hollow coring bit and a percussive drill at the end of it to extract samples from beneath the Martian surface.

About half a kilogram of rock and soil samples is intended to be cached in large titanium tubes that the rover will leave on the planet, to be collected by a yet-to-be-confirmed future mission.

Source: https://news.sky.com/story/nasas-ingenuity-mars-helicopter-spots-perseverance-rover-from-above-but-can-you-12380241
Watch the Ingenuity helicopter's first flight on Mars
Date | Title | Channel

May 20, 2021 | Space Cameras: A Sharper Image | NASA JPL
Channels that carried the live broadcast include: YouTube and Facebook.
Speakers from JPL:
- Dr. Justin Maki, Imaging Scientist and Deputy Principal Investigator for Mastcam-Z on the Perseverance rover
- Hallie Abarca, Mars 2020/Perseverance Image and Data Processing Operations Lead
Replay on YouTube

May 4, 2021 | Mars Helicopter and the Future of Extraterrestrial Flight | NASA
Replay on YouTube

April 30, 2021 | News Briefing: Next Steps for Mars Helicopter | NASA JPL
Channels that carried the live broadcast include: YouTube and NASA App.
Participants are:
- Lori Glaze, planetary science director at NASA Headquarters, Washington
- MiMi Aung, Ingenuity Mars Helicopter project manager at JPL
- Bob Balaram, Ingenuity Mars Helicopter chief engineer at JPL
- Jennifer Trosper, Perseverance rover deputy project manager at JPL
- Ken Farley, Perseverance project scientist at Caltech
Replay on YouTube

April 29, 2021 | Taking Flight: How Girls Can Grow Up to be Engineers - Get Your Ideas Off the Ground! | NASA JPL
Speakers from JPL:
- MiMi Aung, Ingenuity Mars Helicopter project manager
- Jessica Samuels, surface systems manager for the Mars 2020 mission
- Priyanka Sharma, systems engineer for NISAR and president of JPL's Advisory Council for Women
Replay Webinar

April 22, 2021 | Taking Flight: How Girls Can Grow Up to be Engineers - Internships and Other Opportunities | NASA JPL
Speakers from JPL:
- Vandi Verma, chief engineer, robotic operations for the Mars 2020 mission
- Jessica Gonzales, software systems engineer
- Ota Lutz, elementary and secondary education lead
- Leslie Lowes, STEM informal education specialist
Replay Webinar

April 19, 2021 | News Briefing: Mars Helicopter Post-Flight | NASA
Channels that carried the live broadcast include: YouTube and Facebook.
Participants are:
- Thomas Zurbuchen, associate administrator of NASA's Science Mission Directorate
- Michael Watkins, JPL director
- MiMi Aung, Ingenuity Mars Helicopter project manager at JPL
- Bob Balaram, Ingenuity Mars Helicopter chief engineer at JPL
- Håvard Grip, Ingenuity Mars Helicopter chief pilot at JPL
- Justin Maki, Perseverance Mars rover imaging scientist and deputy principal investigator of Mastcam-Z instrument at JPL
Replay on YouTube

April 19, 2021 | Live Broadcast: Mars Helicopter First Test Flight Results | NASA
Channels that carried the live broadcast include: YouTube, Twitter, Facebook, Twitch, and NASA App.
Replay on YouTube

April 15, 2021 | Taking Flight: How Girls Can Grow Up to be Engineers - Chart Your Path! | NASA JPL
Speakers from JPL:
- Kim Steadman, systems engineer
- Nagin Cox, engineering operations deputy team chief
- Samantha Hatch, human resources specialist
Replay Webinar

April 9, 2021 | News Briefing: Mars Helicopter Pre-Flight | NASA JPL
Channels that carried the broadcast include: YouTube and Facebook.
Replay on YouTube

April 8, 2021 | Experts Discuss NASA's Mars Helicopter - Talk for Students | NASA / JPL Edu
Replay on YouTube

April 5, 2021 | Month of Ingenuity - Helicopter Flight Preview | NASA JPL
Replay Webinar

April 5, 2021 | Mars Helicopter Live Q&A: One Step Closer to First Flight | NASA JPL
Replay on YouTube

March 23, 2021 | News Briefing: Preview First Mars Helicopter Flights | NASA JPL
Read news release ›
Replay on YouTube

March 11, 2021 | Public Talk: Helicopters in Space | NASA JPL
Replay on YouTube

March 8, 2021 | Taking Flight: How Girls Can Grow Up to be Engineers | NASA JPL
Replay Webinar

July 28, 2020 | Mission Tech and Humans to Mars Briefing | NASA KSC-JPL
Replay on YouTube

April 29, 2020 | Meet Ingenuity, the Mars Helicopter | NASA JPL
Speaker: MiMi Aung, Mars Helicopter Project Manager
Replay on YouTube
Source: https://mars.nasa.gov/technology/helicopter/

Mars helicopter Ingenuity

Ingenuity Mars Helicopter spots Perseverance from above

Can you see NASA's newest rover in this picture from Jezero Crater?

NASA's Ingenuity Mars Helicopter recently completed its 11th flight at the Red Planet, snapping multiple photographs during its trip. Along with capturing the boulders, sand dunes, and rocky outcrops prevalent in the "South Séítah" region of Jezero Crater, a few of the images capture NASA's Perseverance rover amid its first science campaign.

Ingenuity began as a technological demonstration to prove that powered, controlled flight on Mars is possible. It is now an operations demonstration intended to investigate how a rotorcraft can add an aerial dimension to missions like Perseverance, scouting possible areas of scientific interest and offering detailed views of nearby areas too hazardous for the rover to explore.

"Ingenuity's aerial images are awesome—but even better when you get to play 'Where's Perseverance?' with them," said Robert Hogg. "Once you find our rover and zoom in, you can make out some details, like the wheels, remote sensing mast, and the MMRTG"—the Multi-Mission Radioisotope Thermoelectric Generator—"on the aft end."

Ingenuity Mars Helicopter spots Perseverance from above

So where is Perseverance? At the bottom center of the image, you can find Ingenuity's shadow. From there, go straight up. Just beyond South Séítah's dune field near the top of the image and just to the right of center is a bright white speck. That's what a Mars rover looks like from about 1,600 feet (500 meters) away and 39 feet (12 meters) up.

Flight 11 was essentially designed to keep Ingenuity ahead of the rover, allowing it to continue to support Perseverance's science goals by photographing intriguing geologic features from the air. Flying north-by-northwest at 11 mph (five meters per second), it took Ingenuity 130.9 seconds to make the trip to its 8th airfield. From this new staging area, the helicopter is scheduled to make at least one reconnaissance flight of the geologically intriguing South Séítah area.



Provided by Jet Propulsion Laboratory

Citation: Ingenuity Mars Helicopter spots Perseverance from above (2021, August 11) retrieved 15 October 2021 from https://phys.org/news/2021-08-ingenuity-mars-helicopter-perseverance.html

Source: https://phys.org/news/2021-08-ingenuity-mars-helicopter-perseverance.html

NASA Ingenuity Mars Helicopter Above

2,800 RPM Spin a Success, but Flight 14 Delayed to Post Conjunction

It’s been an eventful several Martian days, or sols, since our last blog post, so we wanted to provide everyone with an update on where things stand on Mars. In our last post, we explained that we were getting ready to begin flying with a higher rotor speed to compensate for decreasing atmospheric density caused by seasonal changes on Mars. Increasing the rotor speed is a significant change to how we’ve been flying thus far, so we wanted to proceed forward carefully. Step one was to perform a high-speed spin test at 2,800 rpm on the ground and, if everything went well, step two was to perform a short-duration flight, briefly hovering over our current location, with a 2,700 rpm rotor speed.

The high-speed spin test was completed successfully on September 15, 2021 at 23:29 PDT, 11:11 LMST local Mars time (Sol 204 of the Perseverance mission). Ingenuity’s motors spun the rotors up to 2,800 rpm, briefly held that speed, and then spun the rotors back down to a stop, all exactly as sequenced for the test. All other subsystems performed flawlessly. Of particular interest was determining whether the higher rotor speeds cause resonances (vibrations) in Ingenuity’s structure. Resonances are a common challenge in aerial rotorcraft and can cause problems with sensing and control, and can also lead to mechanical damage. Fortunately, the data from this latest high-speed spin showed no resonances at the higher rotor rpm’s. The successful high-speed spin was an exciting achievement for Ingenuity and gave us the green light to proceed to a test flight with a 2,700 rpm rotor speed.

Ingenuity's Upper Swashplate Assembly

Ingenuity’s Upper Swashplate Assembly: The upper swashplate of NASA’s Ingenuity Mars Helicopter controls the pitch of the upper rotor blades as they rotate and is critical to stable, controlled flight. The swashplate is driven by three small servo motors. Credit: NASA/JPL-Caltech.

The test flight was scheduled to take place on September 18, 2021 (Sol 206) and was supposed to be a brief hover flight at 16 feet (5 meters) altitude with a 2,700 rpm rotor speed. It turned out to be an uneventful flight, because Ingenuity decided to not take off. Here’s what happened: Ingenuity detected an anomaly in two of the small flight-control servo motors (or simply “servos”) during its automatic pre-flight checkout and did exactly what it was supposed to do: It canceled the flight.

Ingenuity controls its position and orientation during flight by adjusting the pitch of each of the four rotor blades as they spin around the mast. Blade pitch is adjusted through a swashplate mechanism, which is actuated by servos. Each rotor has its own independently controlled swashplate, and each swashplate is actuated by three servos, so Ingenuity has six servos in total. The servo motors are much smaller than the motors that spin the rotors, but they do a tremendous amount of work and are critical to stable, controlled flight. Because of their criticality, Ingenuity performs an automated check on the servos before every flight. This self-test drives the six servos through a sequence of steps over their range of motion and verifies that they reach their commanded positions after each step. We affectionately refer to the Ingenuity servo self-test as the “servo wiggle.”

The data from the anomalous pre-flight servo wiggle shows that two of the upper rotor swashplate servos – servos 1 and 2 – began to oscillate with an amplitude of approximately 1 degree about their commanded positions just after the second step of the sequence. Ingenuity’s software detected this oscillation and promptly canceled the self-test and flight.
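The sketch below captures the general shape of that pre-flight check as described here: step each of the six servos through a short sequence of commanded positions and cancel the flight if any servo oscillates about its target by more than a small tolerance. It is a schematic illustration only, not Ingenuity's flight software, and the function names, step values and threshold are assumptions.

```python
# Schematic sketch (not Ingenuity's actual flight software) of a pre-flight
# "servo wiggle" check: step each servo through a sequence of commanded
# positions and abort if any servo oscillates about its target beyond a small
# tolerance. All names, step values and thresholds are illustrative.
SERVO_IDS = range(6)                      # three servos per swashplate, two swashplates
WIGGLE_STEPS_DEG = [0.0, 5.0, 10.0, 5.0, 0.0, -5.0, -10.0, -5.0, 0.0]
OSCILLATION_LIMIT_DEG = 1.0               # the observed anomaly showed ~1 degree oscillations

def servo_wiggle_check(command_servo, read_servo_angle):
    """Return True if all servos settle at each commanded step, False otherwise.

    command_servo(servo_id, angle_deg) and read_servo_angle(servo_id) are
    hypothetical hardware-interface callbacks supplied by the caller.
    """
    for step in WIGGLE_STEPS_DEG:
        for servo in SERVO_IDS:
            command_servo(servo, step)
        for servo in SERVO_IDS:
            # Sample the servo a few times; any spread beyond the limit counts
            # as an oscillation about the commanded position.
            samples = [read_servo_angle(servo) for _ in range(10)]
            error = max(abs(s - step) for s in samples)
            if error > OSCILLATION_LIMIT_DEG:
                print(f"servo {servo} oscillating ({error:.2f} deg); canceling flight")
                return False
    return True
```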

Our team is still looking into the anomaly. To gather more data, we had Ingenuity execute additional servo wiggle tests during the past week, with one wiggle test on September 21, 2021 (Sol 209) and one on September 23, 2021 (Sol 211). Both of the wiggle tests ran successfully, so the issue isn’t entirely repeatable.

One theory for what’s happening is that moving parts in the servo gearboxes and swashplate linkages are beginning to show some wear now that Ingenuity has flown well over twice as many flights as originally planned (13 completed versus five planned). Wear in these moving parts would cause increased clearances and increased looseness, and could explain servo oscillation. Another theory is that the high-speed spin test left the upper rotor at a position that loads servos 1 and 2 in a unique, oscillation-inducing way that we haven’t encountered before. We have a number of tools available for working through the anomaly and we’re optimistic that we’ll get past it and back to flying again soon.

Our team will have a few weeks of time to complete our analysis because Mars will be in solar conjunction until mid-October, and we won’t be uplinking any command sequences to Ingenuity during that time. Conjunction is a special period in which Mars moves behind the Sun (as seen from Earth), making communications with spacecraft on Mars unreliable. Ingenuity will not be completely idle during this time, however; Ingenuity and Perseverance will be configured to keep each other company by communicating roughly once a week, with Ingenuity sending basic system health information to its base station on Perseverance. We will receive this data on Earth once we come out of conjunction, and will learn how Ingenuity performs over an extended period of relative inactivity on Mars. See you on the other side of conjunction!

Written by Jaakko Karras, Ingenuity Mars Helicopter Deputy Operations Lead at NASA’s Jet Propulsion Laboratory

Source: https://scitechdaily.com/next-ingenuity-mars-helicopter-flight-delayed-until-after-conjunction-heres-what-went-wrong/

Ingenuity (helicopter)

NASA helicopter on the Mars 2020 mission

"Mars helicopter" redirects here. For Mars aircraft in general, see Mars aircraft.

"Wright Brothers Field" redirects here. For other uses, see Wright Field (disambiguation).

Ingenuity is a small robotic helicopter operating on Mars as part of NASA's Mars 2020 mission. On April 19, 2021, it successfully completed the first powered controlled flight by an aircraft on a planet besides Earth, taking off vertically, hovering and landing for a flight duration of 39.1 seconds.[11][12][13]

Ingenuity was designed and built by NASA's Jet Propulsion Laboratory (JPL). Other contributors include NASA Ames Research Center, NASA Langley Research Center,[14] AeroVironment, Inc., SolAero, and Lockheed Martin Space.[15] The helicopter had made 13 successful flights as of September 5, 2021.

Ingenuity is operated by solar-charged batteries that power dual counter-rotating rotors mounted one above the other. During its 30-day technology demonstration, Ingenuity was intended to fly up to five times at altitudes ranging from 3 to 5 m (10 to 16 ft) above the ground for up to 90 seconds each.[1][16] The expected lateral range was exceeded in the third flight, and the flight duration was exceeded in the fourth flight. With those technical successes, Ingenuity achieved its original objectives. The flights proved the helicopter's ability to fly in the extremely thin atmosphere of another planet over a hundred million miles from Earth without direct human control. Ingenuity operates autonomously, performing maneuvers planned, scripted and transmitted to it by JPL.

After the brief demonstration phase, JPL then began more flights as operational demonstrations, to show how aerial scouting can benefit future exploration of Mars and other worlds.[17][18] In its operational role, Ingenuity is observing areas of interest for possible examination by the Perseverance rover.[19][20][1][21]

Ingenuity travelled to Mars attached to the underside of Perseverance, which touched down at the Octavia E. Butler Landing site in Jezero crater on February 18, 2021.[6][7][8] The helicopter was deployed to the surface on April 3, 2021,[22][23] and Perseverance drove approximately 100 m (330 ft) away to allow the drone a safe "buffer zone" in which it made its first flight.[24] Success was confirmed three hours later in a livestreaming TV feed of JPL mission control.[26][27][28] On its fourth flight, April 30, 2021, Ingenuity became the first interplanetary spacecraft whose sound was recorded by another interplanetary spacecraft, Perseverance.[29]

Ingenuity carries a piece of fabric from the wing of the 1903 Wright Flyer, the Wright Brothers' airplane used in the first controlled powered heavier-than-air flight on Earth. The initial take-off and landing area for Ingenuity is named Wright Brothers Field as a tribute.[30] Before Ingenuity, the first flight of any kind on a planet beyond Earth was an unpowered balloon flight on Venus, by the Soviet Vega 1 spacecraft in 1985.[31]

Design

The main components of Ingenuity

The lower gravity of Mars (about a third of Earth's) only partially offsets the thinness of the planet's 95% carbon dioxide atmosphere,[36] which makes it much harder for an aircraft to generate adequate lift. The atmospheric density at the Martian surface is about 1⁄100 that of Earth's at sea level, approximately the same as at an altitude of 87,000 ft (27,000 m) on Earth, a height never reached by existing helicopters. To keep Ingenuity aloft, its specially shaped, oversized blades must rotate at between 2,400 and 2,900 rpm, or about 10 times faster[2] than what is needed on Earth.[37][38] The helicopter uses contra-rotating coaxial rotors about 1.2 m (4 ft) in diameter. Each rotor is controlled by a separate swashplate that can affect both collective and cyclic pitch.[39]
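A quick back-of-the-envelope check of the "about 10 times faster" figure: for a rotor of fixed geometry and blade pitch, thrust scales roughly with air density times rotor speed squared, so producing Earth-like thrust in air about 1/100 as dense requires roughly the square root of 100, or 10 times, the rotor speed. The sketch below works through that arithmetic; the Earth-reference rpm is an illustrative assumption, and the real design also accounts for blade size, tip Mach number and the lower Martian gravity.

```python
# Back-of-the-envelope check of the "about 10 times faster" figure above.
# For a rotor of fixed geometry and blade pitch, thrust scales roughly with
# air density times rotor speed squared (T ~ rho * omega^2), so matching
# Earth-level thrust in air ~1/100 as dense needs ~sqrt(100) = 10x the speed.
# This is not JPL's rotor model, which also accounts for blade size, tip Mach
# number and the lower Martian gravity.
import math

density_ratio = 1 / 100              # Mars surface density relative to Earth sea level (from the text)
rpm_scale = math.sqrt(1 / density_ratio)
print(f"required rotor-speed increase: about {rpm_scale:.0f}x")

earth_hover_rpm = 250                # illustrative assumption for a comparable rotor on Earth
print(f"e.g. {earth_hover_rpm} rpm on Earth -> ~{earth_hover_rpm * rpm_scale:.0f} rpm on Mars")
```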

There are two cameras on board: the downward-looking black-and-white navigation camera (NAV) and the color camera to make terrain images for return to Earth (RTE).[21] Although it is an aircraft, it was constructed to spacecraft specifications in order to endure the acceleration and vibrations during launch.[38] It also includes radiation-resistant systems capable of operating in the environment of Mars. The inconsistent Mars magnetic field precludes the use of a compass for navigation, so Ingenuity relies upon different sensors grouped in two assemblies. All sensors are commercial off-the-shelf units.

The Upper Sensor Assembly, with associated vibration isolation elements, is mounted on the mast close to the center of mass of the vehicle to minimize the effects of angular rates and accelerations. It consists of a cellphone-grade Bosch BMI-160 inertial measurement unit (IMU) and an inclinometer (Murata SCA100T-D02), which is used only on the ground prior to flight to calibrate the IMU accelerometer biases. The Lower Sensor Assembly consists of an altimeter (Garmin LIDAR Lite v3), both of the cameras and a secondary IMU, all mounted directly onto the Electronics Core Module rather than onto the mast. The down-facing Omnivision OV7251 camera supports visual odometry, in which images are processed to produce navigation solutions that calculate helicopter position, velocity, attitude, and other variables.[21]
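The sketch below illustrates the general idea of camera-based motion estimation from down-looking images: track features between two consecutive nadir frames and scale the mean pixel shift by the altitude under a pinhole-camera model. It is a conceptual OpenCV example, not Ingenuity's navigation code; the focal length, altitude and synthetic frames are assumptions.

```python
# Conceptual sketch of visual-odometry-style motion estimation from nadir
# images: track features between two frames with OpenCV and convert the mean
# pixel shift into meters using altitude and a pinhole model. Illustrative
# only, not Ingenuity's flight software; all parameter values are assumptions.
import cv2
import numpy as np

def estimate_ground_shift(prev_gray, curr_gray, altitude_m, focal_length_px):
    """Apparent ground-feature shift (dx, dy) in meters between two nadir images.

    The camera's own motion over the ground is the negative of this value.
    """
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return None
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    good = status.ravel() == 1
    if good.sum() < 10:
        return None
    flow_px = (curr_pts[good] - prev_pts[good]).reshape(-1, 2).mean(axis=0)
    # Pinhole geometry over flat terrain: ground shift = pixel shift * h / f.
    return flow_px * altitude_m / focal_length_px

# Example with two synthetic frames: smooth random texture shifted by 5 pixels.
frame0 = cv2.GaussianBlur(np.random.randint(0, 255, (480, 640), dtype=np.uint8), (0, 0), 3)
frame1 = np.roll(frame0, shift=5, axis=1)
print(estimate_ground_shift(frame0, frame1, altitude_m=5.0, focal_length_px=300.0))
```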

The helicopter uses solar panels to recharge its batteries, which are six Sony Li-ion cells with 35–40 Wh (130–140 kJ) of energy capacity[35] (nameplate capacity of 2 Ah).[21] Flight duration is constrained not by the available power but by the motors, which heat up by about one degree Celsius every second of flight.[40]

The helicopter uses a Qualcomm Snapdragon 801 processor with a Linux operating system.[41] Among other functions, this processor controls the visual navigation algorithm via a velocity estimate derived from terrain features tracked with the navigation camera.[42] The Qualcomm processor is connected to two flight-control microcontroller units (MCUs) to perform the necessary flight-control functions.[21]

The telecommunication system consists of two identical radios with monopole antennae that support the data exchange between the helicopter and the rover. The radio link is built upon the low-power Zigbee communication protocols, implemented via 914 MHz SiFlex 02 chipsets mounted in both the rover and the helicopter. The communication system is designed to relay data at 250 kbit/s over distances of up to 1,000 m (3,300 ft). The helicopter's antenna, located on its solar panel, weighs 4 grams and radiates equally in all directions.[43]
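To get a feel for what a 250 kbit/s link means in practice, the snippet below estimates how long one compressed color image would take to move from the helicopter to its base station on the rover. The image size is an assumed figure, not a published one, and protocol overhead is ignored.

```python
# A feel for the 250 kbit/s link: time to move one compressed color image from
# the helicopter to its base station. The image size is an illustrative
# assumption, and protocol overhead and retransmissions are ignored.
LINK_RATE_BITS_PER_S = 250_000          # from the text: 250 kbit/s
image_size_bytes = 5 * 1024 * 1024      # assumed ~5 MB compressed color image

transfer_time_s = image_size_bytes * 8 / LINK_RATE_BITS_PER_S
print(f"transfer time for one image: ~{transfer_time_s:.0f} s (~{transfer_time_s / 60:.1f} minutes)")
```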

The Mars Helicopter team in 2018

Some of the Ingenuity team in 2019

The team

The history of the Mars Helicopter team dates back to 2012, when MiMi Aung was leading then JPL director Charles Elachi on a tour of the Autonomous Systems Division. Looking at the drones demonstrating onboard navigation algorithms in one of the labs, Elachi asked, "Hey, why don't we do that on Mars?" Engineer Bob Balaram briefed Elachi about feasibility, and a week later Elachi told him, "Okay, I’ve got some study money for you". By January 2015 NASA agreed to fund the development of a full-size model, which came to be known as the “risk reduction” vehicle. As project manager, Aung assembled a multidisciplinary team of scientists, engineers, and technicians leveraging all of NASA's expertise.[44]

The JPL team was never larger than 65 full-time-equivalent employees, but program workers at AeroVironment and NASA AMES and Langley research centers brought the total to 150.[44] Team members include:

  • MiMi Aung — Ingenuity Mars Helicopter Project Manager at NASA's Jet Propulsion Laboratory, "the Mars Helicopter Scout proposal lead"[44]
  • Bob Balaram — Chief Engineer
  • Teddy Tzanetos — Operations Lead
  • Håvard Fjær Grip — Chief Pilot
  • Timothy Canham - Flight Software Lead and Operations Lead (prior to June 2021)[59][60][61]
  • Josh Ravich — Mechanical Engineering Lead
  • Nacer Chahat — Senior antenna/microwave engineer (designed the antennae supporting the radio link on both Ingenuity and Perseverance)[43]

On June 15, 2021, the team behind Ingenuity was named the 2021 winner of the John L. “Jack” Swigert, Jr. Award for Space Exploration from the Space Foundation.[64]

Conceptual design

NASA's JPL and AeroVironment published the conceptual design in 2014 for a scout helicopter to accompany a rover.[14][65][66] By mid-2016, $15 million was being requested to continue development of the helicopter.[67] By December 2017, engineering models of the vehicle had been tested in a simulated martian atmosphere[21][2] and models were undergoing testing in the Arctic, but its inclusion in the mission had not yet been approved or funded.[68] The United States federal budget, announced in March 2018, provided $23 million for the helicopter for one year,[69][70] and it was announced on May 11, 2018, that the helicopter could be developed and tested in time to be included in the Mars 2020 mission.[71] The helicopter underwent extensive flight-dynamics and environment testing,[21][72] and was mounted on the underside of the Perseverance rover in August 2019.[73] NASA spent about $80 million to build Ingenuity and about $5 million to operate the helicopter.[74]

In April 2020, the vehicle was named Ingenuity by Vaneeza Rupani, a girl in the 11th grade at Tuscaloosa County High School in Northport, Alabama, who submitted an essay into NASA's "Name the Rover" contest.[75][76] Known in planning stages as the Mars Helicopter Scout,[33] or simply the Mars Helicopter,[3] the nickname Ginny later entered use in parallel to the parent rover Perseverance being affectionately referred to as Percy.[77]

Ingenuity was designed to be a technology demonstrator by JPL to assess whether such a vehicle could fly safely. Before it was built, launched and landed, scientists and managers expressed hope that helicopters could provide better mapping and guidance that would give future mission controllers more information to help with travel routes, planning and hazard avoidance.[78][79][80] Based on the performance of previous rovers through Curiosity, it was assumed that such aerial scouting might enable future rovers to safely drive up to three times as far per sol.[81][82] However, the new AutoNav capability on Perseverance significantly reduced this advantage, allowing the rover to cover more than 100 meters per sol.

Preliminary tests on Earth

In 2019, preliminary designs of Ingenuity were tested on Earth in simulated Mars atmospheric and gravity conditions. For flight testing, a large vacuum chamber was used to simulate the very low pressure of the atmosphere of Mars – filled with carbon dioxide to approximately 0.60% (about 1⁄160) of standard atmospheric pressure at sea level on Earth – which is roughly equivalent to a helicopter flying at 34,000 m (112,000 ft) altitude in the atmosphere of Earth. In order to simulate the much reduced gravity field of Mars (38% of Earth's), 62% of Earth's gravity was offset by a line pulling upwards during flight tests.[35] A "wind-wall" consisting of almost 900 computer fans was used to provide wind in the chamber.[85]: 1:08:05–1:08:40 
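The gravity-offload arrangement can be sanity-checked with a little arithmetic: for Ingenuity's roughly 1.8 kg mass, the tether must carry the 62% of its Earth weight that Mars would not impose, leaving an effective weight close to what the helicopter experiences on Mars. The sketch below works through those numbers; it is a back-of-the-envelope check, not JPL's test procedure.

```python
# Worked check of the gravity-offload figure: the tether carries the 62% of
# Earth weight that Mars would not impose, so the remaining effective weight
# should roughly match the helicopter's true Mars weight. Back-of-the-envelope
# only, not JPL's test procedure.
G_EARTH = 9.81          # m/s^2
G_MARS = 3.71           # m/s^2, roughly 38% of Earth's
mass_kg = 1.8           # the "4-pound" figure quoted earlier in the document

tether_force_n = 0.62 * mass_kg * G_EARTH
effective_weight_n = mass_kg * G_EARTH - tether_force_n
print(f"upward tether force: {tether_force_n:.1f} N")
print(f"remaining effective weight: {effective_weight_n:.1f} N "
      f"(true Mars weight: {mass_kg * G_MARS:.1f} N)")
```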

Mission profile

After deployment, the rover drove approximately 100 m (330 ft) away from the drone to allow a safe flying zone.[22][23] The Ingenuity helicopter was expected to fly up to five times during a 30-day test campaign, early in the rover's mission.[1][16]

Ingenuity hanging from the belly of the Perseverance rover during deployment to the Martian surface

Each flight was planned for altitudes ranging from 3 to 5 m (10 to 16 ft) above the ground, though Ingenuity soon exceeded that planned height.[1] The first flight was a hover at an altitude of 3 m (9.8 ft), lasting about 40 seconds and including taking a picture of the rover. The first flight succeeded, and subsequent flights were increasingly ambitious as allotted time for operating the helicopter dwindled. JPL said the mission might even stop before the 30-day period ended, in the likely event that the helicopter crashed,[85]: 0:49:50–0:51:40  an outcome which did not occur. In up to 90 seconds per flight, Ingenuity could travel as far as 50 m (160 ft) downrange and then back to the starting area, though that goal was also soon exceeded with the fourth flight.[1] The helicopter uses autonomous control during its flights, which are telerobotically planned and scripted by operators at Jet Propulsion Laboratory (JPL). It communicates with the Perseverance rover directly before and after each landing.[85]: 1:20:38–1:22:20 

After the successful first three flights, the objective was changed from technology demonstration to operational demonstration. The goal shifted towards supporting the rover science mission by mapping and scouting the terrain.[86] While Ingenuity would do more to help Perseverance, the rover would pay less attention to the helicopter and stop taking pictures of it in flight. JPL managers said the photo procedure took an "enormous" amount of time, slowing the project's main mission of looking for signs of ancient life.[87] On 30 April 2021, the fourth flight successfully captured numerous color photos and explored the surface with its black-and-white navigation camera. On May 7, Ingenuity successfully flew to a new landing site.

On 5 September 2021, after successful completion of the Operations Demonstration phase, the mission was extended indefinitely.[88]

Operational history

Comparison of total distance traveled between Ingenuity and Perseverance.[a]
Tracks and locations of Perseverance and Ingenuity as of August 5, 2021[89]
Perseverance enters Séítah on sol 201

Perseverance dropped the debris shield protecting Ingenuity on March 21, 2021, and the helicopter deployed from the underside of the rover to the Martian surface on April 3, 2021.[9] That day both of the helicopter's cameras were tested, taking their first black-and-white and color photos of the floor of Jezero Crater in the shadow of the rover.[90][91]

Ingenuity's rotor blades were successfully unlocked on April 8, 2021 (mission sol 48), and the helicopter performed a low-speed rotor spin test at 50 rpm.[92][93][94][95]

A high-speed spin test was attempted on April 9, but failed due to the expiration of a watchdog timer, a software measure to protect the helicopter from incorrect operation in unforeseen conditions. On April 12, JPL said it identified a software fix to correct the problem. To save time, however, JPL decided to use a workaround procedure, which managers said had an 85% chance of succeeding and would be "the least disruptive" to the helicopter.

On April 16, 2021, Ingenuity successfully passed the full-speed 2400 rpm rotor spin test while remaining on the surface.[27] Three days later, April 19, JPL flew the helicopter for the first time. The watchdog timer problem occurred again when the fourth flight was attempted. The team rescheduled the flight, which succeeded on April 30. On June 25, JPL said it had uploaded a software update the previous week to permanently fix the watchdog problem, and that a rotor spin test and the eighth flight confirmed that the update worked.

The Ingenuity team plans to fly the helicopter every two to three weeks during its indefinitely extended mission.[88] The helicopter's longer-than-expected flying career lasted into a seasonal change on Mars, when the atmospheric density at its location became even lower. The flight team prepared by commanding Ingenuity to ground-test a faster rotor blade rotation, needed for sufficient lift. JPL said the higher planned flight speed of 2700 rpm would pose new risks, including vibration, power consumption and aerodynamic drag if the blade tips approach the speed of sound. The test speed was 2800 rpm, giving a margin for increase if the intended flight speed of 2700 is not enough. Ingenuity will face another challenge to remain functional during the Martian winter and solar conjunction, when Mars will move behind the Sun, blocking communications with Earth and forcing the rover and helicopter to halt operations. The shutdown will happen in mid-October 2021, for which preparations were to start in mid-September.[98][99] The Ingenuity Mars Helicopter will remain stationary at its location 575 feet (175 meters) away from Perseverance and communicate its status weekly to the rover for health checks.[100] If the helicopter is still responsive after enduring harsh conditions of the blackout period, JPL may continue flying it; otherwise the team will terminate the mission.[101][102]

List of flights

Flight 1: April 19, 2021 at 07:34 UTC (Sol 58)
Duration: 39.1 s; max altitude: 3 m (9.8 ft); horizontal distance: 0 m (0 ft); max groundspeed: 0 m/s (0 mph)
Route: vertical takeoff, hover, land at Wright Brothers Field (18°26′41″N 77°27′04″E / 18.44486, 77.45102)
The first powered flight by any aircraft on another planet. While hovering, it rotated in place 96 degrees in a planned maneuver. Flight data was received at 11:30 UTC.[12][103]

Flight 2: April 22, 2021 at 09:33 UTC (Sol 61)
Duration: 51.9 s; max altitude: 5 m (16 ft); horizontal distance: 4 m (13 ft) roundtrip; max groundspeed: 0.5 m/s (1.1 mph)
Route: hover, shift westward 2 m (6.6 ft), hover, return, hover, land[104] (18°26′41″N 77°27′04″E / 18.44486, 77.45102)
From its initial hover, it tilted 5 degrees, allowing the rotors to fly it 2 meters sideways. It stopped, hovered in place, and rotated counterclockwise, yawing from +90° to 0° to -90° to -180° in 3 steps, to point its color camera in various directions to take photos. After that it flew back to its takeoff location.[105]

Flight 3: April 25, 2021 at 11:31 UTC (Sol 64)
Duration: 80.3 s; max altitude: 5 m (16 ft); horizontal distance: 100 m (330 ft) roundtrip; max groundspeed: 2 m/s (4.5 mph)
Route: hover, shift northward 50 m (160 ft), return, hover, land[106] (18°26′41″N 77°27′04″E / 18.44486, 77.45101)
This was the first flight to venture some distance from the helicopter's deployment spot. It flew downrange 50 meters at a speed of two meters per second. After a short hover above the turnback point it returned to land at the departure spot.[107] Data from the flight was received at 14:16 UTC.[106]

Flight 4: first attempt on April 29, 2021[108][109] (Sol 68) failed; onboard software did not transition to flight mode.[17]
Completed April 30, 2021 at 14:49 UTC (Sol 69)
Duration: 116.9 s; max altitude: 5 m (16 ft); horizontal distance: 266 m (873 ft) roundtrip; max groundspeed: 3.5 m/s (7.8 mph)
Route: hover, shift southward 84 m (276 ft), hover, return, hover, land[110] (18°26′41″N 77°27′04″E / 18.44486, 77.45112)
Took color images while hovering at its farthest point from takeoff. During the fourth flight the Perseverance rover recorded both audio and video of Ingenuity,[111] making the helicopter the first interplanetary vehicle whose sound was heard and recorded by another interplanetary vehicle. In this flight, Ingenuity overtook Perseverance in the distance travelled during the mission.

Flight 5: May 7, 2021 at 19:26 UTC[112] (Sol 76)
Duration: 108.2 s; max altitude: 10 m (33 ft); horizontal distance: 129 m (423 ft); max groundspeed: 2 m/s (4.5 mph)
Route: hover, shift southward 129 m (423 ft), climb to 10 m (33 ft), hover, land at Airfield B (18°26′34″N 77°27′05″E / 18.44267, 77.45139)
This was the first flight to land at a new location, 129 m (423 ft) to the south. On arrival, it gained altitude, hovered, captured a few color terrain images and then landed at the new site, Airfield B.[40][113] This flight was the last in the technology demonstration phase.

Flight 6: May 23, 2021 at 5:20 UTC[114][115] (Sol 91)
Duration: 139.9 s; max altitude: 10 m (33 ft); horizontal distance: 215 m (705 ft) with direction changes; max groundspeed: 4 m/s (8.9 mph)
Route: shift southwest about 150 m (490 ft), southward about 15 m (49 ft), northeast about 50 m (160 ft), land near Airfield C (18°26′30″N 77°27′00″E / 18.44166, 77.44994)
This flight was the first in the operations demonstration phase. Towards the end of the first leg of the route a glitch occurred in the navigation image-processing system: an image was dropped, and subsequent images with incorrect timestamps caused the craft to tilt forward and backward by up to 20 degrees, with large spikes in power consumption. Ingenuity nevertheless kept flying in that condition and landed about 5 m (16 ft) away from the planned site, designated Airfield C.[116] It was also the first flight on which the helicopter had to land at an airfield that had not been surveyed by any means other than MRO orbital imagery.

Flight 7: first attempt on June 6, 2021 (Sol 105) failed; onboard software did not transition to flight mode.
Completed June 8, 2021 at 15:54 UTC[117] (Sol 107)
Duration: 62.8 s;[119] max altitude: 10 m (33 ft);[120] horizontal distance: 106 m (348 ft); max groundspeed: 4 m/s (8.9 mph)
Route: shift southward 106 m (348 ft) to land at Airfield D (18°26′24″N 77°27′01″E / 18.43988, 77.45015)
Ingenuity flew 106 m (348 ft) south to a new landing spot, Airfield D. The color camera was not used, to avoid a repeat of the flight 6 glitch.

Flight 8: June 22, 2021 at 0:27 UTC[121] (Sol 121)
Duration: 77.4 s; max altitude: 10 m (33 ft); horizontal distance: 160 m (520 ft); max groundspeed: 4 m/s (8.9 mph)
Route: shift south-southeast 160 m (520 ft) to land at Airfield E (18°26′14″N 77°27′03″E / 18.43724, 77.45079)
Ingenuity flew about 160 m (520 ft) south to land at Airfield E, about 133.5 m (438 ft) away from Perseverance. As on the previous flight, the color camera was not used, to avoid a repeat of the flight 6 glitch. The bug was fixed before flight 9.

Flight 9: July 5, 2021 at 9:03 UTC[120] (Sol 133)
Duration: 166.4 s; max altitude: 10 m (33 ft); horizontal distance: 625 m (2,051 ft); max groundspeed: 5 m/s (11 mph)
Route: shift southwest 625 m (2,050 ft) to Airfield F (18°25′41″N 77°26′44″E / 18.42809, 77.44545)
Ingenuity flew a record 625 m (2,050 ft) southwest over Séítah, a prospective research location in Jezero crater, at a record speed of five meters per second. This was a risky flight that strained the navigation system, which assumes flat ground while Séítah has uneven sand dunes; the risk was partly mitigated by flying slower over the more challenging regions. Due to the resulting errors, Ingenuity landed 47 m (154 ft) from the center of the 50 m (160 ft) radius airfield. This flight made Ingenuity's travel distance exceed Perseverance's again.[122]

Flight 10: July 24, 2021 at 21:07 UTC (Sol 152)
Duration: 165.4 s;[123] max altitude: 12 m (39 ft); horizontal distance: 233 m (764 ft);[120] max groundspeed: 5 m/s (11 mph)
Route: loop south and west over Raised Ridges to Airfield G (18°25′41″N 77°26′37″E / 18.42808, 77.44373)
Ingenuity looped south and west over Raised Ridges, another prospective research location on Mars; unlike the previous one, Perseverance is planned to visit it. Ingenuity flew a total distance of 233 m (764 ft) past 10 waypoints, including takeoff and landing, at a record height of 12 m (39 ft).[124]

Flight 11: August 5, 2021 at 4:53 UTC[125] (Sol 164)
Duration: 130.9 s; max altitude: 12 m (39 ft); horizontal distance: 383 m (1,257 ft); max groundspeed: 5 m/s (11 mph)
Route: shift northwest 383 m (1,257 ft) to land at Airfield H (18°25′58″N 77°26′21″E / 18.43278, 77.43919)
This flight was primarily intended as a transition to a new takeoff point, from which the next flight, photographing the South Séítah region, was planned.

Flight 12: August 16, 2021 at 12:57 UTC[126] (Sol 174)
Duration: 169.5 s; max altitude: 10 m (33 ft); horizontal distance: ~450 m (1,480 ft) roundtrip; max groundspeed: 4.3 m/s (9.6 mph)
Route: roundtrip northeast for about 235 m (771 ft), landing again near Airfield H (18°25′58″N 77°26′21″E / 18.43268, 77.43924)
A round trip of about 235 m (771 ft) to the northeast and back. The return path was offset about 5 m (16 ft) to one side to allow another attempt at collecting paired images for stereo imagery. As a result, the helicopter landed about 25 m (82 ft) east of the takeoff point.[127]

Flight 13: September 5, 2021 at 00:10 UTC (Sol 194)
Duration: 160.5 s; max altitude: 8 m (26 ft); horizontal distance: ~210 m (690 ft) roundtrip; max groundspeed: 3.3 m/s (7.4 mph)
Route: roundtrip northeast for about 105 m (344 ft), landing again near Airfield H (18°25′58″N 77°26′21″E / 18.43285, 77.43915)
The round trip flew about 105 m (344 ft) northeast and back, concentrating on one particular ridgeline and outcrops in South Séítah.
14 September 16, 2021 (Sol 204) to September 23, 2021 (Sol 211) Flight delayed until after solar conjunction ends in mid-October, following hardware problem during initial flight attempt. Faster rotor spin at 2800 rpm successfully tested on ground. 16 ft high flight attempt at 2700 rpm automatically canceled due to servo motor anomaly. Successful servo tests ("wiggle test") by JPL in effort to diagnose the problem. Brief flight is intended to verify use of 2700 rpm rotor spin during seasonal lower atmospheric density.
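The reasoning behind the higher rotor speed in the flight 14 entry can be pictured with a first-order calculation. This is a back-of-the-envelope sketch of standard rotor-thrust scaling, not a NASA figure: thrust varies roughly with air density times the square of rotor speed, so spinning faster compensates for thinner air.

    # First-order illustration (my own rough estimate, not a published value):
    # thrust T ~ C_T * rho * A * (omega * R)^2, so holding thrust constant while
    # density rho falls requires omega ~ 1/sqrt(rho).
    rpm_old, rpm_new = 2537, 2700   # rotor speeds from the flight 14 entry above

    density_ratio = (rpm_old / rpm_new) ** 2
    print(f"density tolerated at constant thrust: {density_ratio:.3f} of the old value")
    print(f"i.e. roughly {(1 - density_ratio) * 100:.0f}% thinner air")   # about 12%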

Flight experience[b]

Flight property | Since deployment (April 3, 2021 / Sol 43) | In tech demo phase | In operations demo phase | % of work done above tech demo
Sols achieved | 189 | 31 | 158 | 403%
Number of flights | 13 | 5 | 8 | 160%
Distance flown | 2.83 km (1.76 mi) | 0.50 km (0.31 mi) | 2.33 km (1.45 mi) | 466%
Time flown | 1,469 s (24 min 29 s) | 396 s (6 min 36 s) | 1,073 s (17 min 53 s) | 271%

Ingenuity's imagery

Flight No. | Date (UTC) and Mars 2020 mission sol | B/w NAV photos | Color RTE photos | Comments
— | Before April 19, 2021 (Sol 58) | 6[91] | 6[132] | Preflight camera tests
1 | April 19, 2021 (Sol 58) | 15 | — |
2 | April 22, 2021 (Sol 61) | 17 | 3 | The first color photo session
3 | April 25, 2021 (Sol 64) | 24 | 4 |
4 | April 30, 2021 (Sol 69) | 62 | 5 | ...
5 | May 7, 2021 (Sol 76) | 128 | 6 |
6 | May 23, 2021 (Sol 91) | 106 | 8 |
7 | June 8, 2021 (Sol 107) | 72 | 0 | RTE camera was turned off
8 | June 22, 2021 (Sol 121) | 186 | 0 |
9 | July 5, 2021 (Sol 133) | 193 | 10 |
10 | July 24, 2021 (Sol 152) | 190 | 10 | Five pairs of color images of Raised Ridges taken to make anaglyphs
11 | August 5, 2021 (Sol 164) | 194 | 10 |
12 | August 16, 2021 (Sol 174) | 197[133] | 10 | Five pairs of color images of Séítah taken to make anaglyphs
13 | September 5, 2021 (Sol 193) | 191[134] | 10 |

Ingenuity has two commercial off-the-shelf (COTS) cameras on board. The Sony IMX 214, a 4208 × 3120 pixel color rolling-shutter camera, takes terrain images for return to Earth (RTE). The OmniVision OV7251, a 640 × 480 (VGA) black-and-white global-shutter camera, is the downward-looking navigation camera (NAV), which supplies the helicopter's onboard computer with the raw data essential for flight control.[21]

While the RTE color camera is not needed for flight and can be switched off (as in flights 7 and 8), the NAV camera operates throughout each flight, capturing its first frame before takeoff and its last after landing. Its frame rate is synchronized with blade rotation to ease real-time onboard image processing.

During flight, NAV frames are stored on the helicopter's onboard computer, each assigned a unique timestamp at the moment of capture. The loss of a single NAV image's timestamp was the anomaly that caused the helicopter to move erratically during flight 6.
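To make that failure mode concrete, the toy calculation below shows how a mis-attributed timestamp skews a velocity estimate derived from successive navigation frames. It is a minimal illustration with invented numbers, not Ingenuity's actual flight software.

    # Toy example (invented numbers, not flight code): a dropped NAV frame whose
    # timestamp is inherited by the next frame halves the apparent time step,
    # so the velocity estimate used for control roughly doubles.

    def velocity_estimate(displacement_m, t_prev_s, t_curr_s):
        """Groundspeed inferred from feature displacement between two frames."""
        dt = t_curr_s - t_prev_s
        if dt <= 0:
            raise ValueError("non-increasing timestamps")
        return displacement_m / dt

    # Frames nominally ~0.033 s apart; one frame is lost but its timestamp is reused.
    correct = velocity_estimate(0.13, t_prev_s=10.000, t_curr_s=10.066)  # ~1.97 m/s
    skewed  = velocity_estimate(0.13, t_prev_s=10.000, t_curr_s=10.033)  # ~3.94 m/s
    print(f"correct: {correct:.2f} m/s, with stale timestamp: {skewed:.2f} m/s")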

The monopole antenna of the base station is mounted on a bracket at the right rear of the rover

The longer a flight lasts, the more NAV photos must be stored: each new record flight duration automatically means a record number of NAV images. The frequency and timing of the camera's operation are set by technical necessity, not for the sake of records, and even a large number of NAV files does not overload the helicopter's local storage. Starting with flight 8, fewer than 200 NAV files have been uploaded to NASA storage after each flight, with a total package size of only about 5 megabytes.[133] The limits are imposed by the weak local telecommunications link: once landed, the helicopter relays data to the rover at a slow rate of 20 kbit/s.[21] A further inconvenience is the location of the antenna on the side of the rover: if the rover's massive metal body is turned the wrong way relative to the helicopter, it can impede signal propagation.
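As a rough check of those figures (assuming decimal megabytes and ignoring protocol overhead and retransmissions), a 5 MB post-flight package at 20 kbit/s takes on the order of half an hour to relay:

    # Rough transfer-time check for the figures above (decimal megabytes,
    # protocol overhead and retransmissions ignored).
    package_bytes = 5_000_000      # ~5 MB of post-flight image files
    link_rate_bps = 20_000         # 20 kbit/s helicopter-to-rover relay rate

    transfer_s = package_bytes * 8 / link_rate_bps
    print(f"{transfer_s:.0f} s  (about {transfer_s / 60:.0f} minutes)")  # 2000 s, ~33 min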

In fact, most of the NAV files are not transmitted to the rover base station for return to Earth. After the fourth flight, MiMi Aung confirmed that "images from that navigation camera are typically used by Ingenuity's flight controller and then thrown away unless we specifically tell the helicopter to store them for later use". Of the more than 4,000 NAV frames acquired during flight 4, only 62 were stored.[135]

With the end of the flight technology demonstration, Perseverance project manager Jennifer Trosper relinquished her team's responsibility for photographing Ingenuity so the rover could concentrate exclusively on its science mission of searching for signs of ancient Martian life. Without pictures from the rover, the flight team has relied more heavily on photos taken by the helicopter's NAV camera to confirm Ingenuity's location. The helicopter, however, does not create or refine maps itself; it depends on work coordinated by the U.S. Geological Survey's Astrogeology Science Center and performed by the NASA Mars and Lunar Cartography Working Groups.[citation needed]

To support the Mars 2020 mission, USGS used images from the Context Camera (CTX) and the High Resolution Imaging Science Experiment (HiRISE) on the Mars Reconnaissance Orbiter (MRO) to produce digital terrain models (DTMs) and orthoimage mosaics. Those products were used by the Terrain Relative Navigation (TRN) system on the Perseverance descent vehicle and helped determine the safest landing location.[136] Planetary cartographers manually correlate terrain features seen in Ingenuity's small, lens-distorted NAV images with maps created from imagery and radar elevation data previously acquired by MRO and other NASA missions.[citation needed] After each NAV frame is assigned a georeference, the resulting flight maps are shown on NASA's Mars 2020 tracking service.[89] NAV frames from Ingenuity are also used to produce moving images that show the Martian terrain passing beneath the helicopter during its flights.
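The georeferencing step can be pictured with a small sketch. The workflow and all numbers below are assumptions made for illustration (hypothetical tie points, not actual USGS tooling or data): once an operator matches a few features between a NAV frame and an orthoimage, a least-squares affine fit maps every pixel of the frame to map coordinates.

    # Hypothetical tie points and coordinates, for illustration only: fit a
    # least-squares affine transform from NAV pixel (col, row) to map (x, y),
    # then georeference any pixel in the frame with it.
    import numpy as np

    pixels = np.array([[120, 80], [510, 95], [300, 400], [60, 430]], dtype=float)
    map_xy = np.array([[775412.1, 2043390.5], [775431.8, 2043389.2],
                       [775421.3, 2043373.9], [775409.0, 2043372.4]])

    design = np.hstack([pixels, np.ones((len(pixels), 1))])   # rows of [col, row, 1]
    affine, *_ = np.linalg.lstsq(design, map_xy, rcond=None)  # 3x2 transform

    def georeference(col, row):
        """Map a NAV-frame pixel to map coordinates using the fitted transform."""
        return np.array([col, row, 1.0]) @ affine

    print(georeference(320, 240))   # approximate map position of the frame center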

Flight 3 (April 25, 2021)

Flight 4 (April 30, 2021)

Flight 5 (May 7, 2021)

Flight 6 (May 23, 2021)
last 39 seconds

Flight 7 (June 8, 2021)
48 sec real-time animation

Flight 8 (June 22, 2021)
75 sec real-time animation

Flight 9 (July 5, 2021)
full real-time animation

Flight 10 (July 24, 2021)
full real-time animation

Flight 11 (August 5, 2021)
full real-time animation

Flight 12 (August 16, 2021)
full real-time animation

Unlike Perseverance, Ingenuity does not carry a dedicated stereo camera that simultaneously takes the twin photos needed for 3D images. The helicopter has instead made such images by taking duplicate color photos of the same terrain while hovering in slightly offset positions, as in flight 11, or by taking an offset picture on the return leg of a round-trip flight, as in flight 12.[137]
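For illustration, a red-cyan anaglyph of the kind described above can be composed from two such offset color frames in a few lines. The file names and the assumption that the frames are already co-aligned are hypothetical:

    # left.png / right.png are hypothetical file names for two already co-aligned
    # color frames taken from slightly offset positions; the red channel of one
    # and the green/blue channels of the other form a red-cyan anaglyph.
    import numpy as np
    from PIL import Image

    left  = np.asarray(Image.open("left.png").convert("RGB"), dtype=np.uint8)
    right = np.asarray(Image.open("right.png").convert("RGB"), dtype=np.uint8)

    anaglyph = np.empty_like(left)
    anaglyph[..., 0]  = left[..., 0]     # red from the left-eye image
    anaglyph[..., 1:] = right[..., 1:]   # green and blue from the right-eye image

    Image.fromarray(anaglyph).save("anaglyph.png")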

As of August 24, 2021, 1390 black-and-white images from the navigation camera[131] and 72 color images from the terrain camera (RTE)[138] have been published.

Tributes to the Wright brothers

NASA and JPL officials described the first Ingenuity flight as their "Wright Brothers moment", by analogy to the first successful airplane flight on Earth.[30][139] A small piece of the wing cloth from the Wright brothers' 1903 Wright Flyer is attached to a cable underneath Ingenuity's solar panel.[140] In 1969, Apollo 11's Neil Armstrong carried a similar Wright Flyer artifact to the Moon in the Lunar Module Eagle.

NASA named Ingenuity's first takeoff and landing airstrip Wright Brothers Field; ICAO, the UN civil-aviation agency, gave the field the airport code JZRO (for Jezero Crater)[141] and assigned the helicopter itself the type designator IGY, with call sign INGENUITY.[142][143][144]

Future Mars helicopter design iteration

Mars Science Helicopter, Ingenuity's proposed successor

The Ingenuity technology demonstrator could form the foundation on which more capable aircraft, such as the Mars Science Helicopter, might be developed for aerial exploration of Mars and other planetary targets with an atmosphere.[78][21][145] The next generation of rotorcraft could be in the 5 to 30 kg (11 to 66 lb) range, with science payloads of 0.5 to 5 kg (1.1 to 11.0 lb).[10] These potential aircraft could communicate directly with an orbiter and may or may not continue to work with a landed asset.[23] Future helicopters could be used to explore special regions with exposed water ice or brines, where Martian microbial life could potentially survive.[74][21]

Data collected by Ingenuity is supporting the planning of a future helicopter design by engineers at JPL, NASA's Ames Research Center and AeroVironment. The Mars Science Helicopter, Ingenuity's proposed successor, would be a hexacopter (a six-rotor helicopter) with a mass of about 30 kg (66 lb), compared with Ingenuity's 1.8 kg (4.0 lb). It could carry as much as 5 kg (11 lb) of science payload and fly up to 10 km (6.2 mi) per flight.[10]

Gallery

Audio

Mars helicopter Ingenuity, heard flying on Mars during its fourth flight

Videos

Maps of flights

The Wright Brothers Field and the overlook location

The Wright Brothers Field

View of the field from the rover

Rover track and Wright Brothers Field

Flights 1–9

Profile of flight 10

Profile of flight 11

Flights 1–11

Images by Perseverance

Ingenuity's first flight
(19 April 2021)

Ingenuity's first flight, after 30 seconds of flight

Ingenuity's second flight
(22 April 2021)

Ingenuity's third flight
(25 April 2021)

Ingenuity after its third flight

Ingenuity's fourth flight
(30 April 2021)

Ingenuity during its fifth flight to Airfield B
(7 May 2021)[113]

Ingenuity landing at Airfield B at the end of its fifth flight (7 May 2021)

Ingenuity one day after its sixth flight (Sol 92)

Ingenuity four days after its seventh flight (Sol 111)

Ingenuity seven days after its eighth flight (Sol 127)

Additional images about the flights

Aircraft certification of Ingenuity to fly on Mars

Source: https://en.wikipedia.org/wiki/Ingenuity_(helicopter)

