In the news today, the FAA has announced that it will seek to mandate changes to some of Boeing's 737 model aircraft in the wake of the 2009 crash (better late than never?) of a Turkish Airlines 737. From the WSJ (subscription needed):
Such problems with low airspeed—often prompted by confusion about the status of automated throttles—have been identified as major factors in some of the most infamous airliner crashes in recent years. They include last summer's crash of an Asiana Airlines Boeing 777 into a sea wall while trying to land at San Francisco International Airport. The pilots failed to notice their approach speed dropping.
The FAA acknowledges that, should the automation fail, the aircraft is likely at risk:
In its proposal, the FAA said "loss of automatic speed control" can result "in loss of control of the airplane." The agency is proposing a three-year compliance deadline, once the directive becomes final.
We are not familiar with the exact details of the proposed fix, but it likely pertains to the radio altimeter, which erroneously indicated a landing altitude while the aircraft was still on approach. The erroneous indication caused the autothrottle system to pull the throttles to idle in anticipation of landing, but the crew neither responded adequately nor disengaged the system. The airspeed subsequently decayed, and efforts to add power came too late to prevent a stall and a short landing.
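To make the failure chain concrete, here is a minimal sketch of the kind of "retard at flare" logic involved. This is purely illustrative and is in no way the actual Boeing 737 autothrottle implementation; the function name, the flare threshold, and all values shown are hypothetical.

```python
# Illustrative sketch only -- NOT actual Boeing 737 autothrottle logic.
# It models a simplified "retard" mode that idles the throttles when the
# radio altimeter reports the aircraft is in the landing flare. The
# threshold and names below are hypothetical.

RETARD_ALTITUDE_FT = 27.0  # hypothetical flare height for throttle retard

def autothrottle_command(radio_altitude_ft: float,
                         airspeed_kts: float,
                         target_speed_kts: float) -> str:
    """Return a simplified throttle command from the sensed radio altitude."""
    if radio_altitude_ft <= RETARD_ALTITUDE_FT:
        # The logic trusts the sensor: a faulty reading of, say, -8 ft while
        # the aircraft is actually at 2,000 ft on approach idles the
        # throttles here, far from the runway.
        return "RETARD (idle)"
    if airspeed_kts < target_speed_kts:
        return "ADD THRUST"
    return "HOLD"

# A failed radio altimeter feeding an erroneous "on the ground" altitude
# triggers the retard mode early, and airspeed begins to decay:
print(autothrottle_command(radio_altitude_ft=-8.0,  # erroneous sensor value
                           airspeed_kts=144.0,
                           target_speed_kts=144.0))  # -> RETARD (idle)
```

The point is not the code itself but the coupling: a single bad sensor value, trusted by the automation, is enough to command idle thrust at altitude unless the crew intervenes.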
The important point here is that the crew eventually recognized the failure, but not in time to save the approach (or their lives). This crew was flying a fully automatic approach into very low visibility conditions and was most likely concentrating on reaching the decision altitude, where they would have to choose either to land or to go around. Believing that the autothrottles "had their back" on airspeed control, they dropped airspeed out of their crosscheck.
This is the pernicious nature of automation. It is designed to be "automatic," meaning little or no human input is needed. And because it rarely, if ever, fails, humans, being human, will come to rely on it. Yet should it fail, the humans ultimately in charge are poorly suited to instantly take over in a critical phase of flight and recover the aircraft, as happened here. As former flight instructors, we always found it easier to fly an approach ourselves than to monitor one flown by a student and then to take over and recover the aircraft should the student go off course. Automation is like the perfect student, except it gives no warning of failure. At least students were known to be consistently weak or strong.
So while the FAA and industry are recognizing that there are significant drawbacks to over-reliance on automation, this FAA mandate in effect fixes a potential automation defect rather than reassessing the underlying philosophy. We expect the next decade to be a kind of "whack-a-mole" in which defects in automation reveal themselves through various incidents and accidents while efforts continue to build a better mousetrap. At some point automatic flight systems will become robust enough to obviate the "stick and rudder" skill set in the people who will man tomorrow's cockpits, but we have a way to go yet.
I welcome feedback. If you have any comments, questions or requests for future topics, please feel free to comment. Comment moderation is on to reduce spam, but I'll post all legit comments. Thanks for stopping by and don't forget to visit my Facebook page!
Capt Rob