When technology kills

There’s a very sad report out of Canada, one that might have repercussions in America as well.  It’s an analysis of what caused the crash of a Sikorsky CH-148 Cyclone helicopter last year, killing everyone aboard.

As a pilot guided one of Canada’s navy helicopters up into a tight turn, neither his training nor cockpit indicators warned of how a built-in autopilot would take control and plunge the Cyclone into the Ionian Sea, a military report has concluded.

All six Canadian Forces members on board died in the crash on April 29, 2020.

. . .

The report … said testing wasn’t done during the aircraft’s certification to identify what would happen if a pilot overrode the autopilot more than “momentarily” and in certain complex situations.

“The automation principles and philosophy that governed the Cyclone’s design never intended for the [autopilot] to be overridden for extended periods of time, and therefore this was never tested,” it said.

This was the case even though — as the report stated — pilots are known on occasion to override the autopilot system without manually pressing a button on their control stick, called the cyclic.

. . .

That crash caused the worst single-day loss of life for the Canadian Armed Forces since six soldiers were killed in a roadside bombing in Afghanistan on July 4, 2007.

The report indicated the crash might have been averted if the pilot had manually chosen to turn off the autopilot during the turn. But it also stated that it wasn’t unusual for pilots to override the autopilot and there were no explicit instructions in the manuals on the necessity to manually turn off the flight director.

In addition, the report said the pilot appeared unaware the computer would attempt to regain control near the end of the turn.

When the helicopter flipped around, the report said, the pilot pulled back as far as he could on the cyclic, attempting to right the aircraft that the computer was flying into the sea. Within seconds, the helicopter hit the ocean with massive force.

The board of inquiry said it found no evidence the flying pilot recognized he had lost control of the aircraft until it was too late.

Critical to the crash, the report said, was the aircraft’s software, which was certified by the military. If the autopilot is overridden, the computer accumulates digital commands, referred to as “command bias accumulation.” The more commands a pilot sends manually to the computer while the aircraft is coupled with the autopilot, the more this bias accumulation occurs, the report said.

After a pilot overrides the air speed set by the autopilot, a “feed forward loop” occurs, the report said, adding that in some situations, “the pilot’s ability to control the aircraft will be reduced or lost.”
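
In plain software terms, that “command bias accumulation” sounds a lot like classic integrator windup in a feedback control loop.  Here’s a minimal Python sketch of the idea – purely illustrative, with invented gains and numbers, and emphatically not anything from the Cyclone’s actual flight software:

```python
# Illustrative only: "command bias accumulation" modeled as integrator
# windup in a toy PI autopilot loop. All names and values are invented.

def autopilot_step(target, actual, integral, kp=0.5, ki=0.1, dt=0.1):
    """One step of a simple PI controller trying to hold a pitch attitude."""
    error = target - actual
    integral += error * dt            # this term keeps accumulating...
    command = kp * error + ki * integral
    return command, integral

target_pitch = 0.0                    # autopilot wants level flight
actual_pitch = 20.0                   # pilot is overriding, holding the nose up
integral = 0.0

# While the pilot overrides, the error never closes, so the integral
# term silently "accumulates bias" against the pilot's inputs.
for _ in range(100):                  # roughly ten seconds of override
    command, integral = autopilot_step(target_pitch, actual_pitch, integral)

print(f"accumulated bias after the override: {integral:+.1f}")
print(f"nose-down command once the autopilot reasserts itself: {command:+.1f}")
```

The longer the override lasts, the larger the stored-up correction – and the harder the computer pushes back the instant it regains authority, which is consistent with what the report describes happening at the end of the turn.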

The board of inquiry said the pilots’ training didn’t cover “with sufficient detail” certain risks of flying the aircraft, leaving the flyers unaware the autopilot would seek to keep control of the helicopter.

There’s more at the link.

Thing is, the Cyclone is a military (Canadian-specific) version of the Sikorsky S-92 helicopter, which is in widespread commercial use.  The S-92 also serves as the foundation for the VH-92 variant, under development to replace the VH-3D variant of the Sikorsky SH-3 Sea King, which has served all US presidents since 1976.  The President’s helicopter uses the callsign Marine One when he’s aboard.  I imagine there’s a lot of urgent investigation going on right now in the VH-92 development team, to make sure that the President of the United States won’t be killed by an automated flight system error . . .

It’s reminiscent of the Airbus A320 automated flight control system, which in its earliest versions was alleged to inhibit the pilot from making emergency corrective maneuvers.  That was most famously on display in the 1988 crash of a brand-new Air France A320 at the Habsheim air show, while making a low pass.  It’s been alleged that the automated flight controls shut the pilot out of the loop and flew the aircraft into the forest.

A similar problem is alleged to have caused two fatal crashes of the new Boeing 737 Max airliner, where the computer overrode pilot inputs and caused the aircraft to crash.  Automation is also a factor in the current certification program for the new Boeing 777-9 widebody airliner.  It appears that the aircraft’s systems are still not satisfactory, according to the FAA, which says it “needs more information – including about a major software architecture called the ‘Common Core System’ (CCS) – before considering the 777-9 to be on track to certification.”

Automation can be a pilot’s worst enemy – particularly when he doesn’t know it might cancel out his decisions!

May those who died in last year’s crash rest in peace, and may their families receive what comfort they may.  One hopes that the lessons learned from this crash will make the other helicopters in the CH-148 fleet – and all other aircraft using high-automation, computerized flight control systems – that much safer.

Peter

10 comments

  1. Once again, the drive by moronic computer nerds to remove skilled humans from risky processes shows itself.

  2. Hey Peter;

    Something similar happened with Air France 447; granted, it was pilot error, but it was exacerbated by the logic system of the airplane. Most airplane mishaps are attributed to "pilot error", so they try to take the pilot out of the equation as much as possible and have the computer fly the aircraft, not realizing that the computer has its own problems.

  3. This is why I've never sought a job creating software that could cause situations where people could be killed.

  4. Computer 'logic' and human logic are completely different…for a reason… Sigh… Once again the pilots were cut out of the development loop by engineers that knew 'better' than the pilots what needed to happen.

  5. I used to write shooting control software for the offshore oil exploration industry.
    (I say "shooting" because, for historical reasons, each sample is called a "shot".)

    Just when I thought I had accounted for every possible situation, along came a new one, or the newest project had some conditions that had never been encountered.

    I spent the better part of 10 years offshore, modifying my software according to the guys who actually did the work, and knew what the client expected.

  6. That's absolutely horrible, and should never have been allowed to happen. "They didn't test for it" is an unacceptable excuse.

    With respect to airliners, as far as I can find Airbus has made no secret of the fact that their fly-by-wire flight envelope protection system overrides the pilots in an emergency. The computer takes precedence.

    https://en.m.wikipedia.org/wiki/Flight_envelope_protection

    Personally that seems nuts, but what do I know. Computers never fail after all!

  7. I am a former USAF C-130 pilot with over 4000 flight hours. That is a horrible thing to have happened to the helicopter crew and passengers. The software team that designed that into a military helicopter should be imprisoned for negligent homicide. When I heard of that Airbus A320 crash years ago, I knew it was the computer. The pilot had done nothing unusual to cause what happened.

    I have worked in some areas I will not elaborate upon, but I have seen software engineers/programmers decide they know better than the users of the software, and thus put in limitations that are unworkable. As for aviation software, it is all over the place. It could have caused a late friend's death as the captain of an MD-11; his instincts saved him, his crew, and a couple of hundred passengers. (He died quietly and quickly in bed of a sudden stroke, years after retiring from the airline.) There was an article 20+ years ago in "IEEE Spectrum" magazine that discussed the aircraft software issue and the fact that Airbus saw the pilot as a problem and not a solution. As it stands, I would NEVER fly on a completely automated airplane, and when I have flown on Airbus I am definitely uncomfortable.

    "Pilot error" is often just an accident board's way of saying "we don't know what really happened, and the pilot is dead, so let's blame him." I have seen them try to apply it to an accident that was caused by structural failure, even though there was no way to know the problem existed.

  8. I prefer my pilots fly the aircraft except under very specific circumstances, and none of those include periods of hard maneuvering.

  9. What the FAA/DOT really doesn't like about the MAX and Boeing is Boeing's insistence that the MAX be marketed without requiring a new type rating on the pilots' certificates, which the purchaser would have to pay for. Would I hesitate to board a 737 as a pax? No, not a problem. Everyone has to die from something.

  10. The Max software has been fixed. Technically, under perfect circumstances, the original software would have been fine. The problem with the Indonesian flight was that they replaced the angle-of-attack sensor with a badly refurbished one and failed to calibrate it. MCAS suffered a case of GIGO, and the airline suffered a case of pilots "taught to the test", who were taught to let the autopilot do everything and not to understand the system.

    The Ethiopian flight possibly suffered a bird strike to the sensor. GIGO again. But there was also a bit of a cockpit resource management issue. Fixated on the problem, they ignored that they were still at full takeoff power, which contributed to their being unable to manually move the elevator trim. What doomed them was turning the electric trim back on to move the trim, but not shutting it off before MCAS had another, fatal dose of GIGO to contribute.

    The new software requires TWO sensors that agree, and can only trigger a single correction. Let's hope there's never a situation where more than one is needed.
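
    For illustration only – this isn't Boeing's code, and the sensor names, thresholds, and structure here are all invented – the kind of two-sensor cross-check described above might look roughly like this:

    ```python
    # Purely illustrative sketch of a two-sensor agreement check with a
    # one-shot correction, as described in the comment above. Not Boeing's
    # actual MCAS logic; every name and threshold here is an assumption.

    AGREEMENT_TOLERANCE_DEG = 5.5   # assumed maximum allowed disagreement
    AOA_TRIGGER_DEG = 15.0          # assumed angle-of-attack activation point

    def should_correct(aoa_left, aoa_right, already_fired):
        """Allow a nose-down correction only if both sensors agree,
        both read high, and no correction has fired this event."""
        if already_fired:
            return False            # at most one correction per event
        if abs(aoa_left - aoa_right) > AGREEMENT_TOLERANCE_DEG:
            return False            # sensors disagree: stand down
        return min(aoa_left, aoa_right) > AOA_TRIGGER_DEG

    # One failed sensor feeding garbage (GIGO) no longer triggers anything:
    print(should_correct(74.5, 15.3, already_fired=False))   # False
    print(should_correct(16.0, 16.4, already_fired=False))   # True
    ```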
