Comments by "Tony Wilson" (@tonywilson4713) on "Mentour Pilot"
channel.
-
@MentourPilot Sorry if I'm coming to this one late, but considering where we are with the Max-8 issues and a host of other issues in 2021, there are a couple of things to be said in addition. Also my apologies if this is a bit longer than a normal YouTube comment.
I have a degree in aerospace engineering and a private pilot's license. I have worked in industrial control and automation systems for 30+ years, including a fair amount of work with robotics.
Along the way I became certified in industrial safety systems, which are the systems that cut in and shut things down when they get too far out of normal, which is very similar to what MCAS in the Max-8 was meant to do. It's an area we call "functional safety" and a lot of it came out of the aerospace industry: things like sensor redundancy, multi-CPU self-checking systems and MooN (M-out-of-N) voting methods.
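To give a rough idea of what a MooN scheme means in practice, here's a minimal sketch of 2-out-of-3 (2oo3) voting on redundant sensors. It's purely illustrative; the names, values and threshold are made up, not taken from any real system.

```python
# Minimal 2oo3 (2-out-of-3) voting sketch on redundant sensors.
# Purely illustrative - names and thresholds are invented for the example.

def vote_2oo3(readings, limit):
    """Trip only if at least 2 of the 3 redundant sensors exceed the limit."""
    exceeded = sum(1 for r in readings if r > limit)
    return exceeded >= 2

# One faulty sensor reading high gets outvoted by the other two:
print(vote_2oo3([4.2, 4.5, 25.0], limit=15.0))    # False -> no action, likely sensor fault
# When a genuine majority agrees, the safety function acts:
print(vote_2oo3([18.3, 17.9, 25.0], limit=15.0))  # True  -> shut down / intervene
```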
I have at times written the sorts of algorithms for plant and machinery that others could classify as AI. I wouldn't call what I have done AI, because I know that NOTHING anyone has ever done in terms of algorithms comes close to thinking or reasoning. When engineers talk about "learning" and "teaching" we ARE NOT talking about teaching or learning as a human does. We are generally talking about an algorithm that can re-tune its parameters for the specific task at hand. I most commonly see this in simple control loops like temperature control.
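To show what I mean by re-tuning rather than real "learning", here's a toy sketch of a temperature loop that nudges its own proportional gain when the error keeps staying large. Everything in it (the plant model, gains, step sizes) is invented purely for the illustration.

```python
# Toy illustration only: a proportional temperature controller that
# "learns" by re-tuning its own gain from the error it keeps observing.
# The plant response, gains and numbers are all invented.

def simulate(setpoint=70.0, steps=50):
    temp = 20.0   # starting temperature
    gain = 0.05   # initial proportional gain, deliberately too weak
    for _ in range(steps):
        error = setpoint - temp
        heater = gain * error                          # control action
        temp += 0.5 * heater - 0.02 * (temp - 20.0)    # toy plant: heating minus heat loss
        if abs(error) > 5.0:
            gain *= 1.05   # the "learning": if the error stays large, nudge the gain up
    return round(temp, 1), round(gain, 3)

print(simulate())  # temperature climbs toward the setpoint as the gain re-tunes itself
```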
It's quite possibly a poor choice of words, and there's a complete lack of understanding in the general community about what algorithms are and how they work. There's also been a horrendous over-selling by the computer and IT industries of what AI is and what it can do. People like me absolutely hate that, because more than anything else those people are selling a lie. And those sorts of lies have consequences, as we all found out with the Max-8.
The way these algorithms work is they AIM to MIMIC how a human MIGHT manually do something, like the way a pilot adjusts the trim. You adjust it a bit and then see what happens. You then readjust, see what happens, and keep doing that until the plane is trimmed. We can do that in software because it's a well-defined task. I know how these types of algorithms work because I have written them to do unusual things like pH control in waste water treatment.
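Here's what that adjust-and-observe loop looks like stripped right down. None of this is a real flight control law; the "aircraft" is a one-line toy model, just to show the structure of the loop.

```python
# Adjust-observe-readjust, the way a pilot trims. The "aircraft" is a toy
# one-line model, not real flight dynamics - illustration only.

def pitch_tendency(trim):
    """Toy model: how much the nose wants to drift for a given trim setting."""
    return 2.0 - 0.8 * trim   # in trim when this is ~0 (around trim = 2.5 here)

def trim_aircraft(step=0.1, tolerance=0.05):
    trim = 0.0
    while abs(pitch_tendency(trim)) > tolerance:             # observe
        trim += step if pitch_tendency(trim) > 0 else -step  # adjust a bit...
        # ...then go around the loop and observe again
    return trim

print(trim_aircraft())  # settles near 2.5 in this toy model
```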
These sorts of algorithms work in well-defined tasks, BUT they fall over in complex tasks like driving a car. It's not been widely advertised, but they all gave up on driverless cars and trucks about 2 years ago because they eventually realised just how complex the task was and how near impossible it would prove to test.
This is why an autopilot in a plane works. It's a very well-defined task: keep the plane on this heading and this altitude at this speed. Driving a car down a street sounds much simpler until you start considering ALL the possibilities of weather, parked cars, light and pedestrians. Why has there never been an autopilot that could taxi a plane from the apron to the runway? Consider how many things a computer would have to deal with just taxiing out to the runway. For starters you'd have to map every bay and taxiway at every airport your aircraft could be used at. You'd then also have to be able to consider every permutation of what other aircraft and vehicles could be doing at all of those airports. It's the variety of circumstances, and how abstract the task is, that brings these systems down.
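To make the "well-defined task" point concrete: the whole heading-hold job is one setpoint, one measurement and one output. The sketch below is purely illustrative; the gain and the toy aircraft response are invented, but it shows how small the problem really is compared with mapping every taxiway on earth.

```python
# Illustrative heading-hold loop. The entire task is defined by one setpoint,
# one measurement and one output - nothing like the open-ended taxiing problem.
# The gain and the toy aircraft response are invented for the example.

def heading_hold(current, target, steps=60, gain=0.5):
    for _ in range(steps):
        error = (target - current + 180.0) % 360.0 - 180.0  # shortest-way error, degrees
        bank_command = gain * error                         # bank proportional to the error
        current = (current + 0.2 * bank_command) % 360.0    # toy response to the bank command
    return round(current, 1)

print(heading_hold(current=350.0, target=30.0))  # turns the short way, through north, to ~030
```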
The human brain is incredibly good at abstract situations: it can evaluate extraordinary amounts of data per second and eliminate most of it on the fly as irrelevant or less important. Computers as we know them just can't do that. There simply isn't a camera or evaluation system that's even the tiniest fraction of the capability of the human visual cortex. BUT keeping a plane on heading, altitude and speed is a very well-defined task with almost no inputs other than altitude, heading and speed. There is a possibility that quantum computers will be able to do these sorts of tasks, because it's thought they will be able to evaluate vast amounts of data in parallel and handle abstract tasks.
So the bottom line is there is nothing in the current technology of AI that's even the tiniest fraction of what would be needed to replace you in the pilot's seat. And with the safety requirements for redundancy, your co-pilot doesn't need to worry either. 👍😀