0.5 Ways of Working
Our Guest This Episode: Dr. Te Wu
The Boeing 737 MAX 8 story started with the tragedy of Lion Air Flight 610, which crashed into the Java Sea on October 29, 2018, killing all 189 people onboard. Less than five months later, Ethiopian Airlines Flight 302 crashed just six minutes after takeoff, killing all 157 passengers and crew. Dr. Te Wu provides a project management perspective on what factors led to these tragic events, and he describes the risk factors that, in hindsight, should have been identified and corrected.
When assessing the impacts of tradeoffs, project managers must exercise responsibility in relation to schedule, budget, quality, and compliance decisions. Additionally, Te urges that one cannot turn a blind eye to the safety of the end user. Te describes how the MCAS (Maneuvering Characteristics Augmentation System) modification, as well as other cost-saving measures, contributed to these tragedies. Hear about the problem of risk in complex projects, and the practice of identifying, evaluating, and planning risk responses. We look at what can be learned from this tragedy as Te sheds light on the concept of duty of care, and he offers practical steps we can put in place to avoid these mistakes in our own projects.
Dr. Te Wu is an Assistant Professor at Montclair State University, a Visiting Professor at China Europe International Business School, and the CEO of PMO Advisory. As a certified Portfolio, Program, Project, and Risk Management Professional, Te is an active volunteer, including serving on the Project Management Institute’s Portfolio and Risk Management Core Teams and as a U.S. delegate on the ISO Technical Committee 258 for Project, Program and Portfolio Management.
Favorite Quotes from Our Talk:
"... if project managers could think about these three aspects – be realistic, be fact-driven, and be truthful about the data, at least to one’s self and team – and also be a little bit more holistic, we could solve so many problems upfront and head off the downstream challenges."
"...as a project manager, when we design products that have life at stake, or other big consequences at stake, take a step back and ask what are those risks? What can go wrong? And how can we fix or mitigate those risks?"
When assessing the impacts of tradeoffs, project managers must exercise responsibility in relation to schedule, budget, quality, and compliance decisions. Additionally, one cannot turn a blind eye to the duty of care owed to the safety of the end user. Dr. Te Wu provides a project management perspective on the factors that led to the catastrophic events surrounding the Boeing 737 MAX 8 airline tragedies.
01:33 … The Boeing 737 Max 8 Events
05:07 … Initial Investigations
07:20 … Factors Leading to the Events
10:16 … Prior Complaints
12:58 … Technical Complexity and Increased Risk
17:37 … Brewing a Perfect Storm
20:38 … MCAS Software Issues
24:35 … Lessons for the Project Manager
27:15 … Intelligent Project Manager vs. Bold Project Manager
29:58 … Duty of Care
35:07 … Latest on the 737 Max 8
36:44 … Three Responsibilities for a Project Manager
39:15 … Get in Touch with Te
39:45 … Closing
TE WU: … if project managers could think about these three aspects – be realistic, be fact-driven, and be truthful about the data, at least to one’s self and team – and also be a little bit more holistic, we could solve so many problems upfront and head off the downstream challenges.
WENDY GROUNDS: Welcome to Manage This, the podcast by project managers for project managers. I’m Wendy Grounds and here in the studio with me is Bill Yates. Today we’re talking with Professor Dr. Te Wu of Montclair State University. He’s also a visiting professor at China Europe International Business School and the CEO of PMO Advisory.
BILL YATES: As a certified portfolio, program, project, and risk management professional, Te is a very active volunteer, including serving on PMI’s Portfolio and Risk Management core teams, and he’s a U.S. delegate on the ISO Technical Committee 258 for Project, Program, and Portfolio Management.
WENDY GROUNDS: Dr. Te Wu is also going to be speaking to us today about the Boeing 737 MAX 8 story. He has done a lot of research on this project from an educational standpoint for project managers, and he has taken from it a lot of lessons that we can learn. It’s a very interesting study that he’s done, so we’re looking forward to digging a little deeper with him today.
Hi, Te. Welcome to Manage This. Thank you so much for being our guest today.
TE WU: It’s a pleasure to be here. Thank you very much for inviting me.
WENDY GROUNDS: We’re going to jump right in. And I want to ask if you can give us some details behind the story of the Boeing 737 MAX 8 events which occurred in 2018/2019?
TE WU: Sure. Absolutely. The first incident, or tragedy, happened with Indonesian Lion Air Flight 610 on October 29th, 2018, flying a very new Boeing 737 MAX plane. It was a domestic flight from Jakarta to Pangkal Pinang, and it crashed into the Java Sea about 13 minutes after taking off, killing all 189 passengers and crew members. This was the first major accident involving this new series of plane, the Boeing 737 MAX.
And as you can imagine, an accident of this magnitude triggers a number of investigations, not only from Boeing, but from the Indonesian air authority, a committee called the Indonesian National Transportation Safety Committee, or KNKT, as well as, of course, the United States FAA, the Federal Aviation Administration. At the time Boeing promised full cooperation with the investigation. But even domestically, I have airline friends, for example, and you could see that quite a number of fingers were pointing. One, it happened on foreign soil. Two, the rigor of training at Indonesian Lion Air was questioned. In some ways there was a series of, I would say, cultural mishaps trying to point toward human error rather than toward a potential problem with the plane.
And Boeing, to be fair, probably did do a good analysis, but perhaps not good enough. The reason I say that is that less than six months later the second tragedy hit the Boeing 737 MAX series. This time, on March 10th, 2019, it was Ethiopian Airlines Flight 302. In this case it was an international flight from Ethiopia to Kenya, and it crashed six minutes after taking off, killing all 157 passengers and crew members. And the most tragic fact about that crash was that the crash site was actually hard to find. It was in a mountainous area, unlike the Java Sea crash, where everything sank to the bottom of the ocean, making recovery by definition relatively hard.
In this case you would think you could find the crash site relatively quickly, and given that it was only six minutes from the airport, there was a massive search. It was hard to find because there were no observable big pieces of the crashed plane. Eventually they found the site: a crater 32 feet deep, 92 feet wide, and 131 feet long. The crash impact was so severe it caused that crater. Estimates showed that the plane was traveling at almost 600 miles per hour at the point it hit the ground; in other words, it crashed straight into the ground.
So those were the two tragedies. There were a lot of questions about what Boeing could have done better, especially after the first crash. Let’s say the first crash happened perhaps due to human error. But by the second crash, we as the world clearly knew a lot more and could have done much better.
BILL YATES: What emerged from the initial investigations into both plane crashes?
TE WU: So a couple of things emerged. It turns out that on the first crash, the crew of Lion Air 610 did not even know there was a system called MCAS on their plane. It was not in the pilot’s manual. It was in the maintenance manual, but how many pilots read maintenance manuals? There was no cockpit indicator light saying something went wrong. They just knew the plane wasn’t behaving, and they couldn’t control it. Essentially they didn’t know what went wrong, other than that the plane was out of control.
So that may be sad enough, but fast forward five months to the Ethiopian Airlines crash. In this case the pilots actually knew what went wrong. By that time Boeing had implemented training on MCAS. It was only a one-hour iPad video training, but it was enough, actually, for them to recognize the problem. The two pilots on Ethiopian Airlines, especially the chief captain, were quite experienced. So they knew what went wrong. The problem is they couldn’t keep the system disengaged. The plane became uncontrollable, and in order to regain control of the plane they had to engage it again. And when they engaged the autopilot system, they also engaged the MCAS system, which sent the plane further down. So you can argue which one’s more tragic: one crew not knowing, while the other actually knew what was going wrong, but couldn’t save the plane. And that’s really what happened.
So after the crash on March 10th, 2019, the next day the FAA reaffirmed the airworthiness of the Boeing MAX. On the same day, however, China grounded the plane. Shortly after, the Europeans and Saudi Arabia grounded the plane as well. By March 13th, three days later, our closest friend and ally, the Canadians, had also grounded the plane. Our FAA actually didn’t ground the plane willingly; it was President Trump who ordered the plane to be grounded, and hence we grounded the plane. And that started the long march for Boeing to try to recertify the plane, which they achieved almost 17 months later.
BILL YATES: Te, this is such a tragedy, these two events. It’s sobering to even talk about this. There are a lot of lessons to be learned. You know, my mind goes to risk management, goes to resource management, so many places. But let’s step into this. Let’s talk about factors first. So what factors led to these two events, these two tragedies occurring?
TE WU: That’s a really good question, and I think a lot of people are asking it. We have to look at these events at two different moments in time, because at this present moment we know quite a lot more, from multiple investigations, from the KNKT to the FAA to our Justice Department, not to mention Boeing’s internal investigation. But it’s worthwhile to look at how we responded to the tragedy in the first place. So one thing it is important to mention is that commercial airline travel is one of the safest transportation modes that we have.
BILL YATES: Right.
TE WU: And that’s clearly evident in the numbers. What happens after every incident, even a minor one, is that a slew of different bodies, from the company itself to the FAA to other regulatory bodies, come in to examine whether the parts that failed belong to a manufacturer, say General Electric, and take a very close look at the accident. That’s one reason why post-accident investigation is so important to safety.
And you can actually see that in the numbers. The last major U.S. passenger accident happened on February 12th, 2009. It was Continental Connection Flight 3407, which crashed near Buffalo, New York, killing all 49 people onboard. That was the last major one, now almost 12 or 13 years ago. And if you add up all the fatalities in U.S. commercial airline incidents from that point on, from 2009 to, let’s say, late last year, there were fewer than 40. That even includes a person who was killed after illegally walking onto an airfield and being struck by a plane. So even counting those, it’s fewer than 40.
So clearly these MAX accidents, which combined killed about 346 people, are a big deal. After the accidents with Lion Air and Ethiopian Airlines, it is worth examining the risk factors that we knew of at that moment. A couple of things really dominated the airwaves shortly after those accidents. One, the lack of understanding of the MCAS system, and this is before even getting to the lack of safety factors involved in the system itself. The fact was that the pilots didn’t know about it, or, later, had one hour of iPad training.
It’s also worth pointing out that on the same Lion Air plane that crashed as Flight 610, the crew on the immediately preceding flight had complained of the same problem, but they were able to control the system and land the plane safely. As a matter of fact, even among U.S. pilots there was a lot of anxiety around the system, because others were finding that this plane, for whatever reason, was hard to control, and they didn’t truly know the problem. If you look at the pilot unions and the conversations there, it later emerged that there were a lot of complaints, as well.
So right off, a single risk factor is training, or a lack of communication. If you are the captain of a flight responsible for 200 lives, it would be nice to know what’s in your systems and how to use them. Almost immediately after both of these accidents, there was a protectionist mentality and a somewhat dismissive attitude, because both accidents happened on foreign soil. There were immediate questions about how well each respective country and airline trained its pilots. In the U.S. press you could see comparisons of how U.S. pilots have to go through years of training and fly other commercial flights.
It is worth noting that Ethiopian Airlines is really the gem of the entire African continent; it is one of the most respected airlines there. So there was a lot of finger-pointing elsewhere. Even our FAA conducted its own analysis after Lion Air, and their statistical analysis showed that within probably less than another year there would be at least one other major accident.
Now, this gets a little bit fuzzy. In my reading I actually found different numbers. One says the FAA’s statistical model suggested another accident was likely within less than a year. Another report I read said multiple accidents were likely within a year. Even one is, frankly, quite a lot. And yet our FAA reaffirmed airworthiness after Lion Air, and reaffirmed airworthiness after the second accident. Which is a bit mindboggling, because the rest of the world was grounding the plane while our FAA still said it was good. So it couldn’t help but raise the question: are we trying to protect our commercial interests over the lives of human beings? That’s the second huge risk factor: a regulatory body that is no longer trusted.
BILL YATES: That’s true. Te, there are a couple of factors that you brought out, too. One goes right to the engineering of the 737 MAX. And there was also the factor of the competitive product, if you will, the Airbus A320 that was being developed at the same time. So we have to look at that in terms of the pressure that was put on the Boeing team to develop the 737 MAX. But let’s talk about the approach they were taking, the engineering, and the impact that the decision had in terms of having a larger engine, and having it higher up on the wing. Just from a standpoint of technical complexity, that’s got to really ramp up the risk as you’re developing a product like that, or an aircraft.
TE WU: Absolutely. And that actually requires a little bit of context setting, because the first part of your question is a really good one. Boeing actually had a lot of time to develop the 737 MAX series, but they were also working on other major projects at the time, which caused a significant delay with the Boeing 737. Essentially the wake-up call happened when Airbus, their biggest competitor, introduced the A320neo. The A320neo flies in very much the same segment as the Boeing 737. The Boeing 737, put in proper context, is the workhorse of almost any major airline, because it can carry somewhere between 150 and 200 people and fly a distance that, for most domestic routes, is perfect; it doesn’t have to land anywhere for refueling. And that is true for the Airbus series, as well.
When Airbus introduced the A320neo, it was approximately 15 to 20% more fuel efficient, and they announced it to the world. And, by the way, American Airlines, one of the biggest loyalists of Boeing, started adopting the plane. Boeing essentially panicked. They had essentially let all that development time they originally had slip by, just playing with different ideas. But to be fair, they were developing the Boeing 787 Dreamliner at the time, and there was another project around the Boeing 747-400 that they were technically finishing. So they were distracted. But all of a sudden they had this wake-up call when one of their most loyal customers started adopting the Airbus, and they launched an aggressive effort to build the 737 MAX series.
How Airbus really improved fuel efficiency, in a nutshell, is that they bolted a larger engine onto the wing, which, combined with other changes, really made the plane more fuel efficient. So Boeing in some ways followed the same game plan: they introduced a larger engine to put on the Boeing plane. The problem with the Boeing plane is that it was designed in the ‘60s, and it was designed purposely to sit closer to the ground so you can move people and luggage in and out of the body of the plane more easily. Well, now, with the bigger engine, the engine would hit the ground because you wouldn’t have the clearance.
So Boeing engineers did a lot of studies, and they finally had a solution. They made a major splash, by the way, when they announced it. The solution was to mount the engine a little bit higher, or significantly higher, on the wing, so it wouldn’t hit the ground on landing. It’s a perfect solution to a fairly old problem: the basic design of the 737 was built in the ‘60s. In some ways the latest Boeing is a design from back in the ‘60s, now combined with the latest computer sensors.
So as you can imagine, if you bolt something higher on the wing, and it’s also heavier because it’s bigger, it changes the aerodynamics of the flight. And this is when they started making adjustments. What they eventually decided to do was borrow a system they had developed first for a military plane, the KC-46 refueling tanker. As you can imagine, a refueling tanker fully loaded with fuel is a very, very heavy plane. They introduced the MCAS system as a way of making small, tiny adjustments to the plane, changing the pitch if the nose is pointing too high. This Maneuvering Characteristics Augmentation System was successfully used in that military setting.
BILL YATES: Te, by the time they implemented the MCAS into the 737 MAX what changes were made to the original design and how did this impact safety considerations?
TE WU: So, both because of the speed of trying to get the plane launched and because of aggressive cost cutting, a number of safety considerations changed between the original design of MCAS and the version implemented in the 737 MAX. And those changes, each of which perhaps seems quite reasonable in itself, started what I call “brewing a perfect storm.” A perfect storm in business, or for that matter even in life, is usually a confluence of multiple factors that people, for whatever reason, didn’t conceptually bring together.
Just the other day, for example, I was walking in the park with my daughter, and her shoelaces were untied. I don’t know why kids do not like to tie their shoelaces. I don’t think it’s just my daughter; I think it’s every kid on the planet. I had to remind her to tie them. She said, “Dad, why? I’m almost home. I’m only within a block.” And this is why I reminded her, and actually had a conversation with her about the perfect storm. A perfect storm by definition is a confluence of multiple factors. Some factors are not even well understood, yet they converge at a single point. And when multiple points of failure happen, accidents tend to ensue.
So something as simple as a shoelace: for average walking, sure, not a big deal. But what if you’re suddenly crossing the street, the light is changing, and you suddenly have to sprint? Most people don’t like to wait for the next light. And what if the road is a little bit wet from the last rain we had? This is summer, after all. And what happens if there’s a car coming, and maybe the driver, for whatever reason, got distracted by a gadget and didn’t watch the light as carefully? Now you have a loose shoelace, wet pavement, rushing across at a changing light, with an incoming car that may not see you. This is what happens in real life.
So in Boeing’s case, to me, objectively looking at it as an external party: they had a plane that they wanted to make more fuel efficient. They put on a bigger engine and mounted it higher on the wing. Makes perfect sense. To adjust for the heavier load and the changed dynamics of the plane, they introduced a system that originally was supposed to hook up to two sensors, taking readings from both to decide whether MCAS kicks in or not. But one sensor should do fine, too. Okay, so they cut out one sensor, and that could save some money. And by the way, they saved money elsewhere, too: as I mentioned, Lion Air didn’t even have a cockpit indicator, because that was treated as an enhancement. It costs more money to put in that indicator light. Combine that with the lack of training.
Oh, and I didn’t even mention the software issue with MCAS. Originally, in the military plane, it was a fairly low-key system. It works in the background, very rarely used, and only at the point of takeoff. But Boeing wanted the adjustments to be more automatic and to require less training for the pilots. Training later became a huge issue, because flight simulator training is expensive, and that adds to the total cost of ownership of the plane. So Boeing wanted to launch a plane for which the pilots wouldn’t need any further training, which is essentially what happened with Lion Air. So they made the software a little bit more aggressive: it kicks in a little sooner and makes a larger pitch adjustment, based on one sensor.
This is how I see the brewing of a perfect storm. If everything had worked to its design specification, as it should, the system would rarely be used. But that’s not the case. And when the system is activated, a series of unintended consequences starts to happen. The single angle-of-attack sensor that the system was wired to is actually on the side of the nose of the plane. It is very prone to defects. Why? Because a bird could hit it, for example.
BILL YATES: Right.
TE WU: Right? It’s on the exterior hull. So now you have a confluence of events brewing. And because Boeing at that time was trying to save money and reduce time, they tried to design the plane in almost half the usual time. They didn’t quite get to half, but that’s what they were trying to do. So a lot of factors were overlooked. Later studies have shown, and the Department of Justice actually filed suit against Boeing for this reason, that they also misled the FAA. In their conversations with the FAA, they claimed the plane’s changes were so infrequent and so minor that they did not warrant any additional training. And for whatever reason, and the FAA is not our subject today, so I’ll just leave it at that, the FAA accepted their recommendation and agreed to very little training. So now you have pilots with no training.
So those are some of the major risk factors. To put it comprehensively: one, the environmental factors led to a very aggressive project. And by the way, plenty of internal people complained about that, but the noise never bubbled outside the organization.
Number two, there was a series of design changes, from the physical design of the plane, with a bigger engine mounted higher on the wing, to software changes, to changes in how it hooks up to the angle-of-attack sensors. While each makes logical sense in its own small sphere, and I don’t think anybody was trying to design a faulty plane, I don’t think there was a comprehensive look. And that is the role of the project manager. The project manager by default is in charge of the entire project. They’re the first line of defense, the one who should have taken a step back, looked at all these changes, and asked, does this make sense?
I’ve led a lot of projects. I’m not an expert on everything, but we go ask questions; we go find others to help us answer those questions. To me, that failed, maybe because the plane was being so aggressively developed and marketed. So now you have a perfect storm brewing. Combine that with an FAA that was essentially not very effective, and you lose those guardrails, as well. And then, frankly, when the accidents did happen, a sense of cultural superiority perhaps kicked in, saying it didn’t happen on our watch, it didn’t happen with our pilots, without really examining the underlying factors more critically. Add the later layer of commercial interests and the protectionism that comes with it, and you were led, essentially, to the tragedy of two planes.
BILL YATES: Te, when we look back at this project, again, it’s sobering. And I’m thinking, okay, what lessons can we learn from this, or what can I apply in my future projects? One of the things that you’ve hit on I completely agree with. There needs to be someone who looks at this holistically; right? There needs to be a project manager. There needs to be maybe even a program manager. Someone needs to have a view of the output of the project from the view of, okay, who is going to be impacted by this? Not just my team, not just my customer, the public. Who’s going to use it?
You’re dropping a pebble in the water and seeing where those ripples go. Somebody needs to take a holistic view and like you said, is there a potential for a perfect storm brewing? What are some practical things that we can do as project managers to get our head up out of the weeds and look at that big view?
TE WU: Excellent question. This is something I think about all the time. I think there are at least three things we can do. One, for all project managers developing products and services that impact people, it’s worthwhile to take a closer look at what kinds of risks are there. Different products have inherently different risks. A plane, for example, carries people. Human lives are sacred, and that is a sacred duty. So when you design a plane or an automobile or a boat, it is worth looking at the various types of risk.
In the case of Boeing, I’m very sure Boeing engineers were aware of the engineering risk and the software risk. But what they needed to think about is what I call that “perfect storm.” Perfect storms are usually formed by a combination of known risks, which are quite manageable and fairly low in probability, with emergent risks. Emergent risks are small risks that appear out of left field, things you don’t really think about. In colloquial terms we sometimes call them “unknown unknowns.” They are not something we readily see. And when these risks do happen, the probability is fairly low, because even up to that point the Boeing 737 MAX had been flying for well over a year, across hundreds of thousands of flights.
So, yeah, an accident only happens a small percentage of the time. But when these risks do happen, the confluence creates that storm. So I think as project managers, when we design products that have lives at stake, or other big consequences at stake, we should take a step back and ask: what are those risks? What can go wrong? And how can we fix or mitigate those risks?
BILL YATES: So the first one is risk. What’s the second aspect that you consider?
TE WU: The second one is that there is a difference between what I call an intelligent project manager, let’s say the smart, intelligent one, and a bold project manager. The reason I make that distinction is that a bold project manager is going to fight through hell to get what they want. They’re given a set of directions, and they follow that set of directions. It’s like the military: if you tell a sergeant, “Go conquer that hill,” that sergeant will figure out ways to conquer that hill. That is what I call a “bold” project manager. But the intelligent project manager often takes a step back and asks: what is the purpose of our mission? Why are we conquering that hill? What about the adjacent hill? Can we do something else, not conquer a hill at all, and still achieve the strategic objectives?
Because at the end of the day, what we call “project,” “program,” and “portfolio” in project management are artificial constructs that we create for ease of management and for the application of resources. So what is Boeing’s project? Is it really just a project to develop a plane? Or is it something broader: to build something that makes human life easier, and of course makes Boeing quite a bit of money? Where’s safety in that? Project managers often have a bad reputation. I come from a lot of software projects, and our bad reputation was for getting something done, tossing it over the fence, and hoping somebody on the other side catches it and supports it. Right. That may be fine for low-stakes software, but it doesn’t work for airplanes.
So I think the second step for an intelligent project manager is to ask: what is this project? What are we trying to do? What are the end objectives? And how do we best advise? Because, again, projects are artificial. We create them; we put up the boundaries. So that’s the second one.
BILL YATES: Te, there’s one thing I want to add to that, because I agree with you. We need to be intelligent and ask the question of why. And just as you were describing that, it struck me: I think about benefits analysis and how it encourages project managers to think about the long-term benefits of the output we’re producing through the project. But there’s a flipside of that, too; right? We need to think not only about the benefits, but about the negative impacts. Okay, this could be amazing. Hey, let’s put water in these disposable containers. It’s about 16 ounces, and you can stick it in your backpack or whatever. This is awesome. Let’s go to market with it. And now we have water bottles filling the ocean.
So in those cases it’s like, okay, we need to think back, we need to pause and be intelligent, ask the question why, look at the long-term benefits, and look at the long-term impacts, too. So that’s a strong point.
TE WU: That’s actually a great segue into the third. The third concept is what I call “duty of care.” In our profession, take the Project Management Institute, we have a code of ethics. There are things we shall not do, for example, lie to our customer; right? That’s just bad management in any industry. But in project management, I think we have to put our conscience into the work that we do. We have to think about the bigger impact, the longer-term implications. And you can see examples of that all over the place.
Just take the Boeing 737 MAX. Is the project manager’s job just getting that plane out the gate? What about safety? What about quality? And what about maintainability? Duty of care means, yes, project managers may actually have to put their jobs on the line, by the way. So it is not an easy decision. But it is an obligation we have as project professionals to adhere to a standard of reasonable care. And I know you have to define what “reasonable care” is in each context. But basically: try not to harm human lives, and try to develop the goods and services that we promise with a balanced, if not minimal, level of harm to the stakeholders.
And by the way, if there is harm, then say it. Express it. Every product we develop has limitations. There is no such thing as a perfect product. That’s why we include instructions and guides on how to use it. In Boeing’s case, they tried to hide MCAS from the pilots in the beginning. Maybe “hide” is too strong, because they didn’t explicitly hide it; they just didn’t mention it. But that was an omission of such magnitude that the pilots on Lion Air Flight 610 didn’t even know the system existed. So that’s what I’m referring to with duty of care. Every plane has its own risks. Pilots have their responsibility. But for them to do their job well, they have to at least have the knowledge of what’s in the plane and how it affects it.
So this concept of duty of care is really about figuring out the right balance of reasonable care in the context of the industry, the product, and the service you’re delivering, and making sure that, if there are indeed challenges or difficulties or risks, you identify them. You communicate them and train people accordingly so they know how to react when these things happen.
BILL YATES: Yeah. This is good. It’s a different mindset, I think, for project managers. And it means as a project manager I need to be challenged to look beyond what I’m producing. I’ve got to be bold and ask questions when things seem likely to have a bad impact further down the road. And I cannot rely on other agencies or other departments, either. I think, you know, in the case of Boeing and the FAA, we can see how the oversight system had been changing. The FAA delegated responsibility to Boeing that perhaps it shouldn’t have.
And if I’m the project manager for a project, and I’m assuming somebody else is going to be checking something for me, or some other department is going to be responsible for this potentially negative impact that I’m creating, I can’t do that. I can’t have that attitude. I need to own that as well, and I need to help analyze it. These are thought-provoking points.
TE WU: I mean, I think the reality of today’s contemporary business setting is that it’s highly competitive. And I don’t want to put so much burden on the project manager that the job becomes heavy, weighed down by responsibility. But what is important for project managers is to define very clear accountabilities and responsibilities. They should also make sure that the handoff from one party to another actually happens. For certain products, such as a plane, redundancies are paramount; otherwise people are essentially checking their own work. Right? In other cases that may not be true. But that doesn’t mean there should be any lack of communication.
One of the fears I have, and I see it in our society all the time, is with the latest data breaches, for example. When you read about a recent hacking, you find that companies failed to take commonsense security measures that you and I probably take for granted at this point, such as passcode authentication. But they don’t. In this day and age, I’m not sure that’s excusable anymore.
So I think as a project manager, at the very, very minimum we have the responsibility of calling it out. And if there are things we cannot possibly do because we didn’t have the time or resources or money, which again is very common, document them. List them. Communicate them. Make sure your sponsors know about them. Make sure the team working on the next phase knows about them. It may come across as a little bit of passing the buck because I couldn’t deal with it, and there is going to be a little bit of that sense. But it is far better than shoving it under the rug and hoping and praying that the problem will happen on somebody else’s watch.
WENDY GROUNDS: At the moment, Te, what is the latest update on the 737 MAX? I believe it’s operational, and it’s flying?
TE WU: The Boeing 737 MAX is flying again. The FAA recertified the plane in November of 2020. And shortly afterward the Brazilian, Saudi Arabian, and European authorities followed. But the Chinese authority to this day still grounds the plane. So it is not fully airworthy around the world.
I often get a question from my students who ask, “Professor, would you sit in the new plane?” For all intents and purposes, I think this is probably the safest plane on the planet at the moment.
BILL YATES: True.
TE WU: Because if nothing else, the Europeans, the Brazilians, the Saudi Arabians reaffirmed the plane. Our friends in Canada, too. By the way, nobody trusts the FAA’s recommendation anymore, so every one of them did their own independent analysis. That plane has been audited to death at this point.
BILL YATES: Yeah.
TE WU: So it is probably one of the safest planes in the air.
BILL YATES: That’s true.
TE WU: For that reason.
BILL YATES: There was a news headline I saw from April 12th, 2021, about the 737 MAX being back in service. Also, I thought this was interesting: I saw a headline from June 2nd, 2021, that said “FAA safety official retires.” This person had been criticized over the Boeing jet. So the person who had led the FAA’s Aviation Safety Office since 2017 and oversaw the approval of Boeing’s planes was retiring.
WENDY GROUNDS: Te, thank you. This has been a very sobering lesson that we can learn from. And, you know, we appreciate the research that you’ve put into this tragedy. I think there have been a lot of good lessons from it. You know, one way we can look at it is as an interesting story. But it’s a tragedy as well, and it’s something that we hope people can look at and say, I need to consider duty of care and consider my obligations as a project manager so that things like this don’t happen again.
TE WU: Well, thank you so much. And thank you for inviting me. You know, the big takeaway for project managers, to me, is that there are three things project managers have to do all the time. One, be realistic. Be realistic about time, money, resources, what it takes. You know, I always joke that you can’t make a pitcher of orange juice if you’re only given one or two fresh oranges. It doesn’t matter how hard you squeeze those oranges, you’re not getting that pitcher. And one of a project manager’s first jobs is just to be realistic. They shouldn’t inflate; they shouldn’t deflate. Support it with facts, evidence, experiences, stories. And justify it. So that’s number one. Just be realistic.
Number two, be truthful. In other words, look at the data. Look at the numbers. Look at what the science says. At least be truthful to yourself and your team. Because I know there’s a lot of politics in organizations, and you may have to phrase things so they’re acceptable to another party. But underpinning that, there still has to be that layer of truth.
And the third is be holistic, something I alluded to before. Projects are artificial creations. We put the boundaries on projects. As a matter of fact, I tend to use the words “initiative” and “project” to mark that difference. Initiatives are endeavors that have not yet solidified their boundaries. So an initiative is still a big mixed stew. But projects are endeavors that have been well defined. That doesn’t mean they can’t change. But at least you have put a stake in the ground, a line in the sand, saying that’s what the project is. Just make sure that the boundaries you set reflect holistic thinking. It’s not just about delivering that deliverable per se. What are the benefits you’re trying to achieve, and what is the impact on the people using them?
To me, if project managers could think about these three aspects – be realistic, be fact-driven, and be truthful about the data, at least to one’s self and team – and also be a little bit more holistic, we could solve so many problems upfront and head off the downstream challenges.
WENDY GROUNDS: If our listeners would like to reach out to you and talk a little more or hear more about your work, how can they reach you?
TE WU: Thank you for asking. They can send me an email. My email is email@example.com. So firstname.lastname@example.org.
BILL YATES: Thank you so much. We really appreciate it. And thanks for putting the 737 MAX story in a project context. Really appreciate it. Good stuff.
WENDY GROUNDS: That’s it for us here on Manage This. Thank you for joining us. You can visit us at Velociteach.com, where you can also subscribe to this podcast and see a complete transcript of the show. You also just earned Professional Development Units by listening to this podcast. To claim them, go to Velociteach.com. Choose Manage This Podcast from the top of the page, and click the button that says Claim PDUs, and follow through the steps. Until next time, keep calm and Manage This.
Very good podcast! It brought back memories of when I was involved in the redesign of the Space Shuttle Solid Rocket Motors following the Challenger incident. It seems that all, or most, major disasters can be boiled down to that “perfect storm” of small decisions.
Interesting podcast – having a background in human factors, it’s always interesting to look at these perfect storms and wonder how we got there. It won’t be the last one we run into, but hopefully it’s the last plane-design perfect storm.
Thank you, Dr. Te Wu; this was a great and interesting podcast. I can only imagine the amount of research that went into these cases. I also agree with being smart and logical about projects, and staying away from being reckless.
Nice podcast! For me, the biggest takeaway is executing an effective handover process at multiple points during the project!