The 85th General Meeting Feature Presentation
James R. Chiles

Telltale Signs of Risky Behavior

The following presentation was delivered at the 85th General Meeting Monday afternoon session, May 9. It has been edited for content and phrasing.

Introduction:

James R. Chiles addressed the General Session in 2011 and 2014, and he is a regular contributor to the National Board BULLETIN, where he provides excellent insight into the world of technology and history. Mr. Chiles has written extensively about these two topics since 1979. What distinguishes him from other technology and history writers is his approach to research. His access to tragedy sites has provided readers with profound insight into the internal dynamics of disasters.

Mr. Chiles is the author of Inviting Disaster: Lessons from the Edge of Technology and The God Machine: From Boomerangs to Black Hawks - The Story of the Helicopter. His work has appeared in such prestigious publications as Smithsonian, Air & Space, and Aviation Week. Additionally, Mr. Chiles has served as a commentator for a television series based on his book, “Inviting Disaster.” He has also appeared on History Channel programs such as “Titanic at 100,” “Katrina: An American Catastrophe,” “Engineering Disasters,” “Life after People,” “Wild West Tech,” and “Megadisasters,” as well as on the National Geographic series “Seconds from Disaster.”

Mr. Chiles' slide presentation can be accessed here.

Mr. Chiles:

I want to thank you so much. When I started out, first writing for The BULLETIN and then speaking at your annual meeting, I thought it was a chance for me to pass on messages to you, and I hope that I have. I have had the chance to come back to you again and write regularly for The BULLETIN, but what has surprised me is how much of your message I have absorbed in return.

This talk is in many ways based on an address I gave in January to the National Transportation Safety Board, the organization that investigates major transportation accidents. In my talk today, “Telltale Signs of Risky Behavior,” I'll talk about a particular personality type I've named the Narcissistic Risk-Taking Leader, or NRTL. And that leads me to your week’s theme, Safety: Standing between You and Disaster. What is the face of safety? The answer is that it has many faces: it's a team, it's a group.

In all the cases I have studied, I really can't think of a single incident where one person made all the difference. Now, yes, there are heroes you hear about, like Chesley Sullenberger. But he had a team behind him, and so did the following people.

Moira Smith, an NYPD officer, saved hundreds of lives on 9/11 by urging people to keep moving and not get paralyzed by the sight at the base of the South Tower. I wrote about another case in my book, where a DC-10 very nearly crashed in 1972 because of a bad design that led to a cargo door blowing out over Windsor, Ontario. Bryce McCormick had trained himself to handle the aircraft with hardly any control at all. He basically had control over two engines and a little bit of the elevator; not the rudder, which was jammed over, and not the ailerons. He had to land this DC-10 at Detroit Metro Airport, with an amazing skill level he had taught himself on a very primitive simulator.

But the point I want to make is that it wasn't just him. He also had a first officer by the name of Paige Whitney, who saved the day in his own right. Once they landed, because the rudder was jammed over, the aircraft was heading for the fire station at Detroit Metro, and McCormick froze up. What irony: an airplane makes it to the ground in miraculous fashion, and it's about to crash into the fire station. Well, Paige Whitney, in an example of teamwork, used the thrust levers to steer the jet away from the fire station and roll to a stop. In fact, the plane flew again. How many times has that happened, that a plane was actually salvageable because a fellow knew how to use the engines to fly it when nobody else had ever done that before? It was teamwork, a high-reliability organization in action, in June of 1972. Teamwork stood between disaster and the 67 passengers and six crew members, what the NTSB calls 73 souls on board.

I try to have a positive view in my work. I found when working with TV that you just can't get those guys to think positively. They are always looking for that emotional side: the fear, the suspense. At each commercial break you’ve got to have suspense and fear and people to blame. And that's one thing I can do when I am writing for The BULLETIN and speaking here: I can give the positive side. General Robert M. Littlejohn was a positive fellow. He said one of the worst jobs in World War II was getting rid of the mountain of surplus. Through everything he did, he never complained. He went into that job and came out of it with the notion that "There ain't no hold that can't be broke." And essentially today we might say there is no wicked problem that can't be solved. I really believe that. We may not like the solution, but problems are all solvable.

My book, Inviting Disaster, is still in print after 14 printings. It describes 13 common patterns, crossing time and technology, in which error, bad design, fatigue, and poor leadership combine to fracture complex, high-energy systems in startling ways. I call this the system fracture: problems start building up and building up, and nobody does anything, and then the system finally breaks. Knowing these patterns and having a team in place to spot them can help with prevention, with crisis management, and with what Karl Weick called vu jà dé, meaning a situation where you feel no one has ever been before. I have certainly read about such situations and talked to people who were in them, and they say it is eminently scary.

One example of people experiencing vu jà dé comes from Chernobyl. The 30th anniversary of that accident was this year, and people recounted stories of how they just could not believe what they were seeing, and not just seeing, but feeling, like the floor shaking and the roof falling in. Very powerful systems can do this.

It's a real problem when failures add up; when there are many little cracks and nobody steps in. This is a picture of the Silver Bridge. There were plenty of cracks building up in the support bars, but nobody caught them, and the bridge collapsed. The good thing, and remember, we are trying to get at the positive side of this, is that there is a thing called the crack stopper, and I have a few examples from the book. A “crack-stopping” organization knows how to catch a system fracture early. There are no perfect systems; there are always flaws. The point is spotting them and having people react in time.

And here is an actual real-world example of a crack stopper. As the electrical engineers among you know, one of the advantages of direct current is that you can tie together alternating current systems, like the Western and Eastern grids, with a direct current intertie, and it will not pass those shocks along. If any of you remember the Great Blackout of 1965, the reason it spread was that there weren't any DC interties, and a little problem just grew and grew. One of the solutions, a crack stopper, is to use these direct current interties. There are some in California and the West. And Homeland Security would rather not say where they are, because they are so critical to the grid. If you know what to look for, they do look different.

Red flags have a long history, about as long as Hartford Steam Boiler, in fact. This is a picture of Casey Jones in 1900, a man of fame and history, who ran his Illinois Central locomotive into the back of another train while trying to make the schedule the railroad required. He apparently ran over some torpedoes on the track, which were set there to explode and warn him that the tail end of a train was blocking the track ahead. But he bravely stuck to his controls, tried to brake, and was killed in the process.

That idea of red flags and warnings is very much alive in the railroad industry: there are torpedoes, red lanterns, and red flags. The idea of a telltale in an old-fashioned locomotive boiler was that hollow staybolts would spray steam into the firebox when they cracked, and fusible plugs would let go to show a low-water condition; it was like a backup to a pressure gauge. Basically, when the engineer and the fireman saw the firebox filling with steam, they knew they had a problem, a form of red flag.

Let me share a few samples of red flags before I get into my main case study. One is undercapitalized banks; those in New York that are heavily dependent on taxi medallions have that problem. Regulators regard undercapitalization as a red flag. Another is money laundering. Under federal regulations, cash transactions over $10,000 have to be reported, and now even people moving less than that are caught through red-flag methods.

A third example is medical fraud; there is a large literature on medical fraud. There are also procurement schemes against the government and, as you know from your respective states, jumpered alarms and bypassed safeties. There is a lot of malfeasance out there, people cutting corners for various reasons. Red flags are really useful, and people in small business should know them, because embezzlers leave the same trail of red flags pretty much every time.

If you have ever been in a small business and had a very loyal bookkeeper who never asked for vacations and worked late at night, that's a problem. It doesn't mean they are guilty. See, that's the point about red flags: they are not a conclusion. When you notice red flags, it means you need to start looking. This is very much what Mark Masters has been talking about: things you need to pursue or investigate as a form of due diligence. But the bookkeeper who never takes vacations, won't do bank reconciliations, hates audits, and gets very irritated when somebody is going to take their place is showing bright red flags.

Other well-known red flags can involve “extrasensory inspection,” in which inspectors fake inspections. One of the tipoffs is that their reports have very vague descriptions, with lots of photographs and very little narrative. They will say there are a few problems, but nothing specific. They perform what’s called the drive-by inspection, doing something like 30 inspections a day. Well, that's a pretty good tipoff: how are you going to visit 30 places in one day?

Regarding the paper-fakers, and to emphasize what Mark Masters said: a very common pattern in these red-flag cases is the middleman, who uses bad record-keeping to cover up the scheme.

But the earlier you can close those gaps, the better. People have to know ahead of time, even before they join the organization, that you have all kinds of firewalls set up, and then it probably won't even happen, because they will avoid you: “Ah, these guys are no fun; they have all of these precautions against us embezzlers.”

In 1855, Billy Hamilton, a classic narcissistic risk-taking leader, was an engineer on the steamboat Fanny Harris, and he would consistently run the pressure well over the red line, because he loved to race and pull practical jokes. He would literally endanger people, throwing wrenches at anyone who tried to stop him from exceeding the pressure limits set by the federal Steamboat Inspection Service.

But while it’s easy to name and blame a villain, the more important question is: who hired and supervised him? Take Captain Schettino of the Costa Concordia, for example. His trial is not yet quite finalized. But the key question is who hired and supervised him. Who put him in that place? Did that person fail to act on early signs of poor judgment, and if so, why? To me that's the most important thing.

On that note, let’s meet the Narcissistic Risk-Taking Leader. The NRTL will lead his team into risky settings for no good reason. He’ll say it’s good for morale, and he likes to cheer with them. This is important, because they feed his ego. He insists he’s an expert, so the rules don’t apply to him. And he’s rude to subordinates who challenge or question him.

Fairchild Air Force Base had such a leader. In 1991, the Cold War ended for USAF Strategic Air Command, and that had a big impact. Bases like Fairchild AFB were changing or closing, and there was high officer turnover. Fairchild got the news that it was going to move over to tankers; it would no longer fly B-52s. But despite that, hotshot pilot Bud Holland, who had been a B-52 bomber pilot during the Cold War, still got to strut his stuff in a B-52. He was really good at handling B-52s, but he lacked judgment, meaning the big picture: Why am I doing this? Who is it going to affect? That's the judgment part of airmanship, and Holland completely lost it. And the inexplicable thing is that the Air Force had made him Chief of Standardization and Evaluation for the 92nd Bomb Wing. He was the safety officer for this wing, and he was an extremely risk-taking, dangerous guy.

Holland wanted to roll a B-52. It was totally illegal, and it had never been done before. From 1991 to 1994, commanders kept changing the rules to allow him to fly certain otherwise-banned maneuvers in air shows, and he kept breaking even those rules. Crew members began to protest: “I will never fly with this fellow. I would sooner die. In fact, I will die if I fly with Bud Holland.” But their protests had no effect, and the crewmen lost trust and fell into angry fatalism: “Just wait, you’ll see.”

For example, in March of 1994, Holland was flying as low as three to five feet over a ridge on the Yakima range for photos. Three to five feet, and nothing happened. And what’s worse, he was inspiring other pilots to do the same thing even though they didn't have his skill.

A few months later, in June of 1994, he was rated as okay to fly in an air show. Only one man really tried to stop him: Mark McGeehan. But the commanders wouldn’t let him stop Holland, so McGeehan insisted on flying every time Bud Holland did an air show, because he wanted to be there to try to keep the plane from crashing.

Holland was pilot in command of Czar 52, with four souls on board. It was June 24, 1994, during a second practice for the June 26 air show. He'd made a couple of runs around Fairchild and then made a missed approach, but they didn't do a standard climb-out. Bud Holland was at the controls and started a steep left turn. He broke all the rules about such dramatic maneuvers, the altitude, the airspeed, the bank angle; he broke all of those things. But he still got to do what he wanted. First, instead of 30 degrees of bank, he went to 60 degrees, and then 95 degrees, meaning the plane was partly upside down. He was trying to fly around the tower when he hit a line. He crashed, and he came very close to killing a couple hundred people who were in a training session nearby.

Holland’s maneuvers, although they would never have been used in the Cold War, grew out of a culture of going in and blowing up Soviet bases. The culture was: get there and drop the bomb, and you probably won't make it back. If you are curious about this, there is a really good article about whether these traits are hard-wired or bred by circumstance called "Why Men Take Chances," and there is a book called Beyond the Blue Mountain. The blue mountain is a key concept for special-weapons crews and special operators, such as the Special Air Service, the Navy SEALs, and the Green Berets. It is a story based upon a poem, the "Road to Samarkand," and it's very much about going where no one has ever gone before: this is how I live, this is what I do. That's part of the culture called going beyond the blue mountain.

There are plenty of NRTLs out there, and if you look at the overall culture, it will tell you a lot. If you hear people telling admiring stories about someone like Bud Holland, that's a problem. Managers struggle to find and supervise people for boring but critical jobs involving powerful vehicles. It’s a culture where no one enforces what Admiral Hyman Rickover called the “discipline of technology,” where people love to hear the escapades of somebody bigger than life. These people feel their job is boring, and you will find them reading on their smartphones instead of operating the train.

And another example is highmarking: people on snowmobiles trying to cut a track higher and higher up a slope, higher than anyone else. Over two-thirds of all snowmobile fatalities in North America are due to highmarking, because riders trigger avalanches way up on the slope. Another example is street racing, which puts all sorts of people in danger.

In closing, I would encourage organizations to have green flags: a safety-centered system. These cultures have Andon (full-stop) tools in place, and they are used. They follow the “Golden Rule” crew-matching principles that came from some excellent work by the Swiss Re Centre for Global Dialogue in a document called "Golden Rules of Group Interaction in High-Risk Environments." Tribal memories in these safety cultures are vivid and passed down. And these green-flag leaders are not afraid of bad news; they actually welcome it and then follow up quickly. I thank you for your time. It's been a pleasure to dig more deeply into these subjects, and I thank you very much.