
The 88th General Meeting Speaker Presentation

“Timeless and Fearless: Durable Lessons in Engineering”

James R. Chiles

The following presentation was delivered at the 88th General Meeting Monday General Session, May 6, 2019. It has been edited for content and phrasing.

INTRODUCTION: James R. Chiles is a technology author and commentator. His written works have appeared in Smithsonian, Air & Space, the Boston Globe, Popular Science, Harvard, Aviation Week, Mechanical Engineering, and Invention & Technology.

He has appeared as a featured commentator for a History Channel television series based on his book, Inviting Disaster, and also appeared in History Channel programs Titanic at 100, Katrina: American Catastrophe, Engineering Disasters, Life after People, Wild West Tech, and Megadisasters. He also appeared on National Geographic’s Seconds from Disaster series.

Mr. Chiles is a regular contributor to the National Board BULLETIN. In his column, he shares lessons learned across many industries to illustrate the universality of safety issues and the importance of safety cultures in the workplace.

Mr. Chiles’ slide presentation can be accessed here.

MR. CHILES: Thank you so much. In fact, I owe a lot to Wendy White. She contacted me in 2010 and asked, “Would you like to write for this organization called the National Board?” At that time, I didn't know who they were. It turned out to be a perfect connection for me, which is why I'm so pleased to continue to write for them, although I'm neither a mechanical engineer nor an inspector. It gave me a chance to dig into what is really the first of the machine frontiers described in the book Inviting Disaster, which is steam power.

Nothing else in human history rivals what happened with steam, beginning with Denis Papin, Thomas Savery, and James Watt. It got bigger and more powerful and more complicated, and it really continues today.

So it's been a great connection for me to build on this idea of machine frontier and the human factors that go into it, the discipline of technology. 

I have the word "Fearless" in my title, and I would say fearless is not the same as reckless. Recklessness is a pattern I often see on the machine frontier, where people are heedless and move ahead without any thought of past lessons.

Here is an example. You heard about the Boeing 737 Max.  

I came across a book. I was not looking for it; it was highly proprietary to Boeing. It was called Design Criteria and Objectives, and it's supposed to be the book of wisdom of the Boeing Airplane Company. Why they didn't carry those lessons into the 737 Max is a mystery to me, because there had been problems before.

And here is the general principle I pass along to you, and I think it's relevant to the boiler industry: it's not unusual at all for problems with safety equipment to cause a disaster. That is why it's so important to do certification and testing.

Think of Three Mile Island. From what we know, the pilot-operated relief valve triggered at seven minutes at Three Mile Island, and although it wasn't really a human safety catastrophe, it certainly was an economic catastrophe. In the 737 Max, a safety device went wrong.

So the work that I did and continue to do talks about this machine frontier, things adding up. In fact, a quote from Three Mile Island comes back to me. A radiation safety inspector said, "It just seemed to go on surprise after surprise." 

A pattern that you see on the machine frontier is people who are not masters of that subject experiencing this terrifying thing called "vuja de," the feeling that no one has ever been there before. That is scary.

The Unreasonable Man.

Let me begin with this, picking up on something that Andy Andrews [keynote speaker at the 88th General Meeting] said: this idea that what is true is not necessarily the full truth.

Imagine you are near the Reflecting Pool in Washington, D.C., in 1953. There is a scrawny guy in his 40s picking up bottles with his son. He has a gunny sack, he's putting bottles in his bag, and he's going to get the five-cent redemption. Well, you might look at that guy and say go get a job, do something important. 

That was Admiral Hyman Rickover, who had grown up in great poverty and had come over from Poland. He was also an amazing contributor to America and to its technology and history. He had been so poor as a boy that he maintained those habits, and he wanted to pass them on to his son, Robert.

I think a good aspect of what Andy was talking about – when you add up all the little pieces, you get closer to the truth, but you never quite get there. 

Well, Rickover was an unreasonable man, and insisted that his people be unreasonable too. 

Iroquois Theater. So this is my point about sense-making, about knowing your world, about looking at the problems ahead of time and not being startled into this vuja de, this sense that the whole world has turned against you. That's how I describe vuja de. It's a phrase from Karl Weick.

The Iroquois Theater had rushed into production. It had rushed into putting on a show called "Mr. Bluebeard," because they wanted to get all these people into the theater in Chicago even though the theater wasn't quite finished. In fact, a lot of the fire equipment hadn't been installed.

But on December 30, 1903, at 3:15 p.m., a fire started and could not be stopped by the primitive, uncompleted equipment available. The asbestos curtain didn't work. Nothing worked. People opened the doors and fed the flames.

Six hundred people died, and that eventually led to an invention called the panic bar. It had been discussed beforehand, but the Iroquois Theater fire trapped so many people at the exits that it finally drove the panic bar's adoption. You see panic bars all over this building and every building you go into. You push on one; the door opens out and allows people to get out easily. They are not trapped by the doors.

Well, the thing that struck me in reading about the Iroquois Theater fire was a young man named Emil Plachecki, a German immigrant student. What is so remarkable about him is that when everybody else was, understandably, running for the exits, he instead ran for the men's bathroom on the third floor. He climbed up to a window with a sash cord and broke his way through a skylight. He was the only person to escape in anything like that way from the third floor of this fire that killed six hundred people. He was injured and burned, but it struck me that he had paid attention, before the fire, to what his options for escape were.

And I hear that again and again when I go to talk with people on the machine frontier. They are always thinking about an option B, an option C: how do we get out of here, how do we stabilize this situation?

And that's what I learned from the various near misses in airplane crashes – the best pilots stabilize the situation first, then they diagnose, and then they act. 

My wife always encourages me to put things in three words, and I will work on that later. But it would be to stabilize, then diagnose, and then act. 

Don't just start acting. It's a common thread in these things: people just push buttons, and then pretty soon even an expert doesn't know what the system is doing.

Ecclesiastes had this to say – and I love the Book of Ecclesiastes, but I don't agree with everything it said. I believe Solomon wrote this, "What has been done will be done again. There is nothing new under the sun." 

You may have seen this in the news. You have heard of the Florida panther story; this is a Florida turtle story: a Florida turtle crashed through a windshield on an interstate. Where do you find a Biblical analogy to that? The turtle was okay; it crawled away. The driver was okay; she drove away. The turtle had been flipped up by a car.

There are so many things on the technological frontier, the machine frontier, that have no historical analogies at all, and so we continue to struggle with that. And sometimes we are just surprised at tech. 

So let me pose this question to you. Which of these two events was more startling – the explosion of the atomic bombs over Hiroshima and Nagasaki, or the blackout of November 1965? I would argue that the blackout was the more startling in terms of what technology is doing. Do we really have control of this?

And I say that because people had predicted atomic bombs several times, including H. G. Wells in 1914, when he predicted an atomic hand grenade. So people were aware of this idea of atomic power. They didn't know quite how big it would be, but they knew something big was going to happen.

But when I researched technology history, I found that nobody expected this blackout would put fifty million people in the dark. Where did this come from? In fact, the best quote to me is from a radio disk jockey in New York who was broadcasting at the time the cycles per second began to drop. He noticed that a song being played, "Everyone's Gone to the Moon," just sounded so slow and dreary. He thought, “Gee, am I imagining this?”

Afterwards he said – and this is all recorded – "I had no idea something like this could happen."

And that was – very briefly explained – a safety device that had been misadjusted at the Sir Adam Beck II power plant near Niagara Falls. A relay had been set too low, so when enough power flowed through the line, the relay tripped unnecessarily and shut the line down. The power that could no longer make its way to Toronto eventually surged down into the U.S. and threw a bunch of power plants offline.

So we are in this present shock. Maybe in line with some of the things Andy was saying, this can be a bit scary, Cassandra-like. It looks very likely – at least the Department of Defense believes – that we will be relocating a lot of superstructure and infrastructure: rail lines, power plants, pipelines, and internet landings. You may know that Virginia is the prime site for internet landings around the world; cables land mostly in Virginia.

The population is moving. We already see political discussions from that. 

New levels of automation bring about emergent behavior. Has anyone heard that phrase, emergent behavior? It means the machine does something you don't expect at all.

And that's happened several times with airliners. Maybe it's the cause of the missing MH370. We still don't know. 

Emergent behavior is one of those vuja de things. Where did this come from? What is this machine doing? 

And, of course, we have media whiplash, where the public doesn't know who to trust. So we conclude that we just won't trust anybody. 

Meanwhile, the equipment and projects get bigger and bigger. I saw this when I went onto offshore rigs and to an offshore technology show. Imagine a crane that can lift fifteen thousand tons. That's not unusual in the offshore technology field. 

So that's one thing I tried to bring to people through this book. A lot of stuff that you don't see is out there and is affecting your world, but it may be in space or it may be out in the ocean. You just don't see it. A whole ship can be lifted up by another ship and moved around. 

Here is an example of things getting bigger and bigger. If you look out the window, you will see the Bingham Canyon Mine. Doesn't that look like a fake photo? It looks like Photoshop, but it's not. That is what was called a rock avalanche, not a landslide. It was so fast that it was called a rock avalanche, hitting speeds well over 120 miles an hour and sending 165 million tons into the bottom of that mine.

But it's really something to be proud of, because the company, Kennecott Copper, had suspected something like this would happen. They had five different kinds of instrumentation on that slope, because they knew it was somewhat at risk due to a seismic fault. Ahead of time, that morning at 11:30, they said everybody needed to clear away from the potential damage, and they knew where the risky part was. In the early evening there were two different rock avalanches. Quite a bit of equipment was damaged or destroyed, but nobody was injured.

Pretty impressive. It says something that humanity can cause a pretty sizeable seismic signature. It didn't slow them down that much. I just give that as an example; the scale is so astonishing, and we only hear about it now and then.

I would say the first case of uniform standards went right to the federal level rather than the states. A steamboat called the Lucy Walker blew up, and it started a chain of events that ended with the creation of the federal Steamboat Inspection Service in 1871.

And the worst explosion was that of the steamboat Sultana. Many of the men on the Sultana had survived the Andersonville prison camp. So here they were, burned or killed on the Sultana, due to terrible human error and to pushing things to the limit without knowing what they were doing.

But out of this tragedy – and Andy said this – we need to seize the moment. The moment is so important. The Sultana helped lead to the Steamboat Inspection Service, and it certainly built support for what HSB [The Hartford Steam Boiler Inspection and Insurance Company] had already started, which is the idea that we have to get control of these things.

And because the boats moved around between the rivers and the oceans, it was easier to make an immediate federal argument. But it wasn't easy in the sense that it took over thirty years to convince Congress that this was really necessary and wasn't an overreach, because many senators, including one from New Jersey, said this was a terrible overreaching of federal authority.

So you might wonder where this uniformity is leading. An example underway today – and a reason I think the present is as exciting as the past – is autonomous cars.

Those at this moment are struggling – or they should be struggling – to come to uniformity, because, unique among a lot of technologies, you have different brands of autonomous cars that will have to agree with each other and exchange information.

It's much more complicated than the fight between VHS and Sony Betamax where you could just keep buying things and eventually the market said let's put Betamax in a museum. 

Well, not with autonomous cars. They have to agree. And I will give you a quick example. In Minneapolis, where I live, there are two interstates that come together, and one of them is two lanes wide where they merge. Because there is no yield and no right of way, what are the cars going to do? What are the machines going to do, since the new autonomous cars won't have steering wheels? It's going to be all up to the algorithms, all up to the technology.

Right now, it looks like the states are leading the way and will be working that out. Eventually, though, I’d imagine the NHTSA would take over.

So this is the little-laboratories idea: states can have that function, but eventually you have to move toward uniformity. The states have a place, but it can get out of control. It can get too extreme. There can be bad ideas.

You saw this with the early traffic laws where some states wouldn't allow a car to come into their state unless it had already been registered. How are you going to go from state to state if you have to register first? 

Here is an example of where states can lead the way, and the feds will follow. Then I will give another example of where the states weren't necessary. 

This is a 1949 Nash. It was decided it would be the safe car, and it led the way with seatbelts. In 1961, Wisconsin – not surprisingly, since Nash was a manufacturer in Wisconsin at the time – said all cars sold in Wisconsin must have seatbelts. The whole issue wasn't resolved until 1984, when the feds said all new cars have to have seatbelts.

Now, this is an interesting case. I heard this when I was at a helicopter convention. It's such an amazing story that I had to check it out. You have two cars, both Chrysler LeBarons. They meet head-on at full speed in Culpeper, Virginia, on what's called Rural Route 640. The ambulance pulls up, and the guy hops out and looks at the crumpled wreckage, and he sees two people sitting on the road.

And he says, “Did somebody already haul these bodies away? I can't believe they called us, and they already hauled the bodies away.” 

And the people say, "No, there are no bodies."

This is not something to joke about. Nobody ever survives a head-on. 

Well, those two people on the berm were the drivers. Their names were Ron Woody and Priscilla Vansteelant, the first case of two airbag-equipped cars running into each other. What are the odds?  Probably one in ten million. And one of them didn't even have a seatbelt on. 

So in that case, Chrysler led the way, not the states, and later on, it became a federal mandate. 

So Rickover was big on this notion – this idea of having a signed inspection form so that you know who to go back to. You really shouldn't even bother if there isn't a signature or name on it. 

This is a form from a chimney sweep company. I like it because on the form, the customer says I received this on a certain date, and the inspector says I did this, and here is my signature, and here is the date. So if there were a fire, then you would know who to go to. 

So I tried to find a picture of Rickover smiling, and I couldn't. This picture is as close as I could find. He himself interviewed some of you over the years. I suppose he smiled sometimes. 

But he would have been proud. He spoke to the National Board in Baltimore, and the National Board asked him what's important, and he said, “One of the most important things is a signed inspection form.  There is no substitute.”

And so, of course, part of this is that there have to be inspectors. That was part of the problem with the Brazilian dam failures in 2015 and 2019. There were inspectors, but there weren't enough of them, and they weren't independent enough. Those were the Vale iron ore mine dam failures.

What I would boil critical thinking down to is this: ask questions. One of the op-eds I wrote for the National Board was on red flags.

You might wonder, how am I supposed to spot those red flags? They are everything from embezzlement to faked jumper cables. One of your inspectors told me how to spot the signs of jumper cables that are cutting out the alarms.

One of the ways you do that is – in fact, this is a CIA intelligence secret – you ask a series of questions. You keep digging down into the story, and that often will develop red flags. 

That’s also the secret to police interrogation – checking a person's story in a nice way, not throwing them against the wall. It's none of that. It's just probing down in that story to see if it holds up.  

And basically no spy's cover story can ever go that deep. That's how spies are usually uncovered.

So another three-word phrase is the Technology Readiness Level (TRL). If you would like to know more about that, I will be around later. A TRL often will tell you how solid an idea is, how much it's been tested. The Pentagon and NASA have used TRLs for quite a while, and they are really a great method. Maybe I will do an op-ed on that too.

So, winding up: I started out with less scary stories, and you might think the next hundred years will see all these things – artificial intelligence, the singularity, population shifts.

But one thing that struck me when I looked back in history, in 1938, there was a quote from a Dickinson College professor who said, "The young people of today, we are raising a hopeless and soft generation."

Well, people might say – and Andy pointed this out – gosh, I don't think so. That turned out to be the greatest generation, the exact people he was talking about.

So part of the lesson I would pass on, the anti-Cassandra, is these three words, "We will rise." Whatever that situation is, people are so adaptable. You learn from each other. Whenever that crisis or opportunity arises, we will rise to meet it. 

Thank you.