Black and White

The Hogwarts School of Management


Susanne Ramharter, MSc

A Few Principles of The Hogwarts School . . .

Unshakable faith in the Fairy Godmother: 

"We place absolute confidence in the Titanic. We believe the boat is unsinkable." (White Star Line Vice President P.A.S. Franklin - by the time Franklin spoke those words, the Titanic was at the bottom of the sea)

Management by Magic: 
"I come here today confident that we have turned the corner on this merger." (Carly Fiorina on Feb. 4, 2002, on the HP-Compaq merger –– on Feb. 9, 2005, Fiorina was ousted by the HP board)

The Green Lantern strikes again: 

"You only have to kick in the door and the whole rotten structure will come crashing down." (Hitler on invading the Soviet Union in the summer of 1941 - we all know how that turned out)

The Abracadabra of War: 

"I am absolutely certain that, whereas in 1965 the enemy was winning, today he is certainly losing. The enemy's hopes are bankrupt." (General Westmoreland in a speech to the National Press Club on November 21, 1967 –– with no clue that the Tet Offensive would start in January 1968)

Twinkle Twinkle Little Star: 

"The new division will be established by February 2009." (The COO of a large Austrian bank, in December 2008 –– fully believing that the Corporate Project Elves would conveniently remove all obstacles: buy, install, and implement new hardware and software; shift hundreds of people to new jobs; design, implement, and train new procedures; etc. - which, amazingly, the elves could not do)

Magical Thinking

    So, what do Franklin, Fiorina, Hitler, Westmoreland and the COO have in common (apart from the fact that they were all wrong)? Magical Thinking, that’s what.

    Magical Thinking is the belief or conviction that one’s actions, words or thoughts can have a causal impact beyond normal cause and effect. In the examples above, it is the assumption that ‘because I wish it, it will be so,’ completely independent of such unpleasant details as context or facts. 

    One of the earliest subscribers to this notion was the Greek mythical figure of Icarus, son of Daedalus. To escape Crete, Daedalus had made wings of feathers and wax for himself and his son, and he warned Icarus to fly neither too low nor too high in order to effect a successful escape. But once off in the air, poor Icarus became so enamored of his own abilities––a mortal leaving earth and actually flying, just like the gods––that he believed the normal rules no longer applied. Up he flew. And up. He wanted to see the Sun (Apollo). And finally, when the wax holding his wings together melted, he crashed, burned, and ultimately drowned. Magical Thinking may sound like an amusing description of behavior, but it can have serious, sobering, even deadly results.

    Now, neither our COO nor the other people mentioned above had the naïveté of Icarus. Each, in their own way, was or is a person of high intelligence, one we would probably consider perfectly rational. But with the advantage of hindsight, knowing that they were all more or less spectacularly wrong, we have to ask: “What were they thinking?”

    As you may have guessed by now, the answer is that they were applying Magical Thinking to real-world events. Of course, not one of them ever consciously thought of the Tooth Fairy, the Green Lantern, or Magical Elves. Each and every one of them was certainly fully convinced that their course of action was rational, well thought out, and intelligent. And yet they all succumbed to Magical Thinking. Why?

    Normally in psychology, the concept of Magical Thinking is applied to children. Picture an infant playing Peek-a-Boo: when children hold their hands over their eyes, they truly believe they are invisible. This is the first of many reality-testing rituals that emerge in childhood development. The beliefs that the Easter Bunny, Santa Claus, the Fairy Godmother, etc., can effect desired outcomes are natural extensions of this very early ritual––beliefs which, alas, the child must eventually adjust to the facts of life. Those facts may not bring a pony, but they do show how one might actually organize one’s life to get a real pony in the future.

    But in this paper we are talking about seemingly rational, intelligent adults, not children! I would argue that the mechanisms at work are nonetheless similar. In both cases, the individual, whether child or adult, faces a situation of real need: the need to have, to be, or to achieve. There is a goal to be attained, for whatever reason. Facts and reality may suggest that the goal cannot be achieved, but it appears (see the examples above, straight out of the magic hat) that rather than face those facts, many adults prefer to relieve their anxiety through Magical Thinking. Such people may get quite far in life in terms of worldly power, but––as we shall see––when they fall back on what is easier to believe rather than on what is true, disaster follows. In my observation, Magical Thinking in adults is usually the result of Cognitive Biases, Hubris, or––most often––both.

Cognitive Biases

    It is a well-known fact in psychology that most humans are not very good at dealing with ambiguity. Our brains are simply not wired to hold two or more opposing ideas efficiently at the same time. It is also well known, and shown by numerous studies, that given one thought that confirms our beliefs and facts that contradict them, we will stick with our beliefs almost every single time. This is the core of the concept of ‘Cognitive Dissonance,’ first studied and defined by Leon Festinger and wonderfully explained by Carol Tavris and Elliot Aronson in their book, Mistakes Were Made (but Not by Me) (see Appendix: Further Reading).

    Cognitive Dissonance is but one of a number of mental shortcuts, or heuristics, that our brains have evolved to deal with complexity. Heuristics are quite necessary for dealing with reality efficiently. Just imagine the time it would take to deal with the following situation if we had no heuristics: 

    You are driving a car and coming up to an intersection with a red light. Normally you don’t even think about this: you brake and stop the car, either behind the car in front of you or at the intersection. But if your mind did not use heuristics, you would have to (a) see the red light, (b) think about what it means, (c) decide to stop, (d) decide to put your right foot on the brake (possibly shifting down with a manual transmission), (e) judge the distance to the intersection or the car in front, (f) monitor and control your braking, (g) monitor your gears, and (h) bring the car to a stop.

    You see what I mean. If you had to do all of that considering, judging, and deciding one step at a time, you would probably end up either stuck in the intersection (with much loud honking going on around you) or crashed into the back of the car in front of you. So heuristics and the ability to make instantaneous decisions are a good thing, right?

    It turns out the answer is yes and no. As useful as these heuristics are for letting our minds quickly sort incoming data into recognized patterns and respond accordingly, they can also harden into Biases. We have dealt with the red light at the intersection so many times in the same way that we have developed a “bias.” We do not notice that THIS time there is a new factor: maybe a baby carriage, a child running after a ball, something out of the ordinary. Heuristics enable quick decisions, ‘snap judgments,’ rather like an automatic knee-jerk reaction in the brain that decides “if A, then B.” But these automatic responses can lead us into trouble, as the people in our examples have amply shown: the decisions we wind up making are often not based on the new reality (A may look like an A, but it is not always a true A). To achieve truly rational thought, we have to “wake up” in those situations where the heuristics do not apply, rather than continue with these (wrong) Biases and Magical Thoughts.

    Consider just a few of the Biases we are all subject to:

    • Confirmation Bias:

    We tend to pay more attention to all information or data that supports our views, and downplay everything else.

    General Westmoreland had reports from his own commanders telling him that the American offensives in Vietnam were not working and expressing serious doubts about the possibility of victory, but they did not fit with his own somewhat bigoted view of Asians, his faith in American supremacy, and his need to be a winner –– so the reports were ignored. In German we have a wonderful saying from the poet Christian Morgenstern: “Weil, so schließt er messerscharf / Nicht sein kann, was nicht sein darf.” In English: “That which must not be, cannot be.”

    • Attribution Theory: 

    If you fail, it’s because you are incompetent and it’s all your fault; whereas if I fail, it’s because of circumstances, so it’s not my fault. Hitler was convinced, in his own messianic (i.e., crazy) way, that he was always and absolutely correct. If things did not go as planned, it was because his staff, commanders, and soldiers were idiots, slackers, or worse, traitors. For a great example, see the original version of the Hitler Rant on YouTube. As an aside: I have observed that this is exactly what many corporate leaders do when their pet mergers, projects, or products fail. It is rarely because they themselves had a bad idea or did not do enough due diligence for the execution; it is because their incompetent, recalcitrant, and defiant staff blew it.

    • Overconfidence Bias

    While it is true that to take big risks, make substantial changes, or even climb the corporate ladder one must be confident, the Overconfidence Bias leads to a form of tunnel vision: it is simply inconceivable that one could be wrong. Perhaps no one has pointed out your errors in so long a time that you have forgotten that being wrong is possible! All of the people in the examples above succumbed to this bias to a certain degree, as, in my experience, do many high-level managers. Going back to the examples: the Titanic had 2,223 people on board but room for only 1,178 in the lifeboats––of which there were only 20, instead of the 64 originally planned. Obviously, there was a serious case of overconfidence at work.

    The Overconfidence Bias is closely related to the Dunning-Kruger effect, the finding that people generally overestimate their own capabilities, particularly when they lack even the skills to tell competence from incompetence in a given field. Why else would more than 90% of drivers rate their own skills as ‘above average’?

    Which leads to one of my favorite sayings by the great Bertrand Russell: “The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt.”

    • Availability Heuristic: 

    We put more faith in information that is easily available, most often from memory, but we also use the information most readily supplied by our immediate surroundings. This heuristic, often helpful, makes the source and quality of feedback extremely important. Managers who love their “team players” will most likely get their own ideas and wishes reflected back to them as “feedback” and “input,” hiding the fact that the “feedback” never originated with the team at all, but in the manager’s own head. Oh, what a vicious cycle!

    • Sunk Costs Fallacy  

    Finally, a heuristic that has probably cost more lives and money than all the others: the Sunk Costs Fallacy. The reasoning here is that ‘we’ve thrown so much time, money, etc. into this venture, we can’t just write that off.’ As a longtime manager responsible for various change programs and projects, I’ve seen way too many resources and way too much money thrown at doomed undertakings, sometimes over years, only to have them finally die a quiet and ignoble death, all because no one could say earlier on, “You know what? Maybe we shouldn’t have done this to begin with.” One of the best examples of this bias is the Vietnam War (later to be followed by Afghanistan and Iraq): how could the tremendous number of lives lost, the pain and suffering, and the damage to the economy already ‘spent’ be justified if one just stopped? That same reasoning (or rather, fear) is what keeps corporate boards shoveling good money after bad, managers sticking with processes that don’t work, and bad marriages stumbling along.

    • Optimism Bias, or the Valence Effect of Prediction

    The tendency to overestimate the probability of good things happening. We simply hate failure, so, together with the Confirmation Bias, this is probably the core element of Magical Thinking.

    What makes a Bias so difficult to combat, and such fertile ground for Magical Thinking, is its intrinsic nature as a heuristic or mental shortcut applied intuitively (or unconsciously). In the speech I quoted above, Carly Fiorina actually acknowledges that many mergers fail; in fact, she devotes a good section of her speech to such failures. Yet she was convinced that hers would succeed. This is an example not only of Bias, but also of Hubris.


Hubris

    Hubris goes back to the ancient Greek word hybris (ὕβρις), meaning pride or arrogance. Many of the Greek myths dealt with this condition, the aforementioned tale of Icarus daring to compare himself with the god Apollo being one such example. Pride and arrogance, or Hubris, are in fact the second major cause of Magical Thinking. In short: ‘I will it, therefore so be it.’ Never mind that it may not be realistically possible because of constraints of time, energy, manpower, resources, or external forces. I will it. How magical is that?

    The internet is full of stories about hubris, and not just from Greek mythology or the madmen of 20th-century politics such as Hitler and Stalin. Just google ‘failed mergers,’ ‘worst CEOs,’ or even ‘examples of hubris’ and you will find thousands of examples. Again, many of these people were or are intelligent––they had to be to reach such stellar positions. So, even taking into account biases such as the Confirmation Bias or the Overconfidence Bias, what is wrong with these people?


    They can’t all be flaming sociopaths, can they?

    It is interesting to speculate, but most likely the majority are not sociopaths. Nevertheless, consider this: no matter how low on the ladder, a person in a position of any authority is a step above someone else. You may have a company car with a dedicated parking space, or even a driver, a corner office, your own personal assistant, etc.––or maybe only small versions thereof. But any and all such conveniences and symbols serve to demonstrate that you are more important than others. Your time is more valuable, your opinions matter, and, as the conveniences and symbols show, the rules that apply to the common folk simply do not apply to you.

    As many, many studies have shown, power corrupts. Even a little bit of power, even imagined power, can alter normal behavior. So it’s no wonder that people in positions of authority can become overly proud and arrogant - remember, Attribution Theory tells them that they achieved their success not through cronyism, or luck, or having the right contacts or degree, but because they are simply better people. And because they are better, they are invariably right, and the rules for the little people do not apply to them.

    What else could explain why so many politicians (even politicians of small villages), corporate leaders, department heads, and other figures of some degree of public prominence get caught in adultery, corruption, and other unappetizing affairs? They believe that it is their right to behave in such ways, that they deserve it, and they just can’t imagine being caught and judged by plebeian standards. If that isn’t Magical Thinking, I don’t know what is.

    But, to be fair, it is not only the outward trappings of power that lead to hubris. The sycophants play a major role. The ‘team players,’ the good ol’ boys, the networks, the hangers-on and, yes, let’s be clear, the brown-nosers are as important as the visible symbols of success in fostering hubristic behavior in leaders. In a meeting with the board, what middle manager has the guts to stand up and tell the emperor that he is naked? It is difficult to lay blame when the courage must come from those who could lose their jobs by speaking truth to power, yet such lack of courage can deprive a powerful person of the very facts that might save many third parties’ lives (as the example of the Titanic shows). This is how the Confirmation and Overconfidence Biases are multiplied in strength by our refusal to recognize them: top managers fail to build into their systems ways of hearing opposing views, and support staff fail to recognize their ultimate responsibility to stick to the truth when lives are at stake.

    Interestingly, just as in the tale of the emperor’s new clothes it was a child who finally spoke the truth, so in meetings with the board it may well be a peon from the stockroom who shows more backbone than his immediate manager and tells the emperor just where the naked spots are. This is why middle managers may take the experts from their ranks along to meetings, but seldom let them actually speak.

    So, if no one in the emperor’s immediate surroundings ever has the spine to tell the truth, to say ‘nay’ instead of ‘Yes, Sir!’, how can we really blame the emperor for his delusions? It’s so much easier to go along, to say yes, to be part of the team, even if you know that what is being considered is wrong. Maybe there is the hope that when the crap hits the fan, you’ll come away without too much spatter; or maybe you hope you can explain away the ghastliness of the results by saying, “I was just obeying orders.”

    Even Robert Shiller, the Nobel Prize winner in Economics, who served on an advisory panel to the Fed between 1990 and 2004, said his warnings about the housing bubble were made gently and quietly because, “Deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.” Standing up and speaking out is dangerous and not good for careers. Full disclosure: I know this is true from personal experience, because that is exactly how I ruined mine. (And am I ever so happy now that I did the right thing!)

    Lest this all seem a bit too abstract, let me quote Stephen Denning, the acknowledged business guru, who says in his book, The Leader’s Guide to Radical Management,

    “A less obvious stumbling block is psychological. Managers are used to having comprehensive plans with Gantt charts and fixed delivery dates, often spelled out over several years. Such magisterial plans, however, can lead to magical thinking, with managers confusing plan with reality, viewing the world as an extension of their will, even losing their grasp on the real world as something independent.”

Antidotes to Biases, Hubris, and Magical Thinking

    Luckily, there are proven antidotes to Magical Thinking, and many wonderful people have published profound works about Biases, Hubris, and how to overcome them, whether from the standpoint of psychology, philosophy, business, behavioral economics, or even Buddhist philosophy. (For just a few of my personal favorites, please see the Appendix: Further Reading.)

    I will close with a wonderful quote, Bertrand Russell’s ‘Secular Ten Commandments,’ and my own mantra. First, the quote,

    “The antidote to hubris, to overweening pride, is irony, that capacity to discover and systematize ideas. Or, as Emerson insisted, the development of consciousness, consciousness, consciousness.” ~Ralph Ellison

Secular Ten Commandments

    Finally, Bertrand Russell said it best in his 1951 piece for the New York Times Magazine entitled, The Best Answer to Fanaticism: Liberalism, in a list also referred to as the ‘Secular Ten Commandments’:

    1. Do not feel absolutely certain of anything.
    2. Do not think it worth while to proceed by concealing evidence, for the evidence is sure to come to light.
    3. Never try to discourage thinking, for you are sure to succeed.
    4. When you meet with opposition, even if it should be from your husband or your children, endeavour to overcome it by argument and not by authority, for a victory dependent upon authority is unreal and illusory.
    5. Have no respect for the authority of others, for there are always contrary authorities to be found.
    6. Do not use power to suppress opinions you think pernicious, for if you do the opinions will suppress you.
    7. Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.
    8. Find more pleasure in intelligent dissent than in passive agreement, for, if you value intelligence as you should, the former implies a deeper agreement than the latter.
    9. Be scrupulously truthful, even if the truth is inconvenient, for it is more inconvenient when you try to conceal it.
    10. Do not feel envious of the happiness of those who live in a fool’s paradise, for only a fool will think that it is happiness.

    And my own mantra, very simple but not easy: Tread lightly.

Author info:

Susanne Ramharter is an academically trained Change Coach who originally studied art history, switched to IT management, and has spent most of her career managing large projects, teams, and profit centers in IT, outsourcing, and marketing. Her experience has taught her that the facts or the sequence of events are not always the root of success or failure; rather, understanding the stories about them and the relationships between those involved usually makes the most difference in finding the most creative way forward.

These insights led her to focus on understanding both the mindsets and narratives that influence the actions of corporations, their managers, and their staff, so as to help shape their growth. The Social Era and its possibilities are a great way to utilize and build on existing strengths and develop new ones. To that end, Susanne is very active on Google+ and LinkedIn as well as other social media platforms, and uses them to tell stories about art, business, mindsets, and just having fun. As part of the SynaptIQ+ Book Series published by S+™, Susanne and Meg Tufano are writing two books, one on Loyalty and the other on Value. She is a member of The SynaptIQ+ Think Tank and lives in Vienna, Austria.


About the Artist, Ria :

Ria Nieswaag (born 1950) lives and works in Delft, The Netherlands.

From 1984 to 1988 she trained as a Creative Therapist in Zeist. She combined her activities as creative therapist and artist until 1994, when she definitively chose the career of an artist. This decision led to a commission from Rijkswaterstaat (1998), the Dutch agency for public works and water management, on the occasion of its 200th anniversary. Queen Beatrix was present at the opening of the exhibition. A book of the paintings Ria made for the agency was published: ‘17 schilderijen, Nederland als kunstwerk’ (17 Paintings, the Netherlands as a Work of Art).

From 1995 onwards Ria has exhibited in distinguished galleries in the Netherlands as well as abroad (among others in Argentina and Brazil, and at MASC for the organisation ‘Paint a Future’).

In 2006 a small book of Ria’s paintings was published: Ria Nieswaag, Paintings 2005-2006 (a choice); ISBN-13: 978-90-810765-1-7.

From December 2008 to February 2009 an overview exhibition of Ria’s work was held in Museum ‘Het Prinsenhof’, the municipal museum of Delft. On this occasion the book ‘Spiegels van de ziel’ (Mirrors of the Soul) was published, containing an overview of paintings by Ria Nieswaag from 1994 to 2008; ISBN: 978-90-74063-38-8.