Given that people are thinking all the time, is there a right way and a wrong way to go about it? Here we examine some guides to more logical thinking. It can deliver us from manipulation, and lead to a happier and healthier life …

Next to breathing, thinking is arguably the most common of all human activities. We eat and sleep only at intervals; we walk or talk only part of the time. But as long as we are conscious, we think constantly. The ability to do so in the abstract – to have ideas – is what sets Homo sapiens apart from the rest of creation. Descartes spoke for an entire species when he wrote, “I think, therefore I am.”

According to Ralph Waldo Emerson, our very lives consist of what we are thinking all day long. Yet, considering how vital the mental processes are to human existence, it is remarkable how little is done to ensure that they are effectively exercised. We are told a great deal about what to think, but very little about how to think. This may be because most people regard thinking as something that comes naturally. They would no more seek instruction on how to use their minds than on how to use their veins.

But talking is a natural function too; and just as people can learn to express themselves more clearly, they can learn to think more clearly. The first recorded attempts to teach reasoning skills were made by the philosophers of ancient Greece. Aristotle, for one, propounded certain formal laws of logic. These have since been widely disputed, but they formed an indispensable starting-point for the study of how to think.

Aristotle’s work was carried on by scholars in the Middle Ages who developed a list of approaches to reasoning to be avoided. They called these fallacies – errors which have the deceptive appearance of making sense. They gave them Latin names which make them sound forbiddingly “intellectual.” In fact, fallacies are common in everyday life. We are liable to slip into fallacious reasoning, or have our own thinking affected by it, at any time.

Take the fallacy the medieval scholars called secundum quid, which is nothing more than what we today would call jumping to conclusions. We visit a strange town and see two men reeling about on the streets; from these two instances we conclude that the town is full of drunkards. “People in this town are very rude, too,” we think after being treated brusquely by the only local sales clerk we meet.

Gross over-generalizations like this may seem harmless, but they can lead to serious damage socially and politically. When applied to groups, they create misleading stereotypes. Two members of such-and-such a group are lazy and unreliable, therefore they are all lazy and unreliable; three members of another group are charged with stealing, therefore they are all criminals. There is a murder in an ethnic neighbourhood; we are frightened ever to go there, because everybody there is a potential murderer. It is from such crude labelling that vicious racial and sectarian prejudices arise.

A related over-generalization is the assumption that a localized and temporary opinion or sentiment represents a universal principle – that what we deem to be true here and now is going to be true everywhere and forever. This is often accompanied by the belief that what one deems to be good for oneself is good for the whole society.

Over-generalizations are the lazy man’s substitute for rigorous thought, and mental sloth may be the only explanation for how widely some of them are accepted. In the 1950s, the press critic A.J. Liebling summarized the American newspapers’ approach to foreign news this way: “Man go church, good man, no lie. Man not go church, man bad, lie. Communists bad, whatever they say lie.” Scores of millions of people went along unquestioningly with that mindless line.

Many American lives were then being ruined by the over-generalization which guided Senator Joseph McCarthy’s fanatical hunt for Communists: “If it waddles like a duck and it quacks like a duck, it must be a duck.” The Senator and his henchmen raised to a high art the fallacious theory of guilt by association. Smith had lunch with Jones, who once attended a meeting of a Communist front group. Therefore both Jones and Smith are Communists.

Making cock-eyed connections, and the fallacy of ‘you’re a fine one to talk’

Guilt by association incorporates erroneous correspondence, the assumption that a thing that has certain attributes in common with another will resemble it in all respects. If you were to believe that, you might also believe that whales, being mammals, can walk. It leads to the kind of thinking that ascribes a uniformity of opinion to every single member of a race, a religion, a sect, or a nation. Demagogues on personal power trips are only too happy to take advantage of this error to pretend that they speak for their entire group, which is of one monolithic mind.

Guilt by association also has elements of the fallacy in which ideas and things are mixed up with personalities. You may think, “That charity can’t be a good cause because the man who runs it is an egotistical publicity hound.” In fact, his lust for fame has nothing to do with his ability to run a charity, or with its worthiness. An awareness of this fallacy is handy in making judgments in politics, in which personalities are often confused with issues. You don’t like that politician’s appearance or his way of speaking. Therefore you reject his policies out of hand.

Personalities also come into play in what might be called the “you’re a fine one to talk” fallacy. Under its spell people may absolve themselves of their faults on the specious grounds that others are just as bad as they are, or worse. A wife says she wishes her husband would not leave his socks on the bedroom floor. He retorts: “Yeah? And what about the dent you put in the car?” The retort is irrelevant to the question. Such cock-eyed connections are often made in political debates, deliberately or otherwise. They can be fatal in business, in which the management that concludes “we’re no worse than anybody else” is courting bankruptcy.

Among the other fallacies that rest on irrelevancies is circulus in probando – arguing in circles. You can think in circles, too, without stating an argument aloud. Circular reasoning conveniently supplies its own authority. Someone might declare that Thackeray was a greater novelist than Dickens. Why? Because the most discerning critics say so. And who are the most discerning critics? Those discerning enough to discern that Thackeray was a greater novelist than Dickens, that’s who!

Thinking in circles often entails joining an intellectual herd charging round and round. Everybody thinks such and such; it must be so for the simple reason that everybody thinks that it is so. A variation of this is basing a conclusion on an unprovable assumption. Fowler’s Modern English Usage gives a grisly and ridiculous example: that fox hunting is not cruel because the fox enjoys the fun.

Baseless conclusions are sometimes palmed off as “self-evident truths.” The phrase is a contradiction in terms since the word “evident” implies the presence of signs that point unmistakably to a conclusion. The less verifiable the “self-evident truth,” the more fiercely those who subscribe to it will attack anyone who dares question it.

One tactic for defending a flawed piece of reasoning is to cite the endorsement of some prominent person or book. Of course, the validity of opinions does not necessarily depend on the fame or eminence of those who hold them. The principle applies equally to self-appointed gurus and impressive-sounding statistics, which can always be misinterpreted or deliberately skewed to support a certain cause. In his Guides to Straight Thinking, Stuart Chase quotes a sign in a British school that got to the heart of the matter: “The teacher could be wrong. Think for yourselves.”

‘It is better to know nothing than to know what ain’t so’

In our attempts to think for ourselves, we should refuse to be included in declarations that “everybody knows” something or other. Everybody else might indeed know it, but a critical thinker will withhold acknowledgement of a fact until it has been demonstrated satisfactorily. Similarly, if a speaker says that “most experts agree” on something, we have the right to ask: What experts? What precisely have they agreed on? Such challenges can be important because, as Bertrand Russell once observed, “Most of the greatest evils that man has inflicted on man have come through people feeling quite certain about something which, in fact, was false.”

“It is better to know nothing than to know what ain’t so,” Josh Billings wrote. But how are we to distinguish between what is so and what “ain’t”? Since people are always citing “the facts” to support their points of view, it helps to know what separates a fact from a mere notion. A few years back the California Department of Education defined a fact as “an understanding based on confirmed observations and inferences, and … subject to test or rejection.” No one can unilaterally create a fact to fit their opinions, feelings or prejudices, as people frequently try to do.

Facts are elusive at the best of times. The great Canadian explorer and writer Vilhjalmur Stefansson illustrated the point by telling of a man coming into a house and saying, “There is a red cow in the front yard.” Stefansson pondered the possibilities of error: “The observer may have confused the sex of the animal. Perhaps it was an ox. Or if not the sex, the age may have been misjudged, and it may have been a heifer. The man may have been colour-blind, and the cow … may not have been red. And even if it was a red cow, the dog may have seen her the instant our observer turned his back, and by the time he told us she was in the front yard, she may in reality have been vanishing in a cloud of dust down the road.”

Because information is so fallible, scientists take five steps in attempting to establish what qualifies as knowledge and what does not: (1) asking questions; (2) making observations; (3) reporting results; (4) answering questions arising from those results; (5) revising assumptions in the light of the answers. Even then, they do not look for certainties, but for high probabilities. A scientist will say, “The evidence supports this hypothesis.” He will not say: “This is the truth.”

You can use the five-step system in your own efforts to think more logically, and also to assess the thoughts of others. Can they stand up to questioning and review? Have the assumptions implicit in them been revised to take account of the latest developments?

Some fairly reliable signals exist to indicate when people are on shaky logical ground. One is that they will refuse to listen to contrary arguments or evidence that might spoil their hypotheses. If forced to listen, they are likely to treat contrary arguments or facts not as challenges to the validity of their conclusions, but as attacks on their probity or dignity. In the marginal notes to a speech, an old Member of Parliament is said to have written: “Weak point. Emote!”

High on the list of fallacious tricks of rhetoric is one called argumentum ad populum, meaning an argument appealing to popular passions. It can usually be spotted by the splashing around of emotive abstractions like honour, dignity, and pride. You can be reasonably sure that you are being exposed to this type of propaganda if the message is couched in simplistic unitary terms: there is one problem, one solution, one indisputable body of evidence. Either there is one monstrous enemy, or there are enemies everywhere. In either case, the enemies all have the same traits.

The ability to detect a fallacious argument is the critical thinker’s primary defence against demagoguery and brain-washing in advertising, politics and other public affairs. In a plea for the teaching of reasoning skills in grade schools, Toronto author and journalist Erna Paris wrote in The Globe and Mail: “Imagine a society in which children were taught to distinguish argument from emotion, and to evaluate information according to the quality of the evidence backing it up! We would still be faced with prejudice and a stubborn human unwillingness to see the other person’s point of view … But more of us would be equipped to resist the opinion manipulators, the weavers of superstition, and the propagandists with political or other agendas.”

How to avoid mistaking our impressions for the real thing

In the absence of such teaching, except in specialized courses in philosophy, ordinary people must rely largely on horse sense to ensure that they practise logic themselves and detect the lack of it in public discourse. There are, to be sure, a few books on the subject, and the larger encyclopedias have articles on logic describing the various fallacies and other intellectual tools. In broad terms, however, no one can go wrong by questioning all generalizations, looking for supporting evidence for every assertion made, and being on guard against extremes in thinking, whether in others or oneself.

On the personal side of the question, we would not be human if we did not occasionally allow our minds to go to extremes, if only when we are hurt or angry. The surest way to avoid extremes is to be aware of the danger of thinking in absolute terms.

Absolutism thrives on words like “is,” “are,” “be” and “am,” which lead people to confuse their interior judgments with exterior reality. “Statements such as ‘this picture is beautiful’ or ‘the outlook is good’ or ‘this steak is overcooked’ are not statements about the picture, the outlook or the steak, but the speaker’s reaction to them,” S.I. Hayakawa wrote in his Language in Thought and Action. People are naturally inclined to mistake their impression of a thing or event for the thing or event itself – to mistake the map for the territory. “But, of course, no one can get outside the limitations of one’s nervous system to see reality directly and absolutely objectively. If we could do this, we would never be fooled by magicians or optical illusions, and we would never misinterpret a situation.”

By avoiding “is” and other absolute words, you can clarify your thinking. The psychologist Dr. Albert Ellis once gave some examples of how much more precisely and completely thoughts are constructed if one abstains from forms of the verb “to be.” Instead of “John is lethargic and unhappy”: “John appears lethargic and unhappy in the office.” Instead of “John is bright and cheerful”: “John appears bright and cheerful at the beach.” Instead of “Mary is smart”: “Mary scored 160 on her IQ test.”

Absolutist thinking seems to be a natural product of western culture, with its black-and-white view of the universe. Our legal system decrees that a defendant is guilty or not guilty; we vote either for one candidate or another; all too often, we can see only two ways of doing things, a right way and a wrong way; we are inclined to divide our tastes crudely into what we like and do not like. In relations with people who are not of our own kind, we think in terms of “them and us.”

Absolute judgments tend to strengthen ‘the power of negative thinking’

Aristotle’s system of logic, which for centuries guided western thought, asserts that everything must be one thing or another. Like a light switch that is either on or off, it makes no allowance for degrees. This encourages what semanticists call “two-valued orientation.” A typical two-valued judgment might be, “He who is not with me is against me.” It does not contemplate the possibility that he could be with you on one issue and against you on another, or be with you at one time but against you at another when the circumstances have changed.

This all-or-nothing approach gives rise to childish judgments: “That is good, this is bad; they are right, the others are wrong; he is stupid, she is smart.” It establishes an intellectual regime of “allness” in which people falling into certain categories are all deemed to think, feel or act in the same stereotypical way.

“Allness” can also affect one’s thinking about oneself, as in, “They are all against me.” It is associated with a lot of other absolute words: “Nothing ever goes right for me. I’ll always be a failure. I never make any progress. Everything is falling apart for me. And nobody cares. Everybody is out for himself these days.”

Absolutist thinking tends to reinforce “the power of negative thinking” because it sets up unrealistic expectations. In their personal relations, people under its influence expect others to treat them well or badly all the time, instead of treating them well some of the time, badly some of the time, and neither well nor badly some of the time.

Instead of seeing their own lives and those of others as processes undergoing constant change, they see them as static. Writing of a theoretical young man who has been going through a hard time and concludes, “I’ll never get over this,” Stuart Chase commented: “He thinks this unfortunate ‘time’ is all ‘times.’ Blinded by absolutes, he cannot see other ‘times.’ He believes his case is identical with all past and future cases in his life.” He does not realize that “what has happened can never exactly repeat itself. No two contexts are the same.”

The fallacious notion that what has happened before will happen again generates “pre-emptive thinking” intended to prevent its recurrence. Thus a young woman who has broken up with two or three men becomes convinced that, as far as men are concerned, she is “a failure”; because she believes this, all her relationships with men do indeed fail.

Being about as happy as we make up our minds to be

Self-defeating thoughts can hold us back from realizing our full potential, for example: “I won’t approach the boss with that idea of mine because I’m sure to make a fool of myself.” In this regard we would be wise to keep in mind Hayakawa’s caution that what we think about anything – including ourselves – is not the reality of it: “One’s self-concept is not oneself. It omits a great deal about oneself.”

What you think of yourself and the world around you can literally be hazardous to your health. In recent years, experts on stress have determined that a person’s self-concept plays a key role in how much stress he or she can take. If people jump to conclusions, take things personally, or fall for other fallacies, they will act as though everything around them is dangerous. This triggers the instinctive fight-or-flight response which causes unhealthy stress.

“Most folks are about as happy as they make up their minds to be,” Abraham Lincoln said. Despite all the scientific, technological, and social advances made since Lincoln’s time, his words remain true. External conditions can cause misery, of course, but the spiritual wellbeing of ordinary individuals depends more on their state of mind than on their circumstances. That state of mind can be improved by efforts to think more clearly, because by doing so we can eliminate baseless self-doubts and fears.