How still we became,
witness and thing seen
~ Dorothy Walters
Self-inquiry is simple. It does not require you to do anything, change anything, think anything, or understand anything. It only asks you to pay careful attention to what is real.
I have two sons. When they were about four, they both went through a phase of having nightmares. I would go into the room and switch on the light. Two small eyes blinked at me from the corner.
"What's the problem?" I'd ask.
"Daddy, there's a monster in the room," a timid voice would reply.
Now, I had more than one choice of how to respond. I could tell my frightened boy that it was not true, there was no monster, go back to sleep. That response is the equivalent of reading a book that says, "We're all one, there is no problem, just be with what is."
Fine ideas, but they don't help much. I could also have offered to feed the monster cookies, talk with the monster, negotiate. That approach is like some kinds of psychotherapy. Treat the problem as real, then fix it on its own terms.
But the only real solution I ever found was to have a good look. Under the bed, in the closet, behind the curtains, we undertook an exhaustive search.
Eventually my sons would let out a deep sigh, smile at me, and fall back to sleep. The problem was not solved but dissolved. It was never real in the first place, but it took investigation to reveal that.
See also: "The Translucent Revolution," interview with Arjuna Ardagh by Deborah Caldwell, Beliefnet.com
The World Loved by Moonlight
by Jane Hirshfield, from The Lives of the Heart
You must try,
the voice said, to become colder.
I understood at once.
It is like the bodies of gods: cast in bronze,
braced in stone. Only something heartless
could bear the full weight.
For me it's a kind of poem I have begun to think of as a "pebble": small, recalcitrant, oblique, yet somehow moving. At least let's hope it's moving to others, besides you and me. Its source was a sentence written by Chekhov in a letter to a young writer: "If you want to move your reader, write more coldly."
The advice is chilling, true, and rich, I think, and leads in many different directions of thought. This poem follows one of those directions: that if one were to imagine a world in which there were mythic, conscious deities, then those beings would have to be very cold, very detached, in order to bear seeing what they must see in the course of any given day. So much suffering, so much foolishness, so much anger.
To be able to watch that at all, and even more, to play some active role in its continuance, would demand total heartlessness. It's the same lack of pity that Virgil demands of Dante as they tour the regions of Hell. Pity, the ghost-guide tells the poet, is forbidden. It is true for the contemporary writer as well, and for any seeker after truth.
A certain detachment is needed to look the fullness of life eye to eye; yet that very detachment is what permits the viewer to feel things fully, to know them without blinking. A paradox, that. And so we come to the title, "The World Loved by Moonlight." The "cold" light of the moon is equally a kind of passion and love for the earth, no less than the sun's warmer gaze. Much happens at night, in the dark.
It's probably because I am in some parts of myself a deeply sentimental creature that Chekhov's and Dante's idea strikes me so forcefully, like the twist of a knife. It's that twist that powered the poem into its speech.
[Check out Come, Thief for more of Jane Hirshfield's pebbles.]
"We tend to explain success and failure by looking at the attributes of the thing or the person that succeeded or failed. And what I argue – and the Mona Lisa is just a way of illustrating this general argument – is that these explanations are actually vacuous, right? They're logically circular."
~ Duncan Watts
We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives. Fast thinking is not prone to doubt.
The confidence we experience as we make a judgment is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true...
...When a compelling impression of a particular event clashes with general knowledge, the impression commonly prevails. And this goes for you, too. The confidence you will experience in your future judgments will not be diminished by what you just read, even if you believe every word...
...Overconfidence arises because people are often blind to their own blindness.
True intuitive expertise is learned from prolonged experience with good feedback on mistakes. You are probably an expert in guessing your spouse’s mood from one word on the telephone; chess players find a strong move in a single glance at a complex position; and true legends of instant diagnoses are common among physicians. To know whether you can trust a particular intuitive judgment, there are two questions you should ask: Is the environment in which the judgment is made sufficiently regular to enable predictions from the available evidence?...Do the professionals have an adequate opportunity to learn the cues and the regularities?...Many of the professionals we encounter easily pass both tests, and their off-the-cuff judgments deserve to be taken seriously. In general, however, you should not take assertive and confident people at their own evaluation unless you have independent reason to believe that they know what they are talking about. Unfortunately, this advice is difficult to follow: overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.
Excerpt from “The Science of Why We Don’t Believe Science,” by Chris Mooney, Mother Jones, April 18, 2011:
The theory of motivated reasoning builds on a key insight of modern neuroscience. Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers. Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
That's a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don't want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn't too emotionally invested to accept it, anyway. That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals besides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.
“To know how good you are at something requires the same skills as it does to be good at that thing. Which means, if you’re absolutely hopeless at something, you lack exactly the skills that you need to know that you’re absolutely hopeless at it. And this is a profound discovery, that most people who have absolutely no idea what they’re doing have absolutely no idea that they have no idea what they’re doing. It explains a great deal of life.”