as we know it: on the evolution of culture, technology and knowledge
mindbrain


E.H. Gombrich wrote, "To probe the visual world we use the assumption that things are simple until they prove to be otherwise." Now, using computerized "perceivers", some UofT researchers are proving that inputs are insufficient for knowledge.

This is the way the brain works. Sensors are always flawed; they simply do not provide enough information for us to reconstruct our world. The brain must use prior knowledge to interpret our surroundings and we found that it seems to do this optimally.

The article in Nature to which this story refers, and which I have not read, can be purchased here.
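
For the curious: the "optimal" use of prior knowledge the researchers describe is standardly modelled as Bayesian inference. Here's a minimal sketch, assuming a Gaussian prior and a Gaussian noisy sensor; all the numbers are invented for illustration:

# A minimal Bayesian sketch of how a noisy sensor and prior knowledge
# combine "optimally". All numbers are invented for illustration.

def combine(prior_mean, prior_var, reading, sensor_var):
    # Posterior of a Gaussian prior merged with one noisy Gaussian reading.
    # The estimate is a precision-weighted average: the noisier the sensor,
    # the more the final estimate leans on prior knowledge.
    w = prior_var / (prior_var + sensor_var)   # trust placed in the sensor
    mean = prior_mean + w * (reading - prior_mean)
    var = prior_var * sensor_var / (prior_var + sensor_var)
    return mean, var

# A perceiver that expects objects about 1.0 m away, handed an unreliable
# reading of 2.0 m, shifts its estimate only part of the way toward the data.
print(combine(prior_mean=1.0, prior_var=0.1, reading=2.0, sensor_var=0.4))
# -> (1.2, 0.08)

Notice that the flawed sensor is never simply believed or ignored: the estimate lands between the prior and the data, weighted by their reliabilities, which is the sense in which the brain's interpretation could be "optimal".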

submitted by Paul J Kelly on 2003-08-23T04:07 | Comment | Permalink

Ramachandran describes some principles of art which can be derived from neuroscience. There's a fascinating description of an experiment with herring-gulls. Newborn chicks peck at the mother's red-spotted yellow beak to beg for food. They will even beg from a dismembered beak...

And you say: "Well that's kind of stupid - why does the chick confuse the scientist waving a beak for a mother seagull?"

Well the answer again is it's not stupid at all. Actually if you think about it, the goal of vision is to do as little processing or computation as you need to do for the job on hand, in this case for recognizing mother. And through millions of years of evolution, the chick has acquired the wisdom that the only time it will see this long thing with a red spot is when there's a mother attached to it. After all it is never going to see in nature a mutant pig with a beak or a malicious ethologist waving a beak in front of it. So it can take advantage of the statistical redundancy in nature and say: "Long yellow thing with a red spot IS mother. Let me forget about everything else and I'll simplify the processing and save a lot of computational labour by just looking for that."

That's fine. But what Tinbergen found next is that you don't need even a beak. He took a long yellow stick with three red stripes, which doesn't look anything like a beak - and that's important. And he waved it in front of the chicks and the chicks go berserk. They actually peck at this long thing with the three red stripes more than they would for a real beak. They prefer it to a real beak - even though it doesn't resemble a beak. It's as though he has stumbled on a superbeak or what I call an ultrabeak.

We don't know exactly why, but obviously there are neural circuits in the visual pathways of the chick's brain that are specialized for detecting beaks as soon as the chick hatches. They fire when seeing the beak. Perhaps because of the way they are wired up, they may actually respond more powerfully to the stick with the three stripes than to a real beak. Maybe the neurons' receptive field embodies a rule such as "The more red contour the better," and it's more effective in driving the neuron, even though the stick doesn't look like a beak to you and me - or maybe even to the chick. And a message from this beak-detecting neuron now goes to the emotional limbic centres in the chick's brain giving it a big jolt and saying: "Wow, what a super beak!" and the chick is absolutely mesmerized.

Well now what's this got to do with art, you're wondering?

Well this brings me to my punch line about art. What I'm suggesting is if those seagulls had an art gallery, they would hang this long stick with the three red stripes on the wall, they would worship it, pay millions of dollars for it, call it a Picasso, but not understand why - why am I mesmerized by this damn thing even though it doesn't resemble anything? That's what all of you are doing when you are buying contemporary art. You are behaving exactly like those gull chicks.
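
Ramachandran's hypothesised receptive-field rule ("the more red contour the better") is simple enough to caricature in a few lines of code. A toy sketch, with every number invented:

# Toy caricature of the hypothesised beak detector: its response simply
# grows with the amount of red contour, so three stripes beat one spot.
def detector_response(red_contour_cm):
    return 10.0 * red_contour_cm   # "the more red contour the better"

real_beak = detector_response(1.0)      # a single red spot
striped_stick = detector_response(3.0)  # three red stripes
print(real_beak, striped_stick)   # 10.0 30.0 -- the "ultrabeak" wins

The point of the caricature is that a detector tuned to a feature, rather than to the whole object, can be driven harder by something that exaggerates the feature than by the real thing.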

This time the novelist David Lodge was in the audience:

David Lodge: (novelist and critic) I mainly thought about this issue in relation to literature, rather than art. You can produce rules which allow you to identify literature. It doesn't mean that everything in that category is good, is valuable. It seems to me you have to do the same for art. In that case you have to give the criteria which would separate art from non-art. But within the category of art you could have all kinds of different evaluations.

Ramachandran: Well let me give you a partial answer to that. Let's take kitsch art. Kitsch art has a lot of these rules - grouping, and maybe even Peak Shift and maybe this and that. But instantly, if you're a sophisticated art critic or a person who buys art in auctions, you know this is kitsch. Now one cynical view would be what's kitsch for one person is high art for another. It's completely arbitrary. Well I don't agree because you all know that you can mature from kitsch to genuine art, but you can't slide backwards once you have. So the question then becomes a challenge for me - well, what's the difference? Kitsch art employs the same principles so what's the difference between kitsch and the genuine article? I would answer that what happens in kitsch is that you go through the mannerisms of real art, superficially deploying these principles without really understanding them. And if you go and look at some kitsch works of art, that's what happens. But it doesn't quite grab you in the same way because you haven't done it properly. The same holds for literature or for music or any of those things.

Excellent point about no one ever sliding back to kitsch.

submitted on 2003-07-01T03:29

This lecture is mostly about visual perception and the modularity of the visual cortex, as well as the older, unconscious visual brain. Patients with blindsight can point to things they cannot "see" because their unconscious visual system is still working. Once again, the most memorable part of this lecture was in the Q&A:

JIM HURFORD: I'm a linguist at Edinburgh. You have been talking about consciousness and people often talk about self-consciousness. Now you've been talking as if there's a subject of consciousness in the brain - and what one is conscious of is some object of consciousness elsewhere. And it would seem that your analogy of the little chap looking at the screen would rule out the possibility of self-consciousness. Do you agree with that?

RAMACHANDRAN: Well, one of the things that I think about self-consciousness is usually people think that the reason you developed self- consciousness and introspection is to allow you to model the behaviour of other people. You know, if you know what it is to feel anguish, you can better judge somebody else's anguish. I think it's the other way around, personally. I think what happened was your brain developed the capacity to model other people's behaviour very, in a very sophisticated manner using for example the mirror neuron system. Then you apply that same algorithm to your own body so you see yourself as a person in that representation you have created, and that is the origin of self-awareness. But there is a great deal more work to be done before we can, you know, give you a specific answer to that question.

submitted on 2003-07-01T03:03

Have finally gotten round to listening to Vilayanur S. Ramachandran's Reith lectures and they are amazing. I will blog each one I listen to. Ramachandran proceeds like a detective solving riddles of human behaviour and perception. The evidence comes from brain-damaged patients, as always, because of the modular nature of the brain. This first lecture is about the brain's map of the body and how sensation is represented. Be sure to listen through to the Q&A, which is often a strong complement to the lecture. Here's a sample:

HALLIGAN: Peter Halligan, Cardiff University. Rama, you focused on two central themes that were motivating the connection between your talks and one of them related to the explanation of brain function in terms of evolution. What is the purpose do you think of having a phantom limb and what can be the explanation for the seventy percent of people who actually have pain associated with it?

RAMA: Yeah I mean when I said that you would need to have an evolutionary perspective, it doesn't follow that every quirk of human behaviour and every symptom you see has an evolutionary function; it could be a by-product of something else that evolved for a specific function.

MILLER: Rama, we have all got phantom limbs. And we all have a body image by virtue of the fact that we have a cerebral representation of our limbs. When we lose the limb but keep the cortex, of course we have a phantom, there's no evolutionary function in that.

RAMA: Sure I think that Jonathan has answered your question for me. But let me emphasise also that in a sense your body is a phantom as Jonathan just pointed out.

submitted on 2003-07-01T02:45

Raj Persaud on Dr Richard Wrangham, whose research has discovered "a strong universal tendency to underestimate the enemy's strength, while overestimating your own side's capacity."

But in a sobering analysis for the potential combatants in a possible Iraq war, Prof Wrangham found that in prior Middle East conflicts superior forces were significantly more likely to lose battles where deception was used by an inferior strength enemy. Weaker forces tended to initiate battles by a factor of two to one, even though it would be expected that, all other things being equal, this should only occur 50 per cent of the time.

It seems that the universal human tendency to suffer from positive self-illusions not only starts many wars but also increases the chances of successfully bluffing the enemy into believing he cannot or is unlikely to win and so it also increases the unpredictability of wars.

Prof Wrangham points out that one positive illusion with which most wars begin and which even the superior side suffers from is the unrealistic belief that the war will be a quick one. This characterised the start of the Boer War, the First World War, the Second World War, the Suez crisis, Vietnam, various Afghanistan conflicts, Yugoslavia, Rwanda, Chechnya and the Congo.

Whatever your view on war with Iraq, Prof Wrangham's research suggests that deception plays a central role in human conflict, a conclusion that reflects the current preoccupation with whether or not Iraq is hiding weapons and its true resolve to defend itself.
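
A back-of-the-envelope way to see why the two-to-one initiation figure is striking: under the 50 per cent baseline, counts that lopsided are unlikely to arise by chance. A quick binomial check, assuming a hypothetical sample of 30 battles (the article doesn't give Wrangham's actual sample size):

# If weaker forces were equally likely to initiate (p = 0.5), how often
# would they initiate 20 or more of 30 battles (a two-to-one split)?
# The sample of 30 battles is hypothetical; the article gives no n.
from math import comb

n, k, p = 30, 20, 0.5
tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
print(f"chance of >= {k} of {n} initiations: {tail:.3f}")   # about 0.049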

submitted on 2003-03-11T02:39

Steven Pinker offers a solution for education in a NYT Op-Ed. Let's start here:

Finally, a better understanding of the mind can lead to setting new priorities as to what is taught. The goal of education should be to provide students with new cognitive tools for grasping the world.

Sure, but people also need to learn how to detach from the world. The mind's "grasp" is always uncertain and this is a cause of much mental strain. Such strain is eased through fantasy. Fantasy is irrepressible because there will never be adequate likeness between the mind's model of the world and the world. Fantasy fills in the gaps, offers the illusion of likeness, the comforting certainty.

Science is the means, as Rorty says, of "remorselessly enforcing the distinction between truth-seeking and wish-fulfilment". But how do we train ourselves to really know how the mind is tricked? How do we properly channel our fantasies so they encroach as little as possible into the real world? We can continue to study the mind scientifically and use the arts, which are very old cognitive tools. The arts take fantasy out of the real world so we can learn to detach ourselves from the mind's tendency to veer toward wish fulfillment.

Perhaps fantasy training might help combat the roots of scientific illiteracy which "leads people to squander their health on medical flimflam and to misunderstand the strengths and weaknesses of a market economy in their political choices." Maybe. Maybe not. But the problem with every educational crusade is that it is based upon the theory of mind du jour. And it's a problem because we really don't understand the mind. Pinker asserts his recommendations--"The obvious solution is instruction at all levels in relatively new fields like economics, evolutionary biology and statistics"--are scientifically grounded but his conception of mind, as inferred from this article, is one-dimensional and epistemologically naive. There is no reason to uphold his solution, which I don't really disagree with, over those who argue for more arts education.

Other points:

Pinker says "Educators must figure out how to co-opt the faculties that work effortlessly and to get children to apply them to problems at which they lack natural competence." He then complains "Even if learning music were shown to enhance math skills, that doesn't mean it is as effective as the same number of hours spent learning math." But isn't this one way of co-opting the effortless faculties, one of many which arts advocates are always putting forward? Perhaps arts advocates are wrong, but it's a stab at meeting Pinker's criteria.

I'm curious about the following:

Yet most curriculums are set in stone, because no one wants to be the philistine who seems to be saying that it is unimportant to learn a foreign language or the classics. But there are only 24 hours in a day, and a decision to teach one subject is a decision not to teach another. The question is not whether trigonometry is important--it is--but whether it is more important than probability; not whether an educated person should know the classics, but whether it is more important to know the classics than elementary economics.

What are "the classics"? Does he mean "Classics" i.e. the study of Western antiquity? No one receives a Classical education anymore. Or is it classic literature taught in the English curriculum?

submitted on 2003-02-16T19:10

"ignorance more frequently begets confidence than does knowledge"—Charles Darwin

This isn't as grand a statement as I'm making it out to be. Follow the link for Darwin's context.

submitted on 2003-01-23T03:38

"It hinders the creative work of the mind if the intellect examines too closely the ideas as they pour in."—Friedrich von Schiller

submitted on 2003-01-23T03:24

Anna Fels has written a little piece about the mind's abhorrence of an explanatory vacuum. This passage was of interest:

But this drive to come up with the causes of events is hardly limited to therapy patients. Neurophysiologists discovered the same phenomenon in a radically different context. While mapping the brain, they were amazed to find that when the area responsible for an emotion was electronically stimulated, subjects experienced the mechanically induced feeling, then instantly came up with reasons for their responses.

If you activate the area of the brain that generates laughter, for example, the subject may happily "explain" that his hilarity stems from an overly earnest looking doctor or an odd diagram on the wall.

submitted on 2002-12-31T04:14

E.H. Gombrich's Symbolic Images is a good source for understanding how the mind relates to images, as are all his books for that matter. I'm currently reading the essay "Icones Symbolicae" and came across the following:

Warburg described as 'Denkraumverlust' this tendency of the human mind to confuse the sign with the thing signified, the name and its bearers, the literal and the metaphorical, the image and its prototype. We are all apt to 'regress' at any moment to more primitive states and experience the fusion between the image and its model or the name and its bearer. Our language, in fact, favours this twilight region between the literal and the metaphorical. Who can always tell where the one begins and the other ends? A term such as 'a heavy burden' may be a dead metaphor, but it can be brought to life with the greatest ease by an orator or cartoonist.

Here's what you get when you plug Denkraumverlust into Google. Google's translator turns it into "thinking space loss", but a friend prefers "loss of mental space". Much like loss of detachment.


David Lodge is perhaps the first novelist to hit the books on consciousness studies. In this article he sets upon the "qualia" debate, or the problem of the irreducibility of first-person experience--how can the experience of the taste of pineapple be accounted for scientifically? He doesn't have the answer of course, but notes that novelists have been interested in qualia from the beginning, which was about the time the printing press was invented, and their interest coincided with that of philosophers in "memory, the association of ideas in the mind, the causes of emotions and the individual's sense of self":

'It is probable that the fairly recent invention and rapid development of printing contributed to that process. The increasing availability of books in which exactly the same story could be experienced privately, silently, by discrete individuals, was a marked departure from the usual transmission of stories in preprint culture by means of oral recitation or dramatic performance in front of a collective audience. The silence and privacy of the reading experience afforded by books mimicked the silent privacy of individual consciousness.'

submitted on 2002-11-25T15:56

I've read other interviews where the guy talked the way he wrote. But this is LA Weekly, not Texte, and the results are comprehensible. The on-topic highlights:

Q: What's the difference between knowledge and wisdom?
A: They aren't heterogeneous, and you can know lots of things and have no wisdom at all. Between knowledge and action there is an abyss, but that abyss shouldn't prevent us from trying to know as much as possible before making a decision. Philosophy is the love of wisdom. Philia is love and sophia is wisdom, so the duty to be wise is what philosophy is. Nonetheless, decisions don't depend exclusively on knowledge. I try to know as much as possible before making a decision, but I know that at the moment of the decision I'll make a leap beyond knowledge.

.....

Q: What's the most widely held misconception about you and your work?
A: That I'm a skeptical nihilist who doesn't believe in anything, who thinks nothing has meaning, and text has no meaning. That's stupid and utterly wrong, and only people who haven't read me say this. It's a misreading of my work that began 35 years ago, and it's difficult to destroy. I never said everything is linguistic and we're enclosed in language. In fact, I say the opposite, and the deconstruction of logocentrism was conceived to dismantle precisely this philosophy for which everything is language. Anyone who reads my work with attention understands that I insist on affirmation and faith, and that I'm full of respect for the texts I read.

submitted on 2002-11-25T16:09

David M. Johnson gave an interesting and provocative talk at Atkinson College, York University on October 29. What theorists define as mind, he said, cannot be explained solely with respect to the brain, an organ which stopped evolving tens of thousands of years ago. The fixation with hard neuroscience leads people like Chomsky, Pinker, the Churchlands et al. to equate mind with brain. But what they define as rational thought has a cultural history, culminating in the "Greek revolution": it took thousands of years of exosomatic, cultural evolution to produce the mental traits which brain reductionists assert require only biology. Johnson accuses them of not knowing their history.

There's nothing about the Greeks in the abstract linked above but here's a paper from the '80s on The Greek Origins of Belief. His book, How History Made the Mind: The Cultural Origins of Objective Thinking, is slated for June 2003 release.

As usual, the suggestion that culture and natural selection have anything to do with each other caused a backlash of accusations of Social Darwinism and Eurocentrism. Johnson could mount a better defence by reconsidering the use of words like "human" in the title of a talk about how particular humans thought.

submitted on 2002-11-25T15:58

Steve Talbott puts his finger on a distressing tendency in reports on neuroscience, and likely in neuroscience itself: to equate brain with mind, as if thinking about your grandmother were secondary to the part of the brain that lights up when you think of her, or as if the two were one and the same. Says Talbott:

'Given a vague grasp of the fact that "we are psychosomatic organisms", many people -- scientists among them -- seem content to flop blithely back and forth between a brain vocabulary and a mental vocabulary as if there were no distinction between the two. What makes this an inexcusable lack of discipline is the simple fact that, as these vocabularies now exist, no one has the slightest idea how to translate a single term of the one language into a term of the other.'

I'm not sold on his conclusion, however:

'...we cannot understand perceiving -- the inner reality of perceiving -- in terms of the kinds of outer things given through the act of perceiving, such as brain tissues. We cannot understand the act as the result of its own results. We cannot understand as just another object the activity that constitutes things as objects.'

This takes the opposite side in a debate over causes--that the lit-up bits are an effect of thought. Talbott holds there is some "inner reality" to perception which is "categorically other" than things reducible to scientific laws. I agree, but locating it "inside" individuals only upholds the Cartesian doctrine he rails against in this piece.

submitted on 2002-11-25T16:00

Although I agree with Gould about "a progressively more adequate understanding of the natural world", it's interesting that even if that view were wrong, it would still be a requirement for scientific work ("No working scientist can be a relativist"):

'Strong relativism is nonsense. What you want to do is recognize the cultural embeddedness of science without negating what to me is pretty evident--the history of science differs from the history of other cultural institutions in that it produces a progressively more adequate understanding of the natural world (very fitfully to be sure, but progressive nonetheless). I must interpret that to mean we are achieving a more adequate understanding of nature. Some historians of science are close to the strong relativist position, but no working scientist can be a relativist. Most people think that the reason for this is that scientists are so imbued with this grand goal of finding an ultimate truth. That's not why. It's exactly the opposite. It is because day-to-day scientific work is so tedious that unless you felt that the cleaning of the cages and petri dishes every day was actually leading to true, natural knowledge, why would you do it? If the history of science is nothing more than a changing set of views corresponding to altering social conventions, why do the hard work?'

submitted on 2002-10-27T04:17

Anne Applebaum reviews two books about "the psychological landscape of contemporary Russia", Nanci Adler's The Gulag Survivor: Beyond the Soviet System and Catherine Merridale's Night of Stone: Death and Memory in Twentieth-Century Russia.

'Even so, neither careerism nor fear quite account for the joy that some prisoners clearly felt upon being readmitted to the fold. By the 1950s, the Communist Party had, undeniably, been responsible for the false arrests of millions, the destruction of a generation of its own leaders, countless pointless deaths, and economic and moral damage impossible to calculate. Nevertheless, one ex-prisoner, who served thirteen years in a labor camp for non-crimes, described his sentiments as follows:

The most important factor that secured my survival in those harsh conditions was my unflinching, ineradicable belief in our Leninist party, in its humanist principles. It was the Party that imparted the physical strength to withstand these trials.... Reinstatement in the ranks of my native Communist Party was the greatest happiness of my entire life!

'Nanci Adler grapples with this conundrum--"allegiance to a belief system can have deep non-rational... roots," she writes--but is mostly interested in other things. To Catherine Merridale, on the other hand, the issue is fundamental, lying at the heart of her own investigation into Soviet history.'

'[Merridale] expected to find, for the most part, damaged people, unhealed victims, embittered survivors. She did not find them--or at least not as many as might have been expected. Instead, she found that the "imperial mentality" of the old Soviet Union had helped victims through their suffering:

To speak as a former Soviet citizen and a Russian is to speak--securely, if one chooses-- from a culture of endurance and heroism; it is to use the language of historical destiny, to talk (however ironically) of the audacity involved in leading the collective struggle for human liberation.... Some laugh about it now, but almost everyone is nostalgic for a collectivism and a common purpose that have been lost. Up to a point, totalitarianism worked.

'The very ideology that prevented people from talking about their pain also helped them, in other words, to forget that pain. If the silence imposed from above made individual "talking cures" impossible, it also forced people to grit their teeth and smile along with their neighbors--and they did. Eventually, they came to believe that they were smiling because they wanted to smile.'

submitted on 2002-11-25T16:11

An article on how some psychologists are arguing against free will finds physiologist Benjamin Libet at the centre of things:

'What Libet did was to measure electrical changes in people's brains as they flicked their wrists. And what he found was that a subject's ''readiness potential'' - the brain signal that precedes voluntary actions - showed up about one-third of a second before the subject felt the conscious urge to act.

'The result was so surprising that it still had the power to elicit an exclamation point from him in a 1999 paper: ''The initiation of the freely voluntary act appears to begin in the brain unconsciously, well before the person consciously knows he wants to act!'''

Of course, it seems rather inflated to derive a case for determinism from this. Does free will require conscious awareness? I think I agree with Daniel Dennett that "it could be that the experience of will simply enters our consciousness with a delay, and thus only seems to follow the initiation of the action."

submitted on 2002-10-20T19:46

'The human brain is not at its best when it confronts random, merely accidental facts. We perceive a face on Mars or see Jesus in a burnt tortilla. We believe basketball players get a "hot hand" even though streaks of success are a normal part of shooting their usual overall percentage. If disaster strikes us, we wonder if there was some cosmic reason we were singled out.'

By Taner Edis, based on his book, The Ghost in the Universe: God in Light of Modern Science.
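
The "hot hand" claim invites a quick test: a purely random shooter, assuming a fixed 50 per cent hit rate and no memory from shot to shot, still racks up streaks that look meaningful. A minimal simulation:

# A shooter with a fixed 50% hit rate and no memory still produces long
# streaks, which is why streaks alone are weak evidence of a "hot hand".
import random

shots = [random.random() < 0.5 for _ in range(200)]

longest = streak = 0
for hit in shots:
    streak = streak + 1 if hit else 0
    longest = max(longest, streak)

print("longest hit streak in 200 random shots:", longest)   # usually 6 to 9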

submitted on 2002-10-20T19:24

Lisa Belkin writes about coincidence and conspiracy. Interesting, but note the self-pitying explanation:

'The need is especially strong in an age when paranoia runs rampant. ''Coincidence feels like a loss of control perhaps,'' says John Allen Paulos, a professor of mathematics at Temple University and the author of ''Innumeracy,'' the improbable best seller about how Americans don't understand numbers. Finding a reason or a pattern where none actually exists ''makes it less frightening,'' he says, because events get placed in the realm of the logical. ''Believing in fate, or even conspiracy, can sometimes be more comforting than facing the fact that sometimes things just happen.'' '

Perhaps, but I prefer the more assertive reason: that the will to believe is too strong and seductive to deny; that certainty supports action and those who act tend to have an advantage over those who doubt, even if the doubtful are right. One more quotation involving the obligatory expert on cognition from MIT:

'For decades, all academic talk of coincidence has been in the context of the mathematical. New work by scientists like Joshua B. Tenenbaum, an assistant professor in the department of brain and cognitive sciences at M.I.T., is bringing coincidence into the realm of human cognition. Finding connections is not only the way we react to the extraordinary, Tenenbaum postulates, but also the way we make sense of our ordinary world. ''Coincidences are a window into how we learn about things,'' he says. ''They show us how minds derive richly textured knowledge from limited situations.''

'To put it another way, our reaction to coincidence shows how our brains fill in the factual blanks. In an optical illusion, he explains, our brain fills the gaps, and although people take it for granted that seeing is believing, optical illusions prove that's not true. ''Illusions also prove that our brain is capable of imposing structure on the world,'' he says. ''One of the things our brain is designed to do is infer the causal structure of the world from limited information.'''

submitted on 2002-10-20T19:18

I still don't believe it. Source: Boing Boing Blog
submitted on 2002-10-11T03:11

WG Runciman reviews Pascal Boyer's Religion Explained: The Human Instincts that Fashion Gods, Spirits and Ancestors: Natural selection is responsible "for a kind and degree of imagination, and therefore credulity, which, over those many millennia, made those of our ancestors with theory-building minds more likely to pass on the relevant genes to their descendants than those without them. As a species, we are born not only to construct all sorts of belief-systems out of what are sometimes the flimsiest materials, but also to retain whatever beliefs our local environment favours in the face of seemingly disconfirming evidence." But to account for specific beliefs (or non-beliefs), he says, we must look to the anthropologists, not the evolutionary psychologists.
submitted on 2002-10-11T03:54

Wolpert: "I argue that the primary aim of human judgment is not accuracy but the avoidance of paralysing uncertainty... Might it not be that those with this disposition of thought survived better than those who did not have such beliefs? If that was the case, any genes linked with a propensity to believe would come to dominate in future generations.... such beliefs are our natural way of thinking and may be part of our genetic makeup because they are adaptive. We have a fundamental need to tell ourselves stories that make sense of our lives. We hate uncertainty and, for life and death issues, find it intolerable."
submitted on 2002-09-17T18:01

E.O. Wilson: "The relative indifference to the environment springs, I believe, from deep within human nature. The human brain evidently evolved to commit itself emotionally only to a small piece of geography, a limited band of kinsmen, and two or three generations into the future. To look neither far ahead nor far afield is elemental in a Darwinian sense. We are innately inclined to ignore any distant possibility not yet requiring examination. It is, people say, just good common sense. Why do they think in this shortsighted way? The reason is simple: it is a hardwired part of our Paleolithic heritage. For hundreds of millennia, those who worked for short-term gain within a small circle of relatives and friends lived longer and left more offspring--even when their collective striving caused their chiefdoms and empires to crumble around them. The long view that might have saved their distant descendants required a vision and extended altruism instinctively difficult to marshal."
submitted on 2002-09-17T17:34

Jerome Groopman on the weakness of "neurotheological" research. For one thing, it assumes an easy bridge between subjective and objective which does not exist:

'In fact, what is missing from neurotheology is precisely what all neuroscience demands: rigorously designed experiments. Such experiments always include controls that provide both a known positive result and a clear negative result, which should be null for the expected phenomenon. With this essential methodology in mind, we would want to analyze the SPECT-scan experiment done on the Tibetan Buddhist at the moment he feels united with the universe and relinquishes his sense of self. The positive control for the observed change in the orientation-association area would be an event when the human soul actually merges with the divine, since that would validate the hypothesis that the O.A.A. is fundamental to authentic connection with the deity. And that event is--what? Is it a Cabalist unveiling the mystery of God through the mental gymnastic of numerology? Or is the positive control an exhausted Catholic penitent carrying a massive cross on his back along the Via Dolorosa, or a flagellant whipping himself in a Spanish rite? What do SPECT scans look like then? Forms of worship that demand mathematical calculations or the experience of physical pain would recruit different neural circuits from those used during serene Buddhist meditation or Franciscan prayer. Should we search for "a photograph of God" in these other brain regions during such mystical experiences?

'One is equally hard put to identify a negative control for the SPECT-scan experiments. That would require a nonreligious experience, when the brain is totally detached from the divine. If God is omnipresent, a cardinal concept in nearly all faiths, then every experience at every moment can have religious valence. Even doubting God is a part of faith, the Protestant theologian Paul Tillich argues. If that is so, then a SPECT scan done on me when I feel a cold emptiness after praying would not serve as a "negative control." Paradoxically, such alienation could be a key religious experience bringing me closer to God, even though my parietal lobe would appear to be metabolically "on," flaming yellow and red.'

submitted on 2002-09-16T02:15

"Evidence from contemporary hunter-gatherers indicates that dreaming functions in a variety of ways, argues psychologist Harry T. Hunt of Brock University in St. Catharines, Ontario. Members of these groups generally view dreams as real events in which a person's soul carries out activities while the person sleeps.

"Hunter-gatherers' dreams sometimes depict encounters with supernatural beings who provide guidance in pressing community matters, aid in healing physical illnesses, or give information about the future, Hunt says. Individuals who are adept at manipulating their own conscious states may engage in lucid dreaming, in which the dreamer reasons clearly, remembers the conditions of waking life, and acts according to a predetermined plan.

"Dreaming represents a basic orienting response of the brain to novel information, ideas, and situations, Hunt proposes. It occurs at varying intensities in different conscious states, including REM sleep, bouts of reverie or daydreaming, and episodes of spirit possession that individuals in some cultures enter while awake."

....

"Sleep now typically occurs in single chunks of 7 hours or less. Yet as recently as 200 years ago in Europe, people slept in two nightly phases of 4 to 5 hours each. Shortly after midnight, individuals awoke for 1 to 2 hours and frequently reflected on their dreams or talked about them with others.

"Well before Freud's time, Europeans prized dreams for their personal insights, and particularly for what they revealed about a dreamer's relationship with God, says historian A. Roger Ekirch of Virginia Polytechnic Institute and State University in Blacksburg.

"Organizing sleep into two segments encouraged people to remember dreams and to use them as paths to self-discovery, Ekirch contended in the April American Historical Review."

submitted on 2002-09-14T02:51

This article, unavailable to non-subscribers, is written by neuropsychologist Paul Broks about a patient who thinks he's dead. He argues that mind and world are not separate and that it is this supposed separation that hinders our understanding of consciousness. (At least that's what I remember, since I have no way of getting at the article now. If this is in fact what he does argue, then he is right.)
submitted on 2002-03-01T06:16

"Contending that 'all perceiving is also thinking, all reasoning is also intuition, all observation is also invention', he attacked the established assumptions that words, not images, are the primary ingredients of thinking, and that language precedes perception. Rather, Arnheim argued, 'the remarkable mechanisms by which the senses understand the environment are all but identical with the operations described by the psychology of thinking'. Like scientific discovery, he wrote, artistic expression 'is a form of reasoning, in which perceiving and thinking are indivisibly intertwined. A person who paints, writes, composes, dances, I felt compelled to say, thinks with his senses.'"
submitted on 2002-03-12T01:46

"There is a persuasive resemblance between gestalt principles and the Japanese-inspired aesthetics that Dow and others propagated. For example, the gestalt emphasis on the dynamic interplay of parts and wholes had been anticipated as early as the third century B.C. in China by a passage in the Tao Te Ching that states that although a wheel is made of 30 spokes, it is the space between the spokes that determines the overall form of the wheel. The phenomenon of reversible figure-ground has precedents in the yin-yang symbol and, in Japanese art, in the compositional equivalence of light and dark, called notan. The gestaltists' ideas of structural economy and closure (the tendency to perceive incomplete forms as complete) are echoed in the Japanese emphasis on elimination of the insignificant and in the ideas of implicitness and the active complicity of the viewer, because genuine beauty, as Okakura explained, 'could be discovered only by one who mentally completed the incomplete'."
submitted on 2002-03-12T03:19




about this category

mindbrain

Lately, the study of brain, mind and consciousness has become very exciting and multi-disciplinary. Here I will gather items specifically related to perception and belief.