Sorry, But Your Soul Just Died
From neuroscience to Nietzsche. A sobering look at how man may perceive himself in the future, particularly as ideas about genetic predeterminism take the place of dying Darwinism.
Being a bit behind the curve, I had only just heard of the digital revolution last February when Louis Rossetto, cofounder of Wired magazine, wearing a shirt with no collar and his hair as long as Felix Mendelssohn's, looking every inch the young California visionary, gave a speech before the Cato Institute announcing the dawn of the twenty-first century's digital civilization. As his text, he chose the maverick Jesuit scientist and philosopher Pierre Teilhard de Chardin, who fifty years ago prophesied that radio, television, and computers would create a "noösphere," an electronic membrane covering the earth and wiring all humanity together in a single nervous system. Geographic locations, national boundaries, the old notions of markets and political processes--all would become irrelevant. With the Internet spreading over the globe at an astonishing pace, said Rossetto, that marvelous modem-driven moment is almost at hand.
Could be. But something tells me that within ten years, by 2006, the entire digital universe is going to seem like pretty mundane stuff compared to a new technology that right now is but a mere glow radiating from a tiny number of American and Cuban (yes, Cuban) hospitals and laboratories. It is called brain imaging, and anyone who cares to get up early and catch a truly blinding twenty-first-century dawn will want to keep an eye on it.
Brain imaging refers to techniques for watching the human brain as it functions, in real time. The most advanced forms currently are three-dimensional electroencephalography using mathematical models; the more familiar PET scan (positron-emission tomography); the new fMRI (functional magnetic resonance imaging), which shows brain blood-flow patterns, and MRS (magnetic resonance spectroscopy), which measures biochemical changes in the brain; and the even newer PET reporter gene/PET reporter probe, which is, in fact, so new that it still has that length of heavy lumber for a name. Used so far only in animals and a few desperately sick children, the PET reporter gene/PET reporter probe pinpoints and follows the activity of specific genes. On a scanner screen you can actually see the genes light up inside the brain.
By 1996 standards, these are sophisticated devices. Ten years from now, however, they may seem primitive compared to the stunning new windows into the brain that will have been developed.
Brain imaging was invented for medical diagnosis. But its far greater importance is that it may very well confirm, in ways too precise to be disputed, certain theories about "the mind," "the self," "the soul," and "free will" that are already devoutly believed in by scholars in what is now the hottest field in the academic world, neuroscience. Granted, all those skeptical quotation marks are enough to put anybody on the qui vive right away, but Ultimate Skepticism is part of the brilliance of the dawn I have promised.
Neuroscience, the science of the brain and the central nervous system, is on the threshold of a unified theory that will have an impact as powerful as that of Darwinism a hundred years ago. Already there is a new Darwin, or perhaps I should say an updated Darwin, since no one ever believed more religiously in Darwin I than he does. His name is Edward O. Wilson. He teaches zoology at Harvard, and he is the author of two books of extraordinary influence, The Insect Societies and Sociobiology: The New Synthesis. Not "a" new synthesis but "the" new synthesis; given his stature in neuroscience, the definite article is not a mere boast.
Wilson has created and named the new field of sociobiology, and he has compressed its underlying premise into a single sentence. Every human brain, he says, is born not as a blank tablet (a tabula rasa) waiting to be filled in by experience but as "an exposed negative waiting to be slipped into developer fluid." You can develop the negative well or you can develop it poorly, but either way you are going to get precious little that is not already imprinted on the film. The print is the individual's genetic history, over thousands of years of evolution, and there is not much anybody can do about it. Furthermore, says Wilson, genetics determine not only things such as temperament, role preferences, emotional responses, and levels of aggression, but also many of our most revered moral choices, which are not choices at all in any free-will sense but tendencies imprinted in the hypothalamus and limbic regions of the brain, a concept expanded upon in 1993 in a much-talked-about book, The Moral Sense, by James Q. Wilson (no kin to Edward O.).
The neuroscientific view of life
This, the neuroscientific view of life, has become the strategic high ground in the academic world, and the battle for it has already spread well beyond the scientific disciplines and, for that matter, out into the general public. Both liberals and conservatives without a scientific bone in their bodies are busy trying to seize the terrain. The gay rights movement, for example, has fastened onto a study published in July of 1993 by the highly respected Dean Hamer of the National Institutes of Health, announcing the discovery of "the gay gene." Obviously, if homosexuality is a genetically determined trait, like left-handedness or hazel eyes, then laws and sanctions against it are attempts to legislate against Nature. Conservatives, meantime, have fastened upon studies indicating that men's and women's brains are wired so differently, thanks to the long haul of evolution, that feminist attempts to open up traditionally male roles to women are the same thing: a doomed violation of Nature.
Wilson himself has wound up in deep water on this score; or cold water, to be more exact. In his personal life Wilson is a conventional liberal, PC, as the saying goes--he is, after all, a member of the Harvard faculty--concerned about environmental issues and all the usual things. But he has said that "forcing similar role identities" on both men and women "flies in the face of thousands of years in which mammals demonstrated a strong tendency for sexual division of labor. Since this division of labor is persistent from hunter-gatherer through agricultural and industrial societies, it suggests a genetic origin. We do not know when this trait evolved in human evolution or how resistant it is to the continuing and justified pressures for human rights."
"Resistant" was Darwin II, the neuroscientist, speaking. "Justified" was the PC Harvard liberal. He was not PC or liberal enough. Feminist protesters invaded a conference where Wilson was appearing, dumped a pitcher of ice water, cubes and all, over his head, and began chanting, "You're all wet! You're all wet!" The most prominent feminist in America, Gloria Steinem, went on television and, in an interview with John Stossel of ABC, insisted that studies of genetic differences between male and female nervous systems should cease forthwith.
But that turned out to be mild stuff in the current political panic over neuroscience. In February of 1992, Frederick K. Goodwin, a renowned psychiatrist, head of the federal Alcohol, Drug Abuse, and Mental Health Administration, and a certified yokel in the field of public relations, made the mistake of describing, at a public meeting in Washington, the National Institute of Mental Health's ten-year-old Violence Initiative. This was an experimental program whose hypothesis was that, as among monkeys in the jungle--Goodwin was noted for his monkey studies--much of the criminal mayhem in the United States was caused by a relatively few young males who were genetically predisposed to it; who were hardwired for violent crime, in short. Out in the jungle, among mankind's closest animal relatives, the chimpanzees, it seemed that a handful of genetically twisted young males were the ones who committed practically all of the wanton murders of other males and the physical abuse of females. What if the same were true among human beings? What if, in any given community, it turned out to be a handful of young males with toxic DNA who were pushing statistics for violent crime up to such high levels? The Violence Initiative envisioned identifying these individuals in childhood, somehow, some way, someday, and treating them therapeutically with drugs. The notion that crime-ridden urban America was a "jungle," said Goodwin, was perhaps more than just a tired old metaphor.
That did it. That may have been the stupidest single word uttered by an American public official in the year 1992. The outcry was immediate. Senator Edward Kennedy of Massachusetts and Representative John Dingell of Michigan (who, it became obvious later, suffered from hydrophobia when it came to science projects) not only condemned Goodwin's remarks as racist but also delivered their scientific verdict: Research among primates "is a preposterous basis" for analyzing anything as complex as "the crime and violence that plagues our country today." (This came as surprising news to NASA scientists who had first trained and sent a chimpanzee called Ham up on top of a Redstone rocket into suborbital space flight and then trained and sent another one, called Enos, which is Hebrew for "man," up on an Atlas rocket and around the earth in orbital space flight and had thereby accurately and completely predicted the physical, psychological, and task-motor responses of the human astronauts, Alan Shepard and John Glenn, who repeated the chimpanzees' flights and tasks months later.) The Violence Initiative was compared to Nazi eugenic proposals for the extermination of undesirables. Dingell's Michigan colleague, Representative John Conyers, then chairman of the Government Operations Committee and senior member of the Congressional Black Caucus, demanded Goodwin's resignation--and got it two days later, whereupon the government, with the Department of Health and Human Services now doing the talking, denied that the Violence Initiative had ever existed. It disappeared down the memory hole, to use Orwell's term.
A conference of criminologists and other academics interested in the neuroscientific studies done so far for the Violence Initiative--a conference underwritten in part by a grant from the National Institutes of Health--had been scheduled for May of 1993 at the University of Maryland. Down went the conference, too; the NIH drowned it like a kitten. Last year, a University of Maryland legal scholar named David Wasserman tried to reassemble the troops on the QT, as it were, in a hall all but hidden from human purview in a hamlet called Queenstown in the foggy, boggy boondocks of Queen Anne's County on Maryland's Eastern Shore. The NIH, proving it was a hard learner, quietly provided $133,000 for the event but only after Wasserman promised to fireproof the proceedings by also inviting scholars who rejected the notion of a possible genetic genesis of crime and scheduling a cold-shower session dwelling on the evils of the eugenics movement of the early twentieth century. No use, boys! An army of protesters found the poor cringing devils anyway and stormed into the auditorium chanting, "Maryland conference, you can't hide--we know you're pushing genocide!" It took two hours for them to get bored enough to leave, and the conference ended in a complete muddle with the specially recruited fireproofing PC faction issuing a statement that said: "Scientists as well as historians and sociologists must not allow themselves to provide academic respectability for racist pseudoscience." Today, at the NIH, the term Violence Initiative is a synonym for taboo. The present moment resembles that moment in the Middle Ages when the Catholic Church forbade the dissection of human bodies, for fear that what was discovered inside might cast doubt on the Christian doctrine that God created man in his own image.
Even more radioactive is the matter of intelligence, as measured by IQ tests. Privately--not many care to speak out--the vast majority of neuroscientists believe the genetic component of an individual's intelligence is remarkably high. Your intelligence can be improved upon, by skilled and devoted mentors, or it can be held back by a poor upbringing--i.e., the negative can be well developed or poorly developed--but your genes are what really make the difference. The recent ruckus over Charles Murray and Richard Herrnstein's The Bell Curve is probably just the beginning of the bitterness the subject is going to create.
Not long ago, according to two neuroscientists I interviewed, a firm called Neurometrics sought out investors and tried to market an amazing but simple invention known as the IQ Cap. The idea was to provide a way of testing intelligence that would be free of "cultural bias," one that would not force anyone to deal with words or concepts that might be familiar to people from one culture but not to people from another. The IQ Cap recorded only brain waves; and a computer, not a potentially biased human test-giver, analyzed the results. It was based on the work of neuroscientists such as E. Roy John, who is now one of the major pioneers of electroencephalographic brain imaging; Duilio Giannitrapani, author of The Electrophysiology of Intellectual Functions; and David Robinson, author of The Wechsler Adult Intelligence Scale and Personality Assessment: Toward a Biologically Based Theory of Intelligence and Cognition and many other monographs famous among neuroscientists. I spoke to one researcher who had devised an IQ Cap himself by replicating an experiment described by Giannitrapani in The Electrophysiology of Intellectual Functions. It was not a complicated process. You attached sixteen electrodes to the scalp of the person you wanted to test. You had to muss up his hair a little, but you didn't have to cut it, much less shave it. Then you had him stare at a marker on a blank wall. This particular researcher used a raspberry-red thumbtack. Then you pushed a toggle switch. In sixteen seconds the Cap's computer box gave you an accurate prediction (within one-half of a standard deviation) of what the subject would score on all eleven subtests of the Wechsler Adult Intelligence Scale or, in the case of children, the Wechsler Intelligence Scale for Children--all from sixteen seconds' worth of brain waves. There was nothing culturally biased about the test whatsoever. What could be cultural about staring at a thumbtack on a wall?
The savings in time and money were breathtaking. The conventional IQ test took two hours to complete; and the overhead, in terms of paying test-givers, test-scorers, test-preparers, and the rent, was $100 an hour at the very least. The IQ Cap required about fifteen minutes and sixteen seconds--it took about fifteen minutes to put the electrodes on the scalp--and about a tenth of a penny's worth of electricity. Neurometrics's investors were rubbing their hands and licking their chops. They were about to make a killing.
In fact--nobody wanted their damnable IQ Cap!
It wasn't simply that no one believed you could derive IQ scores from brain waves--it was that nobody wanted to believe it could be done. Nobody wanted to believe that human brainpower is...that hardwired. Nobody wanted to learn in a flash that...the genetic fix is in. Nobody wanted to learn that he was...a hardwired genetic mediocrity...and that the best he could hope for in this Trough of Mortal Error was to live out his mediocre life as a stress-free dim bulb. Barry Sterman of UCLA, chief scientist for a firm called Cognitive Neurometrics, who has devised his own brain-wave technology for market research and focus groups, regards brain-wave IQ testing as possible--but in the current atmosphere you "wouldn't have a Chinaman's chance of getting a grant" to develop it.
Science is a court from which there is no appeal
Here we begin to sense the chill that emanates from the hottest field in the academic world. The unspoken and largely unconscious premise of the wrangling over neuroscience's strategic high ground is: We now live in an age in which science is a court from which there is no appeal. And the issue this time around, at the end of the twentieth century, is not the evolution of the species, which can seem a remote business, but the nature of our own precious inner selves.
The elders of the field, such as Wilson, are well aware of all this and are cautious, or cautious compared to the new generation. Wilson still holds out the possibility--I think he doubts it, but he still holds out the possibility--that at some point in evolutionary history, culture began to influence the development of the human brain in ways that cannot be explained by strict Darwinian theory. But the new generation of neuroscientists are not cautious for a second. In private conversations, the bull sessions, as it were, that create the mental atmosphere of any hot new science--and I love talking to these people--they express an uncompromising determinism.
They start with the most famous statement in all of modern philosophy, Descartes's "Cogito ergo sum," "I think, therefore I am," which they regard as the essence of "dualism," the old-fashioned notion that the mind is something distinct from its mechanism, the brain and the body. (I will get to the second most famous statement in a moment.) This is also known as the "ghost in the machine" fallacy, the quaint belief that there is a ghostly "self" somewhere inside the brain that interprets and directs its operations. Neuroscientists involved in three-dimensional electroencephalography will tell you that there is not even any one place in the brain where consciousness or self-consciousness (Cogito ergo sum) is located. This is merely an illusion created by a medley of neurological systems acting in concert. The young generation takes this yet one step further. Since consciousness and thought are entirely physical products of your brain and nervous system--and since your brain arrived fully imprinted at birth--what makes you think you have free will? Where is it going to come from? What "ghost," what "mind," what "self," what "soul," what anything that will not be immediately grabbed by those scornful quotation marks, is going to bubble up your brain stem to give it to you? I have heard neuroscientists theorize that, given computers of sufficient power and sophistication, it would be possible to predict the course of any human being's life moment by moment, including the fact that the poor devil was about to shake his head over the very idea. I doubt that any Calvinist of the sixteenth century ever believed so completely in predestination as these, the hottest and most intensely rational young scientists in the United States at the end of the twentieth.
Since the late 1970s, in the Age of Wilson, college students have been heading into neuroscience in job lots. The Society for Neuroscience was founded in 1970 with 1,100 members. Today, one generation later, its membership exceeds 26,000. The Society's latest convention, in San Diego, drew 23,052 souls, making it one of the biggest professional conventions in the country. In the venerable field of academic philosophy, young faculty members are jumping ship in embarrassing numbers and shifting into neuroscience. They are heading for the laboratories. Why wrestle with Kant's God, Freedom, and Immortality when it is only a matter of time before neuroscience, probably through brain imaging, reveals the actual physical mechanism that sends these mental constructs, these illusions, synapsing up into the Broca's and Wernicke's areas of the brain?
Which brings us to the second most famous statement in all of modern philosophy: Nietzsche's "God is dead." The year was 1882. (The book was Die Fröhliche Wissenschaft [The Gay Science].) Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a "tremendous event," the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. But before you atheists run up your flags of triumph, he said, think of the implications. "The story I have to tell," wrote Nietzsche, "is the history of the next two centuries." He predicted (in Ecce Homo) that the twentieth century would be a century of "wars such as have never happened on earth," wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be racked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: "If the doctrines...of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly"--he says in an allusion to Darwinism in Untimely Meditations--"are hurled into the people for another generation...then nobody should be surprised when...brotherhoods with the aim of the robbery and exploitation of the non-brothers...will appear in the arena of the future."
Nietzsche's view of guilt, incidentally, is also that of neuroscientists a century later. They regard guilt as one of those tendencies imprinted in the brain at birth. In some people the genetic work is not complete, and they engage in criminal behavior without a twinge of remorse--thereby intriguing criminologists, who then want to create Violence Initiatives and hold conferences on the subject.
Nietzsche said that mankind would limp on through the twentieth century "on the mere pittance" of the old decaying God-based moral codes. But then, in the twenty-first, would come a period more dreadful than the great wars, a time of "the total eclipse of all values" (in The Will to Power). This would also be a frantic period of "revaluation," in which people would try to find new systems of values to replace the osteoporotic skeletons of the old. But you will fail, he warned, because you cannot believe in moral codes without simultaneously believing in a god who points at you with his fearsome forefinger and says "Thou shalt" or "Thou shalt not."
Why should we bother ourselves with a dire prediction that seems so far-fetched as "the total eclipse of all values"? Because of man's track record, I should think. After all, in Europe, in the peaceful decade of the 1880s, it must have seemed even more far-fetched to predict the world wars of the twentieth century and the barbaric brotherhoods of Nazism and Communism. Ecce vates! Ecce vates! Behold the prophet! How much more proof can one demand of a man's powers of prediction?
A hundred years ago those who worried about the death of God could console one another with the fact that they still had their own bright selves and their own inviolable souls for moral ballast and the marvels of modern science to chart the way. But what if, as seems likely, the greatest marvel of modern science turns out to be brain imaging? And what if, ten years from now, brain imaging has proved, beyond any doubt, that not only Edward O. Wilson but also the young generation are, in fact, correct?
The elders, such as Wilson himself and Daniel C. Dennett, the author of Darwin's Dangerous Idea: Evolution and the Meanings of Life, and Richard Dawkins, author of The Selfish Gene and The Blind Watchmaker, insist that there is nothing to fear from the truth, from the ultimate extension of Darwin's dangerous idea. They present elegant arguments as to why neuroscience should in no way diminish the richness of life, the magic of art, or the righteousness of political causes, including, of all things, political correctness at Harvard or Tufts, where Dennett is Director of the Center for Cognitive Studies, or Oxford, where Dawkins is something called Professor of Public Understanding of Science. (Dennett and Dawkins, every bit as much as Wilson, are earnestly, feverishly, politically correct.) Despite their best efforts, however, neuroscience is not rippling out into the public on waves of scholarly reassurance. But rippling out it is, rapidly. The conclusion people out beyond the laboratory walls are drawing is: The fix is in! We're all hardwired! That, and: Don't blame me! I'm wired wrong!
From nurture to nature
This sudden switch from a belief in Nurture, in the form of social conditioning, to Nature, in the form of genetics and brain physiology, is the great intellectual event, to borrow Nietzsche's term, of the late twentieth century. Up to now the two most influential ideas of the century have been Marxism and Freudianism. Both were founded upon the premise that human beings and their "ideals"--Marx and Freud knew about quotation marks, too--are completely molded by their environment. To Marx, the crucial environment was one's social class; "ideals" and "faiths" were notions foisted by the upper orders upon the lower as instruments of social control. To Freud, the crucial environment was the Oedipal drama, the unconscious sexual plot that was played out in the family early in a child's existence. The "ideals" and "faiths" you prize so much are merely the parlor furniture you feature for receiving your guests, said Freud; I will show you the cellar, the furnace, the pipes, the sexual steam that actually runs the house. By the mid-1950s even anti-Marxists and anti-Freudians had come to assume the centrality of class domination and Oedipally conditioned sexual drives. On top of this came Pavlov, with his "stimulus-response bonds," and B. F. Skinner, with his "operant conditioning," turning the supremacy of conditioning into something approaching a precise form of engineering.
So how did this brilliant intellectual fashion come to so screeching and ignominious an end?
The demise of Freudianism can be summed up in a single word: lithium. In 1949 an Australian psychiatrist, John Cade, gave five days of lithium therapy--for entirely the wrong reasons--to a fifty-one-year-old mental patient who was so manic-depressive, so hyperactive, unintelligible, and uncontrollable, he had been kept locked up in asylums for twenty years. By the sixth day, thanks to the lithium buildup in his blood, he was a normal human being. Three months later he was released and lived happily ever after in his own home. This was a man who had been locked up and subjected to two decades of Freudian logorrhea to no avail whatsoever. Over the next twenty years antidepressant and tranquilizing drugs completely replaced Freudian talk-talk as treatment for serious mental disturbances. By the mid-1980s, neuroscientists looked upon Freudian psychiatry as a quaint relic based largely upon superstition (such as dream analysis--dream analysis!), like phrenology or mesmerism. In fact, among neuroscientists, phrenology now has a higher reputation than Freudian psychiatry, since phrenology was in a certain crude way a precursor of electroencephalography. Freudian psychiatrists are now regarded as old crocks with sham medical degrees, as ears with wire hairs sprouting out of them that people with more money than sense can hire to talk into.
Marxism was finished off even more suddenly--in a single year, 1973--with the smuggling out of the Soviet Union and the publication in France of the first of the three volumes of Aleksandr Solzhenitsyn's The Gulag Archipelago. Other writers, notably the British historian Robert Conquest, had already exposed the Soviet Union's vast network of concentration camps, but their work was based largely on the testimony of refugees, and refugees were routinely discounted as biased and bitter observers. Solzhenitsyn, on the other hand, was a Soviet citizen, still living on Soviet soil, a zek himself for eleven years, zek being Russian slang for concentration camp prisoner. His credibility had been vouched for by none other than Nikita Khrushchev, who in 1962 had permitted the publication of Solzhenitsyn's novella of the gulag, One Day in the Life of Ivan Denisovich, as a means of cutting down to size the daunting shadow of his predecessor Stalin. "Yes," Khrushchev had said in effect, "what this man Solzhenitsyn has to say is true. Such were Stalin's crimes." Solzhenitsyn's brief fictional description of the Soviet slave labor system was damaging enough. But The Gulag Archipelago, a two-thousand-page, densely detailed, nonfiction account of the Soviet Communist Party's systematic extermination of its enemies, real and imagined, of its own countrymen, by the tens of millions through an enormous, methodical, bureaucratically controlled "human sewage disposal system," as Solzhenitsyn called it--The Gulag Archipelago was devastating. After all, this was a century in which there was no longer any possible ideological detour around the concentration camp. Among European intellectuals, even French intellectuals, Marxism collapsed as a spiritual force immediately.
Ironically, it survived longer in the United States before suffering a final, merciful coup de grâce on November 9, 1989, with the breaching of the Berlin Wall, which signaled in an unmistakable fashion what a debacle the Soviets' seventy-two-year field experiment in socialism had been. (Marxism still hangs on, barely, acrobatically, in American universities in a Mannerist form known as Deconstruction, a literary doctrine that depicts language itself as an insidious tool used by The Powers That Be to deceive the proles and peasants.)
Freudianism and Marxism--and with them, the entire belief in social conditioning--were demolished so swiftly, so suddenly, that neuroscience has surged in, as if into an intellectual vacuum. Nor do you have to be a scientist to detect the rush.
Anyone with a child in school knows the signs all too well. I have children in school, and I am intrigued by the faith parents now invest--the craze began about 1990--in psychologists who diagnose their children as suffering from a defect known as attention deficit disorder, or ADD. Of course, I have no way of knowing whether this "disorder" is an actual, physical, neurological condition or not, but neither does anybody else in this early stage of neuroscience. The symptoms of this supposed malady are always the same. The child, or, rather, the boy--forty-nine out of fifty cases are boys--fidgets around in school, slides off his chair, doesn't pay attention, distracts his classmates during class, and performs poorly. In an earlier era he would have been pressured to pay attention, work harder, show some self-discipline. To parents caught up in the new intellectual climate of the 1990s, that approach seems cruel, because my little boy's problem is...he's wired wrong! The poor little tyke--the fix has been in since birth! Invariably the parents complain, "All he wants to do is sit in front of the television set and watch cartoons and play Sega Genesis." For how long? "How long? For hours at a time." Hours at a time; as any young neuroscientist will tell you, that boy may have a problem, but it is not an attention deficit.
Nevertheless, all across America we have the spectacle of an entire generation of little boys, by the tens of thousands, being dosed up on ADD's magic bullet of choice, Ritalin, the CIBA-Geneva Corporation's brand name for the stimulant methylphenidate. I first encountered Ritalin in 1966 when I was in San Francisco doing research for a book on the psychedelic or hippie movement. A certain species of the genus hippie was known as the Speed Freak, and a certain strain of Speed Freak was known as the Ritalin Head. The Ritalin Heads loved Ritalin. You'd see them in the throes of absolute Ritalin raptures...Not a wiggle, not a peep...They would sit engrossed in anything at all...a manhole cover, their own palm wrinkles...indefinitely...through shoulda-been mealtime after mealtime...through raging insomnias...Pure methylphenidate nirvana...From 1990 to 1995, CIBA-Geneva's sales of Ritalin rose 600 percent; and not because of the appetites of subsets of the species Speed Freak in San Francisco, either. It was because an entire generation of American boys, from the best private schools of the Northeast to the worst sludge-trap public schools of Los Angeles and San Diego, was now strung out on methylphenidate, diligently doled out to them every day by their connection, the school nurse. America is a wonderful country! I mean it! No honest writer would challenge that statement! The human comedy never runs out of material! It never lets you down!
Meantime, the notion of a self--a self who exercises self-discipline, postpones gratification, curbs the sexual appetite, stops short of aggression and criminal behavior--a self who can become more intelligent and lift itself to the very peaks of life by its own bootstraps through study, practice, perseverance, and refusal to give up in the face of great odds--this old-fashioned notion (what's a bootstrap, for God's sake?) of success through enterprise and true grit is already slipping away, slipping away...slipping away...The peculiarly American faith in the power of the individual to transform himself from a helpless cipher into a giant among men, a faith that ran from Emerson ("Self-Reliance") to Horatio Alger's Luck and Pluck stories to Dale Carnegie's How to Win Friends and Influence People to Norman Vincent Peale's The Power of Positive Thinking to Og Mandino's The Greatest Salesman in the World--that faith is now as moribund as the god for whom Nietzsche wrote an obituary in 1882. It lives on today only in the decrepit form of the "motivational talk," as lecture agents refer to it, given by retired football stars such as Fran Tarkenton to audiences of businessmen, most of them woulda-been athletes (like the author of this article), about how life is like a football game. "It's late in the fourth period and you're down by thirteen points and the Cowboys got you hemmed in on your own one-yard line and it's third and twenty-three. Whaddaya do?..."
Sorry, Fran, but it's third and twenty-three and the genetic fix is in, and the new message is now being pumped out into the popular press and onto television at a stupefying rate. Who are the pumps? They are a new breed who call themselves "evolutionary psychologists." You can be sure that twenty years ago the same people would have been calling themselves Freudian; but today they are genetic determinists, and the press has a voracious appetite for whatever they come up with.
The most popular study currently--it is still being featured on television news shows, months later--is David Lykken and Auke Tellegen's study at the University of Minnesota of two thousand twins that shows, according to these two evolutionary psychologists, that an individual's happiness is largely genetic. Some people are hardwired to be happy and some are not. Success (or failure) in matters of love, money, reputation, or power is transient stuff; you soon settle back down (or up) to the level of happiness you were born with genetically. Three months ago Fortune devoted a long takeout, elaborately illustrated, to a study by evolutionary psychologists at Britain's University of Saint Andrews showing that you judge the facial beauty or handsomeness of people you meet not by any social standards of the age you live in but by criteria hardwired in your brain from the moment you were born. Or, to put it another way, beauty is not in the eye of the beholder but embedded in his genes. In fact, today, in the year 1996, barely three years before the end of the millennium, if your appetite for newspapers, magazines, and television is big enough, you will quickly get the impression that there is nothing in your life, including the fat content of your body, that is not genetically predetermined. If I may mention just a few things the evolutionary psychologists have illuminated for me over the past two months:
The male of the human species is genetically hardwired to be polygamous, i.e., unfaithful to his legal mate. Any magazine-reading male gets the picture soon enough. (Three million years of evolution made me do it!) Women lust after male celebrities, because they are genetically hardwired to sense that alpha males will take better care of their offspring. (I'm just a lifeguard in the gene pool, honey.) Teenage girls are genetically hardwired to be promiscuous and are as helpless to stop themselves as dogs in the park. (The school provides the condoms.) Most murders are the result of genetically hardwired compulsions. (Convicts can read, too, and they report to the prison psychiatrist: "Something came over me...and then the knife went in.")
Where does that leave self-control? Where, indeed, if people believe this ghostly self does not even exist, and brain imaging proves it, once and for all?
So far, neuroscientific theory is based largely on indirect evidence, from studies of animals or of how a normal brain changes when it is invaded (by accidents, disease, radical surgery, or experimental needles). Darwin II himself, Edward O. Wilson, has only limited direct knowledge of the human brain. He is a zoologist, not a neurologist, and his theories are extrapolations from the exhaustive work he has done in his specialty, the study of insects. The French surgeon Paul Broca discovered Broca's area, one of the two speech centers of the left hemisphere of the brain, only after one of his patients suffered a stroke. Even the PET scan and the PET reporter gene/PET reporter probe are technically medical invasions, since they require the injection of chemicals or viruses into the body. But they offer glimpses of what the noninvasive imaging of the future will probably look like. A neuroradiologist can read a list of topics out loud to a person being given a PET scan, topics pertaining to sports, music, business, history, whatever, and when he finally hits one the person is interested in, a particular area of the cerebral cortex actually lights up on the screen. Eventually, as brain imaging is refined, the picture may become as clear and complete as those see-through exhibitions, at auto shows, of the inner workings of the internal combustion engine. At that point it may become obvious to everyone that all we are looking at is a piece of machinery, an analog chemical computer, that processes information from the environment. "All," since you can look and look and you will not find any ghostly self inside, or any mind, or any soul.
Thereupon, in the year 2006 or 2026, some new Nietzsche will step forward to announce: "The self is dead"--except that being prone to the poetic, like Nietzsche I, he will probably say: "The soul is dead." He will say that he is merely bringing the news, the news of the greatest event of the millennium: "The soul, that last refuge of values, is dead, because educated people no longer believe it exists." Unless the assurances of the Wilsons and the Dennetts and the Dawkinses also start rippling out, the lurid carnival that will ensue may make the phrase "the total eclipse of all values" seem tame.
The two most fascinating riddles of the twenty-first century
If I were a college student today, I don't think I could resist going into neuroscience. Here we have the two most fascinating riddles of the twenty-first century: the riddle of the human mind and the riddle of what happens to the human mind when it comes to know itself absolutely. In any case, we live in an age in which it is impossible and pointless to avert your eyes from the truth.
Ironically, said Nietzsche, this unflinching eye for truth, this zest for skepticism, is the legacy of Christianity (for complicated reasons that needn't detain us here). Then he added one final and perhaps ultimate piece of irony in a fragmentary passage in a notebook shortly before he lost his mind (to the late nineteenth century's great venereal scourge, syphilis). He predicted that eventually modern science would turn its juggernaut of skepticism upon itself, question the validity of its own foundations, tear them apart, and self-destruct. I thought about that in the summer of 1994 when a group of mathematicians and computer scientists held a conference at the Santa Fe Institute on "Limits to Scientific Knowledge." The consensus was that since the human mind is, after all, an entirely physical apparatus, a form of computer, the product of a particular genetic history, it is finite in its capabilities. Being finite, hardwired, it will probably never have the power to comprehend human existence in any complete way. It would be as if a group of dogs were to call a conference to try to understand The Dog. They could try as hard as they wanted, but they wouldn't get very far. Dogs can communicate only about forty notions, all of them primitive, and they can't record anything. The project would be doomed from the start. The human brain is far superior to the dog's, but it is limited nonetheless. So any hope of human beings arriving at some final, complete, self-enclosed theory of human existence is doomed, too.
This, science's Ultimate Skepticism, has been spreading ever since then. Over the past two years even Darwinism, a sacred tenet among American scientists for the past seventy years, has been beset by...doubts. Scientists--not religiosi--notably the mathematician David Berlinski ("The Deniable Darwin," Commentary, June 1996) and the biochemist Michael Behe (Darwin's Black Box, 1996), have begun attacking Darwinism as a mere theory, not a scientific discovery, a theory woefully unsupported by fossil evidence and featuring, at the core of its logic, sheer mush. (Dennett and Dawkins, for whom Darwin is the Only Begotten, the Messiah, are already screaming. They're beside themselves, utterly apoplectic. Wilson, the giant, keeping his cool, has remained above the battle.) By 1990 the physicist Petr Beckmann of the University of Colorado had already begun going after Einstein. He greatly admired Einstein for his famous equation of matter and energy, E=mc², but called his theory of relativity mostly absurd and grotesquely untestable. Beckmann died in 1993. His Fool Killer's cudgel has been taken up by Howard Hayden of the University of Connecticut, who has many admirers among the upcoming generation of Ultimately Skeptical young physicists. The scorn the new breed heaps upon quantum mechanics ("has no real-world applications"..."depends entirely on fairies sprinkling goofball equations in your eyes"), Unified Field Theory ("Nobel worm bait"), and the Big Bang Theory ("creationism for nerds") has become withering. If only Nietzsche were alive! He would have relished every minute of it!
Recently I happened to be talking to a prominent California geologist, and she told me: "When I first went into geology, we all thought that in science you create a solid layer of findings, through experiment and careful investigation, and then you add a second layer, like a second layer of bricks, all very carefully, and so on. Occasionally some adventurous scientist stacks the bricks up in towers, and these towers turn out to be insubstantial and they get torn down, and you proceed again with the careful layers. But we now realize that the very first layers aren't even resting on solid ground. They are balanced on bubbles, on concepts that are full of air, and those bubbles are being burst today, one after the other."
I suddenly had a picture of the entire astonishing edifice collapsing and modern man plunging headlong back into the primordial ooze. He's floundering, sloshing about, gulping for air, frantically treading ooze, when he feels something huge and smooth swim beneath him and boost him up, like some almighty dolphin. He can't see it, but he's much impressed. He names it God.
Tom Wolfe has chronicled American popular culture for more than three decades. His best-selling books include The Electric Kool-Aid Acid Test, The Right Stuff, and The Bonfire of the Vanities.
This article originally appeared on the Tetrica website. Reprinted with permission.
Copyright © 2001-2008 OrthodoxyToday.org. All rights reserved.
Good News, Your Soul Hasn't Died Quite Yet
Alfred J. Freddoso
University of Notre Dame
My title is inspired by Tom Wolfe's celebrated essay, "Sorry, But Your Soul Just Died," originally published in 1996.(1) Fascinated by the eager horde of young scientists currently devoting themselves to research on the human brain, Wolfe reports how these enthusiasts have been convinced by technological advances that the human 'mind' involves nothing over and beyond the brain and that, in addition, the wellsprings of human behavior are to a hitherto unimagined extent the result of genetic hard-wiring and not of environmental factors such as, among other things, 'free will'--where 'mind' and 'free will', along with other venerable philosophical terms such as 'the self' and 'the soul', are all surrounded by what Wolfe aptly calls "skeptical quotation marks." (Those of us in the know will immediately recognize these terms as hallmarks of what eliminative materialists derisively label 'folk psychology'.) Even though Wolfe takes care to chronicle the cultural resistance to these claims, in the end he envisions a future Nietzsche announcing, not this time the death of God, but instead the death of the soul, and with it any lingering belief in freedom or immortality. God, freedom, and immortality--all jettisoned by Nietzsche and his imminent successor. So much for metaphysics as we have known it.
Below I will urge that the defective philosophical problematic at stake here has not essentially changed since it was set by Hobbesian physicalism (or materialism) on the one side and Cartesian dualism on the other. But this point, while valid enough, does not do justice to the gravity of our current predicament. For in this case philosophical assumptions, flawed though they be, have had a widespread cultural influence on the way we regard ourselves and on the way we behave toward one another. As Wolfe points out, in announcing the death of God Nietzsche was not so much proclaiming his own atheism as delivering the news that the educated elite no longer believed in God. So, too, the new Nietzsche will be announcing that the educated elite no longer believe in the soul or the self as conceived of by traditional western philosophy, religion, and popular culture. Still, Nietzsche had warned that disbelief in God would come at a price much steeper than its champions commonly recognized--a monition sufficiently borne out by the unprecedented human carnage of the twentieth century. Likewise, if the only alternative to a physicalist disbelief in the soul is Cartesian dualism, then it is arguable that no matter which of the two sides we line up on, we will be unable to articulate the metaphysical underpinnings of a philosophical anthropology that does justice both to our obvious dignity and distinctiveness as human beings and to our equally obvious continuity with other animals and, indeed, with the rest of the physical universe. Instead, we are likely to find ourselves oscillating incongruously between the poles of the "angelism-bestialism" syndrome described so powerfully (and so entertainingly) by Walker Percy.(2) Dualism denies our essential embodiedness and animality, while physicalism cannot account for the self that tries to study itself. 
In both cases, the human self is isolated from its physical surroundings and can very easily come to see itself as the sole source of whatever value and significance is had by an otherwise meaningless corporeal reality. In short, the last four centuries, despite the splendor of their scientific achievements, seem to have left many in confusion about just what human beings are and about just what value to assign to human life and action--a confusion which contemporary political liberalism simply (but not coincidentally) takes for granted, and the results of which are being played out in what Pope John Paul II has called the "culture of death."(3) In light of these developments, it is no surprise that the Holy Father should have begun his encyclical Fides et Ratio by invoking once again the Delphic prescription, "Know thyself," and that he should have gone on to urge Catholic thinkers to undertake with renewed vigor the project of philosophical anthropology.(4)
We might note in passing that this project is intimately tied to the thesis that knowledge is unified--a thesis which lies at the heart not only of Fides et Ratio but also of Ex Corde Ecclesiae, where the Holy Father puts the integration of all knowledge at the center of the ideal distinctiveness of Catholic higher education.(5) At a time of increasing fragmentation of the academic disciplines and in the face of widespread skepticism about the very project of bringing them together into a unified framework, he notes that each of the disciplines in its own way studies the human animal, so that a comprehensive philosophical anthropology must draw from all of them. Consequently, nothing I say below should be taken to disparage the natural and human sciences. Still, we must separate the sciences themselves from the anti-supernaturalist ideologies evident in the writings of certain prominent scientists and 'scientifically-minded' philosophers.
What we need, then, is the conceptual space to forge a metaphysical alternative to physicalism and dualism that gives us some hope of overcoming the intellectual, moral, and social consequences of them both.
2. Methodological preliminaries
In this case, old news is good news. And by 'good news' here I do not primarily mean Aristotelianism, though in my opinion some version of it has a crucial role to play. I mean rather the Good News of the Gospel. As Gaudium et Spes puts it in a passage cited again and again by the present Pontiff, "only in the mystery of the incarnate Word does the mystery of man take on light."(6) And the "mystery of man" involves not only the story of our salvation, but our very nature as human beings, a topic that the Church has repeatedly addressed in her official teaching. In Fides et Ratio the Holy Father cites one recent example:
"For its part, the Second Vatican Council offers a rich and fertile teaching concerning philosophy. We cannot forget, especially given the topics considered in this Encyclical Letter, that one whole chapter of the constitution Gaudium et Spes is a sort of compendium of biblical anthropology and serves as a source of understanding for philosophy as well. These pages speak of the value of the human person created in the image of God, explain his dignity and preeminence over other creatures, and highlight the transcendent capacity of his reason."(7)
The Holy Father had noted earlier that while many pronouncements of the Church's Magisterium on philosophical matters have been aimed at forestalling the damaging effects on the faithful of mistaken philosophical claims and theories, these pronouncements "are intended above all to challenge, promote, and stimulate philosophical investigations."(8) The passage just quoted from Fides et Ratio is a clear invitation to Catholic thinkers to construct philosophical anthropologies consonant with the Church's teaching.
But how shall we proceed? In another place I have argued that far from being bound by the hoary distinction between philosophers and theologians, Catholic philosophers ought to see themselves first and foremost as architects of comprehensive systems of wisdom that take revealed truths as first principles.(9) To be sure, mainstream Catholic thought has always maintained a qualified optimism about the power of natural reason to establish some important metaphysical and moral truths with a high degree of certitude, and one traditional task of Catholic philosophers has been to engage intellectuals who for one reason or another do not accept Christian revelation.(10) In Fides et Ratio, for example, "the concept of the person as a free and intelligent subject, with the capacity to know God, truth, and goodness" is itself counted as part of the "core of philosophical insight" that has been arrived at without the help of revelation.(11) Nonetheless, this optimism about natural reason is qualified precisely because for various reasons--some cognitive and some affective--what should be evident or plausible to natural reason is not always recognized as such without the gift of divine revelation.
Just such a situation exists, I submit, with regard to the metaphysical dimensions of contemporary analytic 'philosophy of mind'. The field is exhaustively divided between a handful of Cartesian dualists on the one hand and numerous physicalists (or materialists) of various stripes on the other, and for several decades the main action has been occurring in the intramural debates among the physicalists. Remarkably, some Christian philosophers, perhaps wary of the embarrassment of being labeled as dualists and thereby excluded from conversations with most non-believing philosophers and scientists, have suggested that Christians themselves should simply dispense with the soul altogether--or at least with talk of the soul. This, I am told by a reliable source, was the advice given to bishops by certain consultants at a recent meeting of the Committee on Science and Human Values of the United States Conference of Catholic Bishops. Acknowledging, in response to the pastoral concerns expressed by some of the bishops, that the designation 'materialist' might sound disconcerting to ordinary lay people, these thinkers suggested that Catholics call themselves "non-reductive physicalists" instead, all the while maintaining a discreet silence about the soul itself and about its immateriality and immortality. And all this, the bishops were assured, is consonant with the Catholic Faith. After all, the story goes, what Catholics really believe in is not the immortality of the soul--a vestige of Hellenism that St. Augustine and St. Thomas, unlike we ourselves, were prevented from spotting as counterfeit by culturally-induced blindness--but rather the resurrection of the body. And this latter doctrine is (well, to be honest, may be) compatible with physicalism.(12)
It is precisely episodes of this sort that lead me to insist that Catholic intellectuals should begin their inquiry into philosophical anthropology by carefully studying the teachings of the Church and allowing those teachings to play a normative role along with natural reason and experience. In fact, appeals to the teachings of the Church should in theory have preeminence, even if we grant that reason and experience must play a significant role in the interpretation of the sources of revelation. One danger here, characteristic of the excessively rationalistic tendencies endemic to certain strains of Catholic thought over the last two centuries, is that appeals to reason and experience will be allowed to trump the plain sense of authoritative biblical and ecclesiastical pronouncements. In other words, appeals to reason and experience are freely allowed to serve as checks on the interpretation of the sources of revelation, but the sources of revelation are not accorded the same power to check claims putatively founded on reason and experience. Instead, much too early in the game revealed doctrines are either abandoned or subjected to debilitating interpretations that empty them of their power to inspire and challenge young people and to fortify Christian witness. A second danger, closely linked with the first, is that philosophical problematics as defined by current intellectual elites will themselves fill the vacuum and come to provide indefeasible norms for inquiry. Once this happens, judgments of plausibility are altered in subtle ways that can lead us to make unwarranted concessions. In short, we run the risk of letting the current problematics shape and revise our own agenda even when, as in the case at hand, it is precisely the current problematic that stands in need of radical revision from a Catholic perspective.
We will be able to make intelligent assessments of faulty problematics and to offer credible suggestions for revising them only to the extent that our own first principles, including revealed first principles, are clearly understood and systematically elaborated.(13) This is simply an application to philosophy of the claim that it is theories as wholes which are the proper units of assessment in scientific inquiry. For aside from considerations of mere logical validity, the acceptability of central theoretical arguments depends almost entirely on those judgments of plausibility--what Newman called "antecedent probabilities" and what many analytic philosophers call "our intuitions"--that we bring to the assessment of key premises. But such judgments depend heavily on first principles. Perhaps much of any authentically Catholic anthropology can be argued for persuasively without recourse to revelation; perhaps not. But in order to find out, we first have to develop such a theory systematically in light of revealed principles. What's more, this procedure has the added advantage of forcing each of us to clarify with precision and to defend our interpretations of the deliverances of faith, instead of hiding behind ambiguity.(14)
In what follows I will examine the teachings of the Church on the human soul and comment briefly on certain key elements in the contemporary problematic in philosophy of mind.
3. The analogia fidei
I know from unpleasant experience that one who defends the plain sense of the Church's teaching on the human soul may expect to meet with fierce resistance from certain Catholic quarters.(15) Apparently, in the eyes of some it is permissible for the philosophically and theologically unsophisticated faithful to use "soul-language" in liturgical contexts and even within more straightforwardly doctrinal contexts, as long as they do not take such language with metaphysical seriousness. One is reminded of Averroes's attitude toward the miracle stories in the Quran: "Those stories are necessary for edifying the unlearned masses and fortifying their religious and moral practice, but we philosophers, who need no such props, know better, of course." I can imagine someone characterizing, say, the Feast of All Souls in just such a condescending way. (In fact, I don't have to imagine it; I've actually heard it.)
Special scorn seems to be reserved for Pope Pius XII's encyclical Humani Generis, which insists on the direct creation of the human soul by God and cautions that monogenism provides the only metaphysical background clearly consonant with the Church's teaching about the nature and transmission of original sin.(16)
The scorn notwithstanding, these points should remind us of the tight interconnectedness among central Catholic doctrines, the so-called analogia fidei. Consequently, we should anticipate that a denial of the immateriality and immortality of the soul will have wide-reaching consequences for the rest of Catholic doctrine.
This expectation is borne out by a perusal of the Catechism of the Catholic Church and of those many official creeds, conciliar statements, and papal teachings the Catechism draws from.(17) While none of these documents, including the Catechism, is meant to be a work of philosophy, together they are meant to lay out the first principles of the Catholic Faith in such a way as to make them both accessible to the ordinary faithful and also amenable to further unification and systematization by intellectually sophisticated believers. To be sure, a careful interpretation of Scripture and Tradition will require us to distinguish various degrees of theological certitude and to respect the freedom of intellectuals to speculate where the Church has not spoken definitively.(18) Still, we must exercise even greater care not to split the Faith into two Faiths, one for the vulgar and one for the learned, to use Berkeley's terms. There is just one Faith, and in the first place it is the Faith of the martyrs. We should keep this in mind as we explore the teachings of the Church on the soul.
4. Church teaching and the soul: direct creation
Let's start at the beginning. The Catechism traces the special dignity of human beings to the soul and its distinctive operations:
"With his openness to truth and beauty, his sense of moral goodness, his freedom and the voice of his conscience, with his longings for the infinite and for happiness, man questions himself about God's existence. In all this he discerns signs of his spiritual soul. The soul, the 'seed of eternity we bear in ourselves, irreducible to the merely material', can have its origin only in God."(19)
According to this statement, what is distinctive about the human soul is that it "can have its origin only in God" and that it is "irreducible to the merely material." These properties are enunciated more explicitly later on: "The Church teaches that every spiritual soul is created immediately by God--it is not 'produced' by the parents--and also that it is immortal; it does not perish when it separates from the body at death, and it will be reunited with the body at the final Resurrection."(20)
Before I comment on these two teachings separately, I want to point out that openness to truth and beauty, along with the other distinctive characteristics mentioned above, are said to be signs of the spiritual soul; it is clear that they are not exhaustive of its metaphysical reality. The soul itself is what confers human dignity, both because of its nature and powers and because of its origin, even when its characteristic operations are impeded by physical abnormalities or ailments--as, for instance, in the case of the severely mentally handicapped like my affable next-door neighbor Jack Spillner, a 56-year-old mongoloid who should have died forty years ago and celebrates his longevity by smoking prodigiously (and oftentimes my cigarettes, I might add). That it is the soul itself, given its origins, that confers human dignity is confirmed by the passage that inaugurates the discussion of the fifth commandment:
"Human life is sacred because from its beginning it involves the creative action of God and it remains for ever in a special relationship with the Creator, who is its sole end. God alone is the Lord of life from its beginning until its end: no one can under any circumstance claim for himself the right directly to destroy an innocent human being."(21)
Contrary to a suggestion once made to me in conversation, the teaching that the human soul is created directly or immediately by God means more than simply that God produces each human being as a singular entity qua singular--that is, in such a way that he intends to produce just that individual. In the Catholic intellectual tradition, it has been almost a truism that God produces every entity, including every effect of created or secondary causes, in just this way. In fact, Suarez cites God's unique ability to intend singular effects qua singular as an argument for the doctrine, which itself follows from the doctrines of creation and conservation, that God is an immediate concurring cause of every effect produced by secondary causes.(22) But neither can the teaching in question be taken to mean simply that God is a general concurring cause of each human soul. As just noted, this interpretation would once again not distinguish the human soul from any other of God's effects according to Catholic teaching; what's more, it is ruled out by the Catechism's explicit assertion that the parents, who are efficient causes of their child, do not produce the soul of their child. Rather, the teaching can mean only that some aspect of the organism effected by the parents--namely, his or her 'spiritual soul'--is not itself something that they directly produce. (Of course, the parents do not produce the ultimate particles that go into the constitution of their child, either. I assume that even Catholic 'non-reductive physicalists' would attribute the original creation of such particles directly to God.)
The teaching that God immediately creates the human soul embarrasses some scientifically-minded Catholic philosophers. The first thing to point out is that this does not distinguish it from many other Catholic doctrines, including the virgin birth, the miracles of Christ, his resurrection from the dead, and his real and substantial presence in the Sacrament of the Altar. And, in truth, the advance of scientific knowledge has less to do with it than does ideological naturalism and anti-supernaturalism of the sort that one can see in the most engaging popular presentations of current science--for instance, by Richard Dawkins, Stephen Hawking, and Stephen Jay Gould--and that has spilled over even to Christian thinkers in some cases. From such a perspective, the direct creation of the human soul by God is a leftover from an earlier era in which Darwinian evolution was unheard of and the complexity of the brain was not fully appreciated. In such times, so the story goes, the immaterial soul was postulated merely to fill in the gaps of woefully inadequate scientific theories--a 'soul of the gaps' to go along with the more famous 'god of the gaps' who was invoked to compensate for failures of explanation within the order of secondary causes.
In this connection, it is worth noting that while some early modern philosophers may have been guilty of making use of a 'god of the gaps', scholastic thinkers did not attribute the actions of creation, conservation, and general concurrence to God in order to make up for the deficiencies of their scientific explanations. Rather, they were spelling out--from above, as it were--the metaphysical implications of God's role as the Creator and First Cause of all being. Similarly, in the case of the soul, God's direct creation of the human soul is necessary on the assumption that the soul is both immaterial and non-eternal, since immaterial beings can come into existence only through creation ex nihilo.(23) Hence, the direct creation of the human soul is not posited to fill a gap in any biological or neurophysiological theory. Rather, it is a demand of the dignity and singular ontological status of the human animal. What's more, nothing we know about the nature of the brain or the evolution of the human organism rules out the direct creation of the human soul by God. Neuroscience may help us to understand various aspects of cognition and affection, but it can hardly be said to have explained how higher intellective functions are so much as possible. More importantly, as the Catechism passage quoted above adumbrates, the doctrine of the direct creation of the soul may well be the only metaphysical foundation that has a chance of enabling Christians to sustain within modern western cultures a firm and abiding respect for the sacredness of the lives of the unborn, the mentally handicapped, the elderly feeble-minded, the terminally ill, and other 'unproductive' and 'inconvenient' human beings.
5. Church teaching on the soul: immateriality
What of the claim that the 'spiritual soul' is irreducible to the merely material? As is clear from the full context of Catholic teaching, this must be stronger than the sort of irreducibility of so-called 'mental properties' to physical properties that is typically posited by the 'non-reductive' version of physicalism known as 'property dualism'.(24) I will have a bit more to say about this theory below, since some contemporary Catholic philosophers have adopted it as an attractive alternative to belief in an immaterial soul. For present purposes it is sufficient to note that according to property dualism human beings have no immaterial aspect or component that might survive in the interim between death and the general resurrection.(25) But this conflicts with a number of important Catholic doctrines that have intimate connections with the immateriality of the soul.
The most obvious of these doctrines are, of course, the immortality of the soul, enunciated clearly by the Catechism in one of the passages quoted above, and the doctrine that God creates the human soul directly. But this is merely the tip of the iceberg, since the immortality of the human soul is itself presupposed by a number of other important doctrines. Among them are the particular judgment of "each man in his immortal soul at the very moment of his death";(26) the existence of purgatory and, with it, the "piety and wholesomeness" of prayer for the holy souls in purgatory;(27) the intercession of the saints and their presence even now in the heavenly worship we participate in through the Mass;(28) the singularity of the privilege accorded to the Mother of God in being assumed body and soul into heaven;(29) the pre-risen Christ's descent into hell to liberate the souls of the patriarchs and other pre-Christian saints;(30) and even the very characterization of death as the separation of the soul from the body.(31) What's more, liturgical practice presupposes, reinforces, and invokes all these doctrines in one way or another. And I have not even touched on the many moral doctrines, especially those regarding the nature of grace and sin, that take for granted the higher powers of the soul that, as we saw above, the Church takes to be signs of its immateriality.
I suppose there are ways to alter the plain sense of such doctrines so as to make them compatible with disbelief in the human soul and in its immateriality and immortality. It is, as our political leaders are wont to remind us these days, a free country. But what would be the point of such an exercise? If there were a deep tension here between the apparent deliverances of reason and the apparent deliverances of faith, then there might be some excuse. But when we examine the quasi-philosophical ruminations of writers such as Dawkins and Gould, it is hard not to notice the extremely tenuous connection between their premises, which are usually drawn from the sciences, and their conclusions, which reflect an unmistakable drift in the direction of, as Dawkins puts it, an "intellectually fulfilling atheism" that has the deflation of Christian ideals and aspirations as one of its primary goals. And why should we buy into that? In short, finding out about the wondrous workings of the human brain or the intricacies of human genetics does not seem to create a conflict with faith unless the discoveries are combined with a strong physicalist ideology. Indeed, current scientific theories do not by themselves undermine even a carefully formulated Cartesian dualism, not to mention the Catholic Church's view of the soul. What's more, even though the doctrine of the immateriality of the soul entails that our higher cognitive and appetitive operations are not themselves operations of the brain, the anti-dualistic nature of the Catholic view of the human animal, to be spelled out below, should antecedently prepare us to expect that such higher operations will depend heavily on the normal functioning of the brain and central nervous system. So the fact that they are thus dependent and the discovery of the precise ways in which they are dependent are hardly an embarrassment for the Catholic perspective.
There is one last point about immateriality that is worth pondering. Some philosophers have insisted that it is no easier to understand how an immaterial subject can think than it is to understand how a material subject can think. Peter van Inwagen puts the point in a particularly forceful way:
".... it is the thinking itself that is the source of the mystery of a thinking physical thing. The notion of a non-physical thing that thinks is, I would argue, equally mysterious. How any sort of thing could think is a mystery. It is just that it is a bit easier to see that thinking is a mystery when we suppose that the thing that does the thinking is physical, for we can form mental images of the operations of a physical thing and we can see that the physical interactions that are represented in these images--that are the only interactions that can be represented in these images--have no connection with thought or sensation, or none that we are able to imagine, conceive, or articulate. The only reason that we do not readily find the notion of a non-physical thing that thinks equally mysterious is that we have no clear procedure for forming mental images of non-physical things."(32)
The traditional philosophical argument for the immateriality of higher intellective operations, usually attributed in its canonical form to Aristotle and recently resurrected in striking fashion by James Ross,(33) begins by pointing out the limitations of material representation. Aristotle, for instance, argues that the material construction of the human sensory organs accounts for the fact that each of those organs has an object limited both in kind and in extent, whereas higher intellective operations have an unlimited object. Ross, on the other hand, argues for the immateriality of intellective operations from our ability to understand what we mean despite the fact that material representations of our abstract thoughts do not disambiguate them from meanings distinct from, but closely related to, the ones we intend. In general, then, the argument is that, because of their peculiar nature, the indisputably distinctive human operations of theoretical and practical reason cannot have a material organ or medium and so, it would seem, require an immaterial medium instead--whereas this is not the case with sensation, imagination, feeling, memory, and the other cognitive and affective operations we share in common with animals that lack rationality. So because, in addition to its vegetative and sentient functions, the human soul has these higher intellective and volitional operations, it must be immaterial.
So the beginning of an answer to van Inwagen's contention is that higher cognitive and affective operations have a certain limitlessness and lucidity that bespeak immateriality. One is reminded of Ockham's assertion that mental acts signify naturally what spoken and written terms signify by convention through their subordination to mental acts. Within the Catholic intellectual tradition this point can be elaborated more fully by comparing human intellection with angelic (or intuitive) intellection, an exercise that anyone can engage in--witness Kant--but that is done better when one takes the existence of angelic beings and their roles in our lives as a metaphysical given.
I have spent this long on the existence, immateriality, and immortality of the human soul because nowadays physicalism seems to be a more tempting option to many Catholic thinkers than dualism is. But from a Catholic perspective dualism is just as wrongheaded and, in the end, just as pernicious as physicalism. Dualism treats body and soul as two separate substances or, at the very least, two antecedently constituted integral parts of an entity whose unity is per accidens; and it identifies the human self with just the immaterial soul. In this it runs afoul of the Catholic teaching that the soul is the form of the body and that the human body and the human soul are so intimately linked that they derive their identity from one another. Perhaps more precisely, the soul is the form of the human organism as a whole and, as such, makes it to be the sort of living substance it is. Thus, the human body and human soul are not two antecedently constituted integral parts, but rather (to use the scholastic phrase) complementary 'essential parts' of an organism whose unity is per se. The Catechism puts it this way:
"The unity of soul and body is so profound that one has to consider the soul to be the 'form' of the body: i.e., it is because of its spiritual soul that the body made of matter becomes a living, human body; spirit and matter, in man, are not two natures united, but rather their union forms a single nature."(34)
Some profess themselves unable to understand how it can be that a single organism or substance has both material and immaterial components, and as a consequence they treat the Church's account of the human being as a type of dualism. To be sure, the truth of this account depends on the acceptability of Aristotelian hylomorphism or at least something very close to it. I will not attempt to exhibit the philosophical virtues of hylomorphism, though I have argued elsewhere that it is no less viable today than ever as a philosophy of nature, notwithstanding mechanistic and reductionistic misunderstandings (and propaganda) to the contrary.(35) But it is important to remember that Aristotle treats the forms of living things at length precisely because living things exhibit the most impressive type of unity known to us in nature. And it is through the powers, tendencies, and operations of living things that we comprehend this unity and the principle (or form) that underlies it--regardless of whether we can form a clear image of this principle or of the unity of the organism. Given the coherence and plausibility of hylomorphism--at least in broad outline--the argument for immateriality is an eminently plausible, even if not absolutely compelling, line of reasoning. It does not, to be sure, solve every problem or illuminate every mystery surrounding the idea of an animal constituted as human by an immaterial form. But, then, as the copious recent literature in philosophy of mind attests, both dualism and physicalism are saddled with problems and mysteries of their own.
For present purposes, the important point is that the Aristotelian argument for immateriality cannot be turned into an argument for Cartesian dualism. First of all, this argument does not apply to merely sentient operations, whereas strict Cartesian dualism holds that all mental properties--sentient as well as intellective--must necessarily have an immaterial subject. Second, and more fundamentally, the Aristotelian argument is precisely that the higher and distinctive sort of unity exhibited by the human organism demands the immateriality of the form of that organism, whereas dualism destroys the unity of the human organism, positing an accidental composition of body and soul, neither of which draws its identity from the other. In the end, the Catholic position is that our higher intellective and affective operations, like their sentient and vegetative counterparts, are operations of the whole animal organism--and not of a quasi-angelic entity accidentally connected to a body, in the manner of the archangel Raphael in the book of Tobit.
What's more, the dualistic identification of the self with the soul is not only metaphysically implausible, but has the dire moral consequence of leaving the body bereft of more than merely external or instrumental moral significance, and this flies in the face of some of the most penetrating recent philosophical and literary work, including the work of Pope John Paul II, on the phenomenology and theology of the body.(36)
The upshot of the Church's position, then, is that neither dualism nor physicalism in any of its myriad forms is true. In this case, tertium datur. Human beings are unified substances with an immaterial formal principle. We are both continuous with other animals, because we share sentience with them, and distinctive among animals because of our higher intellective and affective powers. Neither angels nor beasts, but animals of a high order, and, if the Catholic faith is to be believed, animals specially honored by God in part, amazingly, because we have dishonored ourselves. Such are the first principles of the Church's teaching on the soul and the human being.
The contemporary problematic: the lure of property dualism
Yet when we turn to contemporary philosophical discussions in mainstream analytic philosophy of mind, we find no room at the inn for an account of the human being compatible with these first principles. I take as a representative summary of the current debate the treatment afforded it in Peter van Inwagen's Metaphysics, a book meant to introduce undergraduate philosophy majors or more mature general readers to its subject matter.(37) (In other words, it is the sort of book that, say, Tom Wolfe should have no trouble following.) Despite its introductory nature, however, van Inwagen's book is full of subtlety and sophistication, and is thus a good place to get a general sense of the contemporary problematic.
As van Inwagen sees it, the basic taxonomy of answers to the question "What kind of thing are we human beings?" contains just two contenders:
"The possible answers to the question we are trying to understand (at least the possible answers that are taken at all seriously today) are all forms of either dualism or physicalism."(38)
After explaining that 'physical' things are those made entirely of elementary particles, van Inwagen goes on to say:
"Let us call a 'non-physical' thing anything that has no parts that are physical things. The two classifications 'physical' and 'non-physical' are not exhaustive: an object composed of both physical things and non-physical things would be neither. We could call such an object 'composite'. I shall generally ignore the possibility of composites ....."(39)
Van Inwagen goes on to characterize dualism as the position according to which a human being is essentially a non-physical object though connected with a bodily 'organism', whereas physicalism is the position according to which a human being is an organism composed just of physical things. Notice here that what van Inwagen says about composites makes it clear that he is talking about integral parts; he does not even entertain the possibility that a human being might have complementary essential parts, one of which is a corporeal principle and the other incorporeal. In essence, he is simply accepting Descartes's own assessment of the situation--we are either essentially immaterial beings or essentially beings composed of just physical or corporeal parts. The debate then focuses, as it did in the seventeenth century, on the question of whether something composed of just physical things can be capable of thought, where 'thought' includes the whole gamut of sentient and intellective operations. Descartes concluded that such a thing is incapable of thought so understood, whereas van Inwagen joins the likes of Hobbes in disagreeing: "If human persons and human organisms are one and the same, then, since human organisms are obviously physical things, it follows that human persons are physical things."(40)
Notice the assumption here that human organisms are 'physical things', presumably in the sense defined--a sense that rules out the possibility that some organism might be constituted as an organism by complementary physical and immaterial principles.
This same assumption plays a central role in what van Inwagen tells us is the strongest argument for physicalism--namely, what he calls the 'duplication argument'. Imagine an elaborate science-fiction machine consisting of two chambers. When a physical object, defined as above, is put into one chamber, a duplicate is manufactured, quark for quark and state for state, in the second chamber. Now suppose we put a human organism into the first chamber. What would we find in the second chamber? A human organism, but without an immaterial soul, says van Inwagen. After all, the machine is capable only of duplicating objects that are physical through and through. Yet it is eminently plausible to believe that the duplicate would not only look and behave exactly like the original, but that it would have exactly similar mental states as well. Since the duplicate ex hypothesi lacks an immaterial soul, it follows that the original lacks a soul, too.
But, of course, even if this argument worked against dualism, it would not work against a view according to which there cannot be a human organism without an immaterial soul. On this view, if God intends to produce a human organism--whether directly or along with secondary causes like the duplicating machine--he creates an immaterial soul as its form.
I do not mean to suggest by these comments or by anything I have said above that there are no philosophical puzzles or mysteries that attend the metaphysical project of delineating with precision an account of the metaphysical status of the immaterial human soul that is consonant with Church teaching--though we can certainly use St. Thomas as our guide here. But as I noted above, every other theory in this area, be it physicalist or dualist, has problems that are just as intransigent and, I would argue, more so. I want to conclude by making a few comments about property dualism, the sort of 'non-reductive physicalism' that has attracted some Catholics.
First of all, it is easy enough to see why, given the contemporary problematic, Catholic thinkers bound by that problematic might adopt property dualism. According to property dualism, a human being is a single unified substance with both physical and psychological properties, and the psychological properties are neither identical with nor reducible to the physical properties. So one who holds, in accord with Church teaching, that a human being is a single unified rational organism whose higher intellective properties are irreducible to physical properties, will find property dualism to be the only game in town. At any rate, it is the most benign form of physicalism, the place to which would-be Aristotelians operating within the grips of the standard picture will naturally gravitate.
But what exactly is the relation between physical and psychological properties according to the property dualist? Very often the psychological properties are said to 'supervene on' or 'weakly supervene on' the physical properties, where the (weak) supervenience of one property on another is said to differ both from identity and from the reducibility of the one to the other. The truth, of course, is that there are many competing accounts of reducibility and many competing accounts of supervenience, and so property dualism is perhaps best thought of as a cluster of positions, differing from one another according to differences in how they understand reducibility and supervenience. So the first thing to say about property dualism is that any proponent of it owes us a precise account of both reduction and supervenience.
But many questions still remain. Even if each psychological property is distinct from its correlated physical properties, does it follow that any given psychological event is distinct from every physical event? If correlated psychological and physical events are not identical with one another, do they bear causal relations to one another? If not, then exactly how are they related? What about the causal relations between psychological events and the physical events in general? Do physical events cause just other physical events, or can they cause psychological events as well? What about the other way around? Or are there two causally independent realms of events, the one physical and the other psychological? If so, are psychological events simply epiphenomenal? Finally, does exactly the same general account hold for both sentient and intellective psychological operations, or are they distinct from one another in some fundamental and characterizable way that captures the distinctiveness of human psychology in comparison with that of other animals endowed only with sentience?
Someone might retort that all theories, including ones consonant with the plain sense of Church teaching, have similar questions to answer, and that it is at least not obvious that property dualism will fare any worse than the others. Fair enough. My point, in the end, is not that property dualism has its problems. That is beyond dispute. Rather, I am simply wondering aloud why any Catholic philosopher would favor it over any and every theory that is consonant with the plain sense of Church teaching. If divine revelation is a great gift to us--a cognitive as well as an affective gift--and if the Holy Spirit has guided the Church in the determinations she has made on these matters, and if, further, there is no compelling intellectual (not to mention spiritual) advantage in abandoning the plain sense of the Church's teaching, then why do it?
In the meantime, the good news is that your soul is not dead.
1. The essay is reprinted in Tom Wolfe, Hooking Up (New York: Picador USA, 2000), pp. 89-109.
2. See, for example, Lost in the Cosmos: The Last Self-Help Book (New York: Picador, 2000, originally published in 1983), section entitled "A Space Odyssey (II)," pp. 225-262. Here is a representative passage:
"The modern objective consciousness will go to any length to prove that it is not unique in the Cosmos, and by this very effort establishes its own uniqueness. Name another entity in the Cosmos which tries to prove it is not unique.
"The earth self seeks to understand the Cosmos overtly according to scientific principles while covertly exempting itself from the same understanding. The end of this enterprise is that the self understands the mechanism of the Cosmos but by the same motion places itself outside the Cosmos, an alien, a ghost, outside a vast machinery to which it is denied entry" (p. 254).
3. For a defense of liberalism based on this sort of skepticism, see Gary Gutting, Pragmatic Liberalism and the Critique of Modernity (Cambridge: Cambridge University Press, 1999).
4. See Fides et Ratio (1998), no. 1.
5. See Ex Corde Ecclesiae (1990), Part I, "Identity and Mission," no. 16. Unfortunately, American commentators on all sides have largely ignored Part I, which lays out in instructive and inspiring fashion the ideal of Catholic higher education, and concentrated instead on the legal requirements laid out in Part II.
6. Second Vatican Ecumenical Council, Pastoral Constitution on the Church in the Modern World, Gaudium et Spes, no. 22.
7. Fides et Ratio, no. 60 (my translation from the Latin).
8. Fides et Ratio, no. 51 (my translation from the Latin).
9. See "Two Roles for Catholic Philosophers," pp. 229-253 in John P. O'Callaghan and Thomas S. Hibbs, eds., Recovering Nature: Essays in Natural Philosophy, Ethics, and Metaphysics in Honor of Ralph McInerny (Notre Dame, IN: University of Notre Dame Press, 1999).
10. I have explored in some depth the Holy Father's own attitude toward natural reason in "Fides et Ratio: A 'Radical' Vision of Intellectual Inquiry," in Alice Ramos and Marie George, eds., Faith, Scholarship and Culture in the 21st Century (American Maritain Association, 2000) (Washington, DC: Catholic University of America Press, forthcoming).
11. Fides et Ratio, no. 4.
12. For an expression of this perspective by a Christian who is not a Roman Catholic, see Peter van Inwagen, Metaphysics (Boulder, CO: Westview Press, 1993), pp. 177-178. Van Inwagen is careful to assert only that physicalism may be compatible with the doctrine of the resurrection. This is prudent, since the arguments for this compatibility are far from compelling in light of worries about the continuity of personal identity.
13. Moral theory in general, and the theory of virtue in particular, is another salient context in which the very same principles of inquiry should apply.
14. To forestall one easy objection, I am not suggesting that the development of a Christian philosophical anthropology ought to proceed either without recourse to philosophical theories that have been developed without the aid of Christian revelation or in ignorance of contemporary debates. My point is rather the positive one that revealed truths should themselves play a central role in the full development of any authentically Catholic philosophical anthropology.
15. On the plain sense of texts, see Edward T. Oakes, S.J., "Stanley Fish's Milton," First Things, no. 117, November 2001, pp. 23-34, in which Oakes reviews Fish's How Milton Works (Cambridge: Harvard University Press, 2001). On Milton's view, nicely brought out by Fish, polysemy and ambiguity are among the consequences of original sin and the tools Satan uses to induce confusion in us.
16. Pope Pius XII, Humani Generis (1950), esp. nos. 36-37. In no. 36 the Holy Father states explicitly, "The Catholic faith obliges us to hold that [human] souls are immediately created by God." In no. 37 he says of polygenism: "The faithful cannot embrace that opinion which maintains that either after Adam there existed on this earth true men who did not take their origin through natural generation from him as from the first parent of all, or that Adam represents a certain number of first parents. It is in no way apparent how such an opinion can be reconciled with that which the sources of revealed truth and the documents of the Teaching Authority of the Church propose with regard to original sin, which proceeds from a sin actually committed by an individual Adam and which, through generation, is passed on to all and is in everyone as his own."
17. In what follows I will cite only the Catechism without the footnotes that refer to countless authoritative statements of the past. A fuller treatment of these issues than I am able to give here would delve into those statements and their historical contexts.
18. Even though the nature and application of the so-called 'notes of theological certitude' are to some extent a matter of dispute, the analogia fidei itself, along with liturgical practice and the recognizable and recognized historical witness of the saints, serves as a strong constraint on novel interpretations of established doctrines. This is as it should be. Philosophical and theological speculation on the part of Catholics does not take place in a logical and historical vacuum, but is at the service of a community with both a history and a tradition. This explains in part why claims to recovery of neglected doctrine and practice will in general be taken more seriously than claims to originality.
19. Catechism of the Catholic Church, no. 33. The internal quote is from Gaudium et Spes no. 18.
20. Catechism of the Catholic Church, no. 366.
21. Catechism of the Catholic Church, no. 2258.
22. Francisco Suarez, S.J., Disputationes Metaphysicae 22, 1, 12.
23. I suppose that one could claim that an immaterial human soul somehow 'emerges' from a potentially human body, even though it is difficult to see exactly how this would work. In any case, any Catholic who posits an immaterial soul in the first place should not balk at its being directly created by God. Presumably, the real sticking point for one who balks at the direct creation of the soul is its very immateriality.
24. Actually, the title 'property dualism' could, strictly speaking, apply to any theory that holds that a single human substance has both physical and irreducibly psychological properties--including theories consonant with Church teaching. But in fact the term is used just to describe a position according to which the human substance is composed just of physical things, with no immaterial aspect.
25. One popular move to avoid positing an immaterial soul is simply to eliminate the interim between death and resurrection by claiming that death and personal resurrection are simultaneous, thus ascribing to all the blessed (and to the non-blessed as well) what, according to the plain sense of Catholic teaching, is a singular privilege of the Mother of God (Catechism of the Catholic Church, no. 966). But there seems to be little basis in either faith or reason for such a move. A very different reason for eliminating the interim is to calm worries about the coherence of the immaterial soul's existing on its own without any matter to inform. See James F. Ross, ".....Together with the Body that I Love.....," in Michael Baur, ed., Person, Soul, and Immortality: Proceedings of the American Catholic Philosophical Association (New York: American Catholic Philosophical Association, 2002). However, it seems to me that Ross does not offer a strong enough argument for abandoning the more traditional belief that the soul exists in the interim without any matter to inform.
26. Catechism of the Catholic Church, no. 1022.
27. Catechism of the Catholic Church, nos. 958 and 1030-1032.
28. Catechism of the Catholic Church, nos. 956-957, 1090, 1136-1139, 2683.
29. Catechism of the Catholic Church, no. 966.
30. Catechism of the Catholic Church, nos. 631-635.
31. Catechism of the Catholic Church, nos. 624, 997, 1005.
32. Metaphysics, pp. 159-160.
33. See James F. Ross, "Immaterial Aspects of Thought," The Journal of Philosophy 89 (1992): 136-150.
34. Catechism of the Catholic Church, no. 365.
35. See Alfred J. Freddoso, "Suarez on Metaphysical Inquiry, Efficient Causality, and Divine Action," in Francisco Suarez, On Creation, Conservation and Concurrence: Metaphysical Disputations 20, 21, and 22 (South Bend, IN: St. Augustine's Press, 2002), pp. 1-118. For a spirited defense of hylomorphism against certain recent philosophical tendencies, see James F. Ross, "The Fate of the Analysts: Aristotle's Revenge," Proceedings of the American Catholic Philosophical Association 64 (1990), pp. 51-74.
36. The area of sexual morality is one important place where the moral danger of dualism is especially evident. To the extent that I dissociate my "real" self and intentions from the movements of my body, I am prone to believe that my bodily movements, including sexual acts, have just the meaning I invest them with and no intrinsic meaning of their own.
37. Metaphysics, chaps. 9 & 10, pp. 149-183.
38. Metaphysics, p. 150.
40. Metaphysics, p. 151.