What part of “rant” don’t you understand? Look it up in the dictionary!
The other day I listened to a radio program about people with “mixed identity” who were either agonizing over which “identity” to identify with or bemoaning the difficulty of remaining “true” to one of their “identities”. At first I couldn’t understand why I found this disturbing, but soon it came to me: my whole life has been about trying to escape from my various “identities”.
I was born “white” in Florida to an upper middle class mother and a middle class father. Most of my friends growing up were typical “Southerners” — not all “white”, but all fond of fishin’ and huntin’ and fast cars and the like. So my early identity was what I now think of as “privileged Southern redneck”. When the local high school had a home football game, the whole town came out and sang “Dixie” to the Confederate flag to start the game. When I had learned enough history to begin to grasp the implications of that, I stopped going to the games. When I saw a cross burning on a stranger’s lawn, I was stunned. When the house my uncle built for itinerant Jamaican fruit pickers was burned down by the neighbors, I was furious but impotent.
But I got a break: I went away to prep school in Michigan and learned to think, write and talk like a “Yankee”. I repudiated my “Southern redneck” identity, although I still enjoyed going home to Florida for vacations, because that was while Florida was still relatively unspoiled, with wildlife and fish and remote beaches and swamps galore. My love for Florida (the land) was lasting, but I no longer cared much for the people or the culture. I became a “preppie”.
Needless to say, my friends in Florida didn’t care much for my “preppie” identity. After a while, I began to see their point. My “preppie” friends tutored me in the proper deportment of a true “preppie” — one must always perceive “townies” as fundamentally inferior, which understandably annoys hell out of same. I tried pretending to be a “townie” but no one was fooled. Meanwhile a lot of the other “preppies” still considered me a “Southern redneck”, which was below even “townies”. So I became a “jock” and tried to win a better identity by running the hurdles really fast.
The “jock” identity (and good grades) served me well, so I got into a decent college in Connecticut and became a “college boy” and soon a “frat man”. Those years were packed with developmental crises, as they are for most adolescents; I managed to assuage any angst I might have over my newly mixed identities by staying drunk a lot.
When I graduated I resolved a typical “What next?” crisis by applying to Berkeley for graduate school in Physics. That summer I had a spectacular adventure with model airplanes, lost glasses, a Rocky Hill CT police detective and a government shipment of guns. That and the looming Viet Nam draft prepared me well for Berkeley, where I got to work on my “hippie radical leftist” identity at the same time as my “physicist” identity. I was at People’s Park (as a spectator). I marched. I carried signs. I lived in a communal house. I smoked dope. I went to concerts at the Fillmore. I missed The Last Waltz. I got married for the first time and moved into an apartment a block and a half from the corner of Haight and Ashbury in San Francisco. I went to Sexual Freedom League parties. I tried to grow long sideburns; I failed at that, but did well at Physics. Pretty soon I had a PhD and a failing marriage. I told my Florida draft board exactly what research I was working on, and they granted me a deferment. Pretty soon I was 26 and no longer a prime draftee anyway.
Through a mixture of sheer luck and good judgment, I became involved in the development of a cool new experimental technique using muons that began to take off just as I got my PhD, at the same time as major new accelerators were coming on line in Canada, Switzerland and Los Alamos. I chose TRIUMF, the one in Canada. In 1973 we moved to Vancouver and I became a “hippie postdoc”. That was a satisfactorily ambiguous identity for a few years, during which I split with my first wife and lived in a communal house again. Then I became a “professor”, which involved a lot more work — like 100 hours per week for the first few years and at least 50-60 hr/wk until I retired at 65. Somewhere in there I became a “Canadian” by choice. I was still an “American” but I felt ashamed about that, whereas I was only mildly embarrassed to be a “Canadian”.
Somewhere in there it was not-so-patiently explained to me that I was an “oppressor”. As a “man”, I oppressed all women, whether I intended to or not. As a “white man”, I oppressed all people of color, whether I intended to or not. As an “old white man”, I oppressed all young people, whether I intended to or not. As an “old cis straight white man”, I oppressed all LGBTQ people, whether I intended to or not. As an “old cis straight white male intellectual elitist”, I oppressed all ignorant people, whether I intended to or not. Since I didn’t get a choice in the matter, I made myself a black T-shirt with “OPPRESSOR” on the front in big red letters, so that people wouldn’t think they needed to inform me of my identity. It didn’t work.
I guess this was when my distaste for “identity” peaked — or maybe it was that instinctive distaste that so infuriated me when all those awful identities were forced upon me. So I became an “angry old white man” on top of everything!
Pretty soon I ran out of resentment and started examining the logic of “guilt by privilege”. No, I never asked to be the beneficiary of advantages that could be traced back to colonialism or slavery… but I was, and I didn’t turn it down. If nothing else, that meant I had also inherited an obligation to do what I could to make up for those injustices. Is it possible to ever “make up for” such injustices? No, of course not. But how could I not try? The first step was to put away that “OPPRESSOR” T-shirt and the identity that went with it. Others may forever see that identity stamped on me, but I don’t have to give it power.
By the early 21st Century it became obvious to many people that Homo sapiens had phase separated into two subspecies: Homo individualis and Homo collectivus. The former was the end result of classical Darwinian competition for resources and progeny; the latter came about following the handoff of evolution from genes to memes. H. ind. was selfish, irresponsible and uncaring; H. coll. was generous, responsible and caring. Once the phase separation was more or less complete, H. ind. began to exhibit more extreme behavioral traits such as narcissism and a cynical disregard for (or even a total lack of awareness of) social norms and contracts. Meanwhile, H. coll. became obsessed with the perfection of society and haunted by their own failure to live up to expectations.
Into this world came an H. coll. named Anthony Ahol, who chose a career as a geneticist and formed the hypothesis that the difference between the two subspecies could have a detectable genetic signature. Anthony obtained DNA samples from thousands of persons who displayed all the characteristics of H. ind., even going so far as to disinter famous individuals from the recent past. His research was finally successful: the complex of genes common to most members of H. ind. was dubbed the Ahol factor, after its discoverer. Naturally the name was consistently mispronounced by most H. ind. folks, usually with a measure of pride.
Within a few years, Anthony began to wonder if there might be a “cure” for the Ahol factor. He experimented with various viruses that might attack the specific genes causing H. ind. characteristics, and eventually found a very effective strain. Unfortunately, although the virus simply deleted the Ahol factor in vitro, leaving the cell cultures otherwise healthy, it was not so benign to the complete organism: when he accidentally dropped a container of the virus and it escaped into the world, within a year every member of H. ind. was dead. There followed a period of peace and prosperity unlike any the world had ever seen. Through cooperative effort and self-sacrifice, H. coll. managed to eliminate poverty, hunger and exploitation; cease production of greenhouse gases; and replace fossil fuel-burning power plants, individual automobiles and roads with solar and wind power, free self-driving electric taxis and buses, and quiet, high-speed light rail transit systems. All health care and education were free. No one tried to accumulate excessive wealth; everyone behaved responsibly and helped each other achieve their best possible selves.
Unfortunately, the previously noted negative correlation between standard of living (including education) and number of progeny continued until essentially no children were being born; within a century H. coll. was also extinct.
The trouble with “political correctness” is that it puts the cart before the horse. There is nothing wrong with trying to be sensitive to the feelings of others, cognizant of your own complicity in evil (even if only by inaction), appreciative of your unearned good fortune, responsible in your interactions with people – gasp – and the environment, wise and exemplary in your life choices, kind, generous, tolerant and caring. The problem arises when you try to think independently with all these caveats taking precedence over what you think or say.
I contend that you need to decide what you think first, and only then try to fit your conclusions into interstices in the lattice of mandatory or forbidden policies.
This is impossible, of course, because many of the lattice sites are defects: a lot of politically correct pre- or proscriptions are stupid ideas, born of a misguided or lazy compulsion to reduce everything to a few simple rules.
Hey, I know about this compulsion first-hand; I’m a Physicist! But, as Einstein supposedly said, “Physics should be made as simple as possible – but no simpler!” The same goes double for politics.
So when you allow yourself to think independently of all the constraints of political correctness, you are bound to run into trouble: you will find that you don’t agree with the rules set by well-meaning, charitable “liberals” any more than you agree with the rules set by cynical, self-serving “conservatives”. This is bound to make your life difficult.
Well, duh! You thought life should be simple and easy?
The good news is that if you actually know what you think, you are less likely to be afraid to hear what others think. You may be able to hear them out, understand their point of view and argue amiably with them in a way that encourages them to hear what you have to say.
Sounds simple, eh? So how come it is a lost art today?
Look, that was a long time ago. I wasn’t even there. I’ve never rebelled against God. What does all that have to do with me?
Have you not benefitted from the fruit of the Tree of Knowledge?
Well, yeah, of course. But it would be stupid to just throw away knowledge, once it’s learned. Besides, I try to use that knowledge for the glory of God, and to make the lives of my fellow humans better.
So that they, too, become complicit in Sin?
Oh, come on. Seriously? A farmer uses a lever to move a rock, and that’s the same thing as taking an apple from a Snake?
The Children of Eve are damned, every one.
And there’s nothing we can do to atone for this ancient guilt?
Have you accepted the Savior?
Well, sure. Is that all I need to do?
Oh, okay, the tithe. Fine. I’ll donate 10% of my income…
Ten percent? You will give up all your worldly possessions, because they are all derived from the fruit of the Tree of Knowledge.
Fine. Take it all. Say, aren’t you a Child of Eve too? Where does that leave you?
Ever notice how some phrases that have always seemed innocuous suddenly get really offensive when you stop to think about what they actually mean? (Look up “rule of thumb” sometime if you want to spoil that figure of speech for yourself.) The one that’s started bothering me since I got old enough to be sensitive about such things is “premature deaths”.
Every time a solemn health bureaucrat reports a discovery that something is bad for you, you will hear these words: “Studies have shown that [the bad thing] has caused [some number or percentage of] premature deaths in the last [time unit].”
Well, that certainly makes us want to avoid [the bad thing], but have you ever asked yourself how many of the other kind of deaths it has caused?
What other kind? Why, the non-premature kind, of course. What shall we call them? “Post-mature” deaths? “It’s about time, you old geezer!” deaths? Hold on here – if you want to label a death as non-premature, shouldn’t you have to ask the dying person first? “Would you say that your imminent death is premature?” I’m betting you’d find that almost every death is premature.
“Oh no,” the prim health bureaucrat assures me, “that’s not what it means at all. We have calculated everyone’s life expectancy and the probabilities of dying from various causes, so we know when you would normally be expected to die, and of what, so if you die sooner than that, of that cause, then it’s considered premature.” (Was that a condescending smile for this pathetic ignoramus?)
Yes folks, your days are actually numbered – literally! Here’s a little secret I bet you didn’t know: when you’re born they tattoo a little “best before” date where you’ll never notice it – on the underside of your tongue, amongst all the veins and salivary glands and stuff you’d really rather not think about (sorry!). Moreover, it’s printed backwards, like a mirror image, so it is really hard to recognize as an expiry date. Why? So your dentist can check periodically to see how much longer you are good for. Yes, they’re all in on it, the dental hygienists too! Don’t let them catch you checking for it in the mirror.
Picture the doctor standing over your death bed with a clipboard and a watch: “Come on… come on… Dang! Missed the deadline! Oh well. Nurse, scratch out that check mark in the premature death box.”
Do they have a special “Post-Mature” ward at the hospital where they give token care to people who will no longer swell the ranks of premature cadavers when they kick off? Based on experience, I think maybe yes. But it isn’t a separate ward; everyone just knows….
OK, I’ve beaten this horse to death (there’s another phrase we might want to give up, especially around Animal Rights folks). I just hope I’ve generated enough “cringe factor” to discourage the use of this particular offensive term and encourage those humourless health bureaucrats to find some other way of expressing their statistical inferences.
You may not like this. We live in an era of excuses, and everyone has lots of them. I’m here to tell you that they are mostly illusory and are holding you back from a better life. You will probably think I’m just lacking compassion. I don’t mind if you come to that conclusion after you’ve heard me out and given my words some thought, but if you start with that assumption, we both lose.
Case in point: I am 71 years old, and I just had my first cataract operation last week. It wasn’t so bad. My eye’s still a bit sore and the new lens hasn’t completely settled into position yet, so my vision hasn’t really improved so far, but I’m confident it will soon.
The problem is, I was told not to lift anything heavier than 10 pounds for 3 weeks, and “strenuous exercise” is a no-no for at least that long. So I have to “act like an old man” for 3 weeks. Sounds easy, right? My knees and back could use some “down time” to recover from running hurdles.
But after less than a week of enforced lethargy, it’s already becoming a habit! Right now I feel weak and fragile — pretty much like the stereotypical 71-year-old man — and it’s hard to imagine doing one pushup, never mind my usual 22. If I didn’t have documented evidence that I can indeed run the hurdles in Provincial age-group record time, I’d find it fantastical.
Which puts me in a position to understand why so many older people firmly believe that athletic competition is a thing of their distant past; that they will never be able to drop those extra pounds; that heavy lifting would be insanely reckless; that they’d better hang on to all the handrails lest they fall and fracture that doubtless-fragile hip joint; that their walks should not be too brisk lest the ol’ ticker get stressed out and stop ticking. Hell, I’ve been advised of all those myths by family, friends and medical personnel, many times.
So without empirical evidence to the contrary, why would I question the stereotype? And if I did “act my age”, how long would it take to make the stereotype true? Longer than 3 weeks, I hope!
Here’s the thing: how can anyone acquire enough empirical evidence to the contrary to convince themselves that they can Do It? One can watch others Doing It and get inspiration from that, but it’s surprisingly (well, not really) difficult for people to draw conclusions about themselves from evidence about others. (That’s called a “failure of enlightenment effect” by Psychologists, I believe.) The only thing that’s going to convince you that you can Do It is Doing It yourself! (That’s called a “Catch-22”, I believe.)
If you’re like me, that means more than just Doing It once and patting yourself on the back. The conviction dies within days when I try to ignore societal stereotypes of what I can and can’t Do. I have to Do It as often as possible, and try to Do It better each time — or at least not worse over the short term. Perhaps I’m insecure. Well, if you’re not, this should be a lot easier for you!
Shall I run through an inventory of excuses? No, that would be both mean and pointless. Deep in your heart you know what actually prevents you from Doing It (whatever It might be for you) and what is just an excuse, doubtless backed up by a firmly entrenched stereotype. Pain is real. Bones do break. Fat is hard to burn off (my metabolism seems to convert every gram of carbs directly into an ounce of fat). Spines compress with age. (I found out last week that I am 2.25 inches shorter than I was at 25. Over two inches! Ack! It must be bone-on-bone all the way down now.) Pulmonary embolisms (I’ve had two) reduce your lung capacity. Chemo has many impacts. Shit happens. You are definitely going to slow down with age; but that’s what the Age-Graded Tables are for!
As long as you give yourself a full list of meaningful and worthy “It”s,
You Can Do It.
Now for the surprise: I am not just lecturing old people. You younger folks have plenty of excuses too, and are prone to regard great accomplishments and heroic deeds as out of your reach, for reasons you can recite by heart. Most of them are perfectly valid as far as they go, which is usually not as far as you think. The most important lesson I have learned in my life is that
You Can Do Far More Than You Think You Can.
And you’ll be glad you did.
Warning: coarse language!
I just finished struggling with a triply-sealed container for the umpteenth time. This time I was wise enough to get out my needle-nosed pliers before I broke a fingernail. It made me wonder how many person-years of frustrating effort are wasted on this nonsense every minute. (See below for my estimate.)
“Wasted? Nonsense?” I hear the self-righteous Safety Nut crying, “Don’t you care about people being poisoned by domestic terrorists?”
“No? No?? NO??! What kind of monster are you?”
The kind who thinks. Every policy, if enforced, has consequences — some good (benefits) and some bad (costs). Ignoring either one is a fool’s errand. Many (if not most) people don’t want to think; they want to spout homilies.
“No price is too great to pay to save even one human life,” recites the idiot.
Of course it is! You are SO full of shit! If you really gave a damn about saving lives, you’d be giving all your money to the Food Bank, or (to save a lot more lives) to some Relief Fund for Refugees, or (to save the most lives) to medical or agricultural research. Maybe you’d even be doing something yourself.
I reckon I spend an average of about a minute a day cursing at jars and bottles with child-proof seals. (Some children must be very resourceful.) That means Americans are investing about 16 person-years per day in preventing poisoning by sick fucks. That’s just under 6,000 person-years per year. Assuming the putative poisoned would have an average of 60 years subtracted from their lives, we would need to be preventing at least 100 poisonings per year just to break even. There must be a lot of sick fucks out there.
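For what it’s worth, the arithmetic above can be reproduced. Back-solving from the figures quoted (16 person-years per day at one minute per person) shows an implicit assumption of roughly 8.4 million daily strugglers; the sketch below just reconstructs those numbers, it adds no new data:

```python
# Back-of-envelope reconstruction of the figures quoted above.
MINUTES_PER_PERSON_YEAR = 60 * 24 * 365          # 525,600 minutes in a person-year

person_years_per_day = 16                        # figure quoted above
minutes_per_person_per_day = 1                   # average daily cursing time assumed above

# Population of daily strugglers implied by 16 person-years/day:
implied_strugglers = (person_years_per_day * MINUTES_PER_PERSON_YEAR
                      / minutes_per_person_per_day)          # about 8.4 million people

person_years_per_year = person_years_per_day * 365           # 5,840: "just under 6,000"
years_lost_per_poisoning = 60                                # assumed above
break_even_poisonings = person_years_per_year / years_lost_per_poisoning   # about 97/year
```

(If every one of some 330 million Americans lost that minute, the waste would be roughly forty times larger, which only strengthens the point.)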
We could quibble over the difference between living under frustration and being dead, but if you are sure being dead is so much worse, I’d be interested in hearing all about your experience of being dead.
If you lower the bus fare in a major city, some people will die as a result who would otherwise have lived. The question is not, “Will this save even one human life?” It is, “Will this do more good than harm?”
See also Price of Life.
Every unsolicited advertisement you see conveys an unsubtle message about what the advertiser thinks of you. This is especially true of the ads you get on the Web, since (unless you go to a lot of trouble to subvert it) there is software analyzing every keystroke and/or click of your mouse to determine what appeals might be effective on your subconscious. But (with some effort) you can block almost all advertising from your Web browser. The same cannot be said of broadcast media like TV and radio, which have to draw their inferences from which programming you are watching/listening to on which channel. I for one find it impossible to endure the ads that come with any sort of “action” movie or series, since they more or less explicitly declare the advertisers’ belief that I am a testosterone-drenched teenaged male moron. “Chick flicks” are no better, as their ads scream, “You are a gullible and insecure woman, worried exclusively about your age, looks and popularity,” or, sometimes, “You will send money to anyone who shows a picture of a sad puppy.” Fortunately, I have a video recorder that allows me to fast-forward over these insults.
Unfortunately, I have to buy food; and at least some of the time I have to shop in supermarkets. There I am trapped in checkout lines where I cannot avoid looking at the crassest tabloid garbage unless I close my eyes and try to navigate by touch alone — which entails its own hazards in that context.
Before starting this Rant, for once I bothered to check with Google to see what other people might have written on the subject. I was horrified to discover that most of the Web-accessible opinion on supermarket tabloids seems to accept the notion that they are harmless expressions of Western cultural tradition and/or useful sources of information that would otherwise be suppressed. (Yes, I enjoyed Men in Black too, but I didn’t take it seriously!)
I disagree. I think supermarket tabloids are the most profoundly insulting abuse in the entire arsenal of advertising insults. And I consider supermarkets that respect the “tradition” of shoving them in my face in unavoidable checkout lines to be unworthy of my business.
Surely I am not alone in this reaction. Surely there are enough others who feel as strongly as I do that we could mount a class action against all supermarkets for defamation of our cultural character. Surely…
Are you with me?
This is so obvious it’s embarrassing to be writing about it, but it’s also obvious that a large fraction of my fellow citizens just don’t get it; so I have no choice:
There is a difference between personal choice and epidemiology.
Wikipedia says, “Epidemiology is the study of the patterns, causes, and effects of health and disease conditions in defined populations. It is the cornerstone of public health, and shapes policy decisions and evidence-based practice by identifying risk factors for disease and targets for preventive healthcare.” I’m happy with that definition. Lots of people are charged with the responsibility of worrying about the epidemiological consequences of various choices; and so they should be.
Personal choice is different. It’s up to you whether you want to worry about something that might affect you adversely. Or it should be. These days I find that an awful lot of other people think it’s up to them whether I should worry about my risks.
A concrete example might help clarify this distinction: Project Gasbuggy. Back in the middle of the 20th Century, certain parties proposed to use nuclear bombs to fracture gas-bearing rock formations underground, releasing vast quantities of natural gas trapped in them. The gas thus released would be radioactive, of course, so it was proposed to mix it with other natural gas to dilute the radioactivity to “acceptable” levels.
The scheme was tried experimentally but never adopted commercially; still, let’s suppose it had recently been implemented at scale. Should you worry? Let me rephrase that: should you be worried for your own safety? Exposing millions of people to radioactive natural gas would probably cause thousands of “extra” deaths per year from cancer, so epidemiologically it would clearly be a Bad Thing. But your chances of dying of cancer would probably be boosted from about 30% to something like 31%. (These are not carefully calculated numbers, but it hardly matters for the purpose of my argument.)
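The tension between “thousands of extra deaths” and “a 1% personal risk bump” is easy to check with toy figures. In the sketch below, the exposed population and the averaging window are illustrative assumptions; only the 30% and 31% come from the argument above:

```python
# Why a small personal risk shift is still an epidemiological disaster.
population = 100_000_000        # assumed number of exposed people (illustrative)
baseline_risk = 0.30            # lifetime chance of dying of cancer, quoted above
boosted_risk = 0.31             # after exposure, quoted above
lifespan_years = 70             # assumed window over which those deaths occur

extra_lifetime_deaths = population * (boosted_risk - baseline_risk)   # ~1,000,000
extra_deaths_per_year = extra_lifetime_deaths / lifespan_years        # ~14,000: "thousands"
```

The same one-percentage-point shift that barely registers for any individual adds up, across a large population, to exactly the kind of body count epidemiologists are paid to prevent.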
Let’s face it: regardless of how much exercise you get, how meticulously you optimize your diet, how good your medical plan is or how carefully you avoid all dangerous practices and hazardous materials, you, personally, are going to die. You need to start by facing this fact. Once you have done so, you should realize that all you have any control over are the time and cause of your death. And not much control at that. It pays not to be foolish, but people are pretty foolish anyway. (Or are they? That’s another question.)
So, in my opinion, for the reasons stated above, you would be silly to worry for your own safety about many things that we all might agree would be unethical to impose on the population at large.
If you watch television, you can’t avoid hearing about how this pill or that salve will miraculously cure your headache or skin rash or allergy, as long as you don’t mind a small risk of coughing up your lungs while convulsing and popping blood vessels in your brain. The warnings required by law in drug advertising are so dire that people make jokes about them, because taking them seriously would foster the worst kind of paranoia. (Somehow beer ads are exempt from such encyclopedic caveats; although alcohol is certainly a dangerous drug, it’s one we’re used to — and we still remember what happened when we tried to prohibit its use.)
Now I read about a study (see Wells & Kaptchuk) that shows how these warnings trigger a nocebo effect (see the eponymous Rant on this site): patients who are warned about possible side effects in the name of informed consent are significantly more likely to experience said side effects than those left blissfully ignorant.
It follows as night the day that all those warnings on TV about possible serious side effects are actually causing more such effects in the millions of viewers being warned. The legally mandated caveats are actually killing people! Surely the deep pockets of the pharmas are funding massive legal action to strike down these laws; if not, the first lawyer to think of it is going to make a lot of money.
Is there no way to retain the requirement for full disclosure without making more people sick? Sure there is: just provide all the information. Tell us how likely each of the side effects is; that information must exist, or it would be hard to justify requiring the warnings. The only problems are (a) viewers would have to acquire the wit to distinguish between a little and a lot (see my Rant on “Quantitacy” here) and (b) the ads would be several minutes long, unless the announcer learns to talk even faster!
Many neologisms have inherited the connotations of “literacy”. Some are formed by the addition of a modifier, like “science literacy”; others are shameless perversions of English, like “numeracy”. The latter is generally interpreted as competence in basic mathematical skills such as addition, subtraction, multiplication and division. I have come to believe that there is a level of competence even more fundamental than numeracy, and I have invented a disgustingly postmodern name for it:
A majority of First World adults seem to lack this competence, and are failing spectacularly to teach it to their children. This is not a new phenomenon, but it is catching on.
The first example that comes to mind was in the 1960s, when selenium (Se) was identified as a cause of heavy metal poisoning (analogous to lead poisoning); showing its capacity to be proactive, the US Congress passed a law making it illegal to sell foodstuffs containing a detectable amount of Se. “Detectable” was an incautious word to use; at the time, it was possible to detect parts per million (ppm) of Se fairly easily. But Se detection technology improved until a few years later it was possible to detect parts per billion (ppb). Shortly thereafter it was found that Se is an essential mineral, making it illegal in the USA to sell food that would keep you alive. Oops.
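The trap in the word “detectable” can be made concrete. In the sketch below, the 50 ppb selenium level is an invented illustrative number; only the ppm-to-ppb jump in instrument sensitivity comes from the story above:

```python
# "Contains no detectable Se" is a statement about the instrument, not the food.
se_in_food_ppb = 50                  # hypothetical trace level in an ordinary food

limit_1960s_ppb = 1000               # ~1 ppm sensitivity (1 ppm = 1000 ppb)
limit_later_ppb = 1                  # ~1 ppb sensitivity a few years later

legal_with_1960s_instrument = se_in_food_ppb < limit_1960s_ppb   # nothing "detected"
legal_with_later_instrument = se_in_food_ppb < limit_later_ppb   # same food, now illegal
```

The food never changed; the meaning of the law did, silently, every time the instruments improved.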
A more recent example is the Fukushima disaster. Within a few months of the reactor meltdown in Japan, isotopes specific to that event were detected in sea water on the West coast of North America. In no time, many entrepreneurs got rich selling iodine at huge profits to panicked citizens who believed they were about to perish from radiation poisoning. None of these people thought to ask how much radiation was washing up on our beaches, or how sensitive the detectors were that made the identification. For comparison, they might have asked how much more radioactivity is routinely injected into patients’ veins for diagnostic nuclear medicine scans at their local hospital.
Similar misunderstandings result from the increasingly widespread policy of zero tolerance, which see. The problem with zero tolerance is that zero is a very small number. Nature does not contain zero of anything. Not selenium, not radiation, not cyanide, not E. coli, not “insect parts and rat hairs”, not viruses or cancer cells in your body. Thank goodness, since at least some of these are now understood to be essential to life.
As a society, we seem to have given up any sense of balance or ambiguity, in favour of a binary sorting of all things, people and ideas into good or bad. This makes decision-making easy, but it makes wisdom impossible and it makes life hell for everyone. One might imagine that this is due to the bad example set by American politicians in recent years. The definition of Republican policy is “diametrically opposite to Democrat policy” and vice versa. Compromise is seen as weakness; s/he who compromises will not be reelected. Which lays the responsibility back at the feet of the electorate; perhaps we have exactly the government we deserve.
We all know about placebos, right? From the Latin placēbō, “I shall please”? When you take a placebo, you feel better because you think you’re going to feel better. Of course, it’s just your imagination, right? You don’t really feel better; it’s all a fake, right?
Not so much. A double-blind experiment on Parkinson’s patients with fluorine-18 labelled L-dopa vs. placebos showed that the placebos were effective at the specific goal of causing the brain to produce dopamine. The patients didn’t just “feel better”, their brains actually performed the same biochemistry that is stimulated by L-dopa. Think about that. Think very carefully. These people simply thought they were getting L-dopa; they were not trained in using the mind’s power over the body. What if they were?
Yeah, yeah, we’ve all heard that woo-woo stuff before, right? Okay, never mind. Just remember that the “Placebo Effect” is real. Officially real, as in, no drug trial that fails to take it into account is considered valid by… well, anyone. Now think very carefully about the implications of this established fact.
If you believe a pill is going to make you better, it will make you better. Perhaps not as much better as a pill that has a direct effect on your biochemistry, but genuinely better. If this is true, then the opposite might be expected to be true as well: if you believe a pill is going to make you sick, it will make you sick.
And not just pills. Also innumerable environmental poisons, radiation, even stuff that used to be considered harmless but has now been shown (or claimed, and believed) to be detrimental to health at some concentration. If you believe it will make you sick, it will make you sick.
This is called the “Nocebo Effect” (from the Latin nocēbō, “I shall harm”, from noceō, “I harm”). It has also been called the “Antiplacebo Effect” or the “Negative Placebo Effect”, but that seems weaker to me. I want to make a strong point about this response, not just a codicil to the literature on the Placebo Effect.
Today there is an epidemic of allergies and autoimmune disorders, especially among young people who have spent their whole lives being told of all the ubiquitous poisons in their environment, and have never been told that the human body is incredibly resilient and robust. Some of this may be genuine reactions to actual pollutants and unhealthy foods, but it must also be due in part to the expectation that everything we eat or drink or breathe is poisoning us.
Notice that I said, “…in part…”; I didn’t say that environmental poisons don’t exist, or that all maladies are caused by bad attitudes. But what you believe is a significant contributing factor to what you experience. This fact is no longer the property of New Age mystics. It is real science now. Unfortunately, we seem to have no idea what to do about it except to excoriate anyone who disputes the notion that we are all helpless victims with no agency in our own lives.