
Don't Forget to Remember and Remember to Forget


   Forgetting is as important as remembering in our lifelong effort to be or become “normal.” Remember when you were a teensy little baby in a general state of learning, absorbing into your brand-new memory unit everything going on in your new world. You didn’t know what it all meant, but you had three distinct feelings: pleasure (food, warmth, touch), discomfort (hunger, pain, angst), and fear (startle).

   When we emerge, everything is new and potentially dangerous. We’re leery of everything because we’ve never experienced it before. As time passes, we learn that we don’t have to fear certain things (food sources, warm caresses).  In one very important way, our lives are reduced to the simplicity of two tasks: 1. learning to fear certain things, and 2. learning to stop fearing certain things. Learning to stop fearing things is called “fear extinction” by social and brain scientists. It’s verrrrry important, that stopping part, and it’s what agoraphobes aren’t good at.

   I never really thought of getting over a fear in terms of “extinction,” but that’s in fact what has to happen, in a structural way, in a pact with your brain team - the amygdala, the hippocampus and the pre-frontal cortex (aka “Big” and “Brains”). In order for you to stop fearing something, there has to be a committee meeting in your brain. I’ll set it up for you.

   You are startled one morning as you’re on a walk you have to take.  A dog in a yard leaps, snarling, as you pass. Your adrenaline alarm goes off, partly because you’ve been bitten by dogs, and partly because your fight or flight system works the way it’s supposed to. You were in danger and you froze, planning your next move.  You understand the fence will probably protect you, so you circle widely and you pass to the diminishing sound of barking. Well, hell. Now what? Who knows who owns the dog in that big apartment building? The next morning, there’s no dog, but the day after, your nearly serene walk is shattered once again by that #@!&%##!&! dog. Now you leave the house each morning with anxious feelings, not knowing. This business has wrecked your mornings.

   You watch a TV movie one night about a jewel thief who throws a big bloody steak to the estate guard dogs, and the next morning you take along some dog treats, turning the barks to wags. You are not at all afraid of the neighborhood dog any longer. Why? The pre-frontal cortex created the notion of taking doggie treats and began to romanticize the positive outcome of it all, completely forgetting there’d been anything about fear.

    The hippocampus had passed along the memory to Big after the first incident and kept the mastermind informed. When “Brains” thought up the treat idea she had to then convince the hippocampus to eliminate the fear memory. The hippocampus isn’t as sophisticated as Big, but it can understand that a smiling dog poses little risk. The amygdala started it all, of course, jumping back from the fear-provoking event and calling out the alarm to all systems. It’s sort of like that horn blasting chaos when everybody’s running around on the deck of a submerging submarine.  WHOOP! WHOOP! WHOOP!

   In milliseconds everybody in the body knows there’s a lunging, barking dog and is ready to fight it – or whatever – thanks to the amygdala. Then the hippo had to tell the amygdala that from now on, no matter how many times the dog appears, nobody else in the brain is going to worry about it, so the amygdala might as well stand down too. When the guard at the gate, the amygdala, cancels a fear memory, that baby is EXTINCT.

   I’ve come across several medical journal articles about that very subject – the extinction of fears. I’m not a science type, but I enjoy reading brain science and will pass along to you links to articles, as well as a summary of what you’ll be reading. Science articles can be difficult to read, but by doing so you’ll increase your knowledge of what afflicts you and help you figure out how to overcome it.

   This article from the “Biology of Mood & Anxiety Disorders” online journal reveals results from a study of the relationship between early childhood abuse or trauma and structural changes in the brain. It was written by four medical researchers at Duke University and the University of North Carolina.  It may reveal something to you about the roots of your anxiety; at least it will deepen your understanding of you.

http://www.biolmoodanxietydisord.com/content/4/1/12


Learning Mindful Parenting From Dragon Moms


“Before you cross the street, take my hand.

Life is what happens to you while you’re busy making other plans.”

In John Lennon's "Beautiful Boy" we see a mundane example of mindful parenting, helping a child to live in the moment rather than in the future. The work of mindful parenting is in the quality of the attention we bring to each moment, and in the commitment to live and parent as consciously as possible. I feel that I am parenting mindfully when I am seeing my children clearly, as they are, without the veils of my own expectations, fears, and needs so that I can see what is truly called for in each moment.

But I live in a society in which so-called "tiger parenting" seems to be widely practiced as a way of achieving success for children. Popularized by Amy Chua in her attention-getting Battle Hymn of the Tiger Mother and an incendiary Wall Street Journal article, "Why Chinese Mothers Are Superior," this style of parenting may be loathed by some while lauded by others as the only way to get your kids to the top. For these parents, it's the results at the end that matter. But when is the end?

For some parents, it's painfully obvious that the end is early. I have found great lessons in mindful parenting from those who raise children with terminal diseases. Emily Rapp calls them dragon moms who are “fierce, loyal and loving as hell.” Their experiences have taught them, “how to parent for the here and now, for the sake of parenting, for the humanity implicit in the act itself, though this runs counter to traditional wisdom and advice.” Rapp contrasts her way with tiger mothers who are “animated by the idea that good, careful investments in your children will pay off in the form of happy endings, rich futures.” She confesses that she is a dragon mom because her child has a terminal illness and is likely to die before his third birthday; she knows he has no future.

In The Power of Two: A Twin Triumph over Cystic Fibrosis, we meet another dragon mom, Hatsuko Arima, the mother of the twins. When her daughter Ana passed away last year after an incredible journey of 41 years surviving CF and two double lung transplants, Hatsuko received sympathetic condolences. But she confessed to me that while sadness was certainly there, she was also filled with joy from deep gratitude that Ana had received 41 years of life in which she had been able to accomplish so much. From Ana's birth she had been prepared to expect a short life for her child. Parenting with a consciousness that time is limited for one's child can give appreciation for the preciousness of each day.

Some people will dismiss the Dragon Mom as a special way of raising a child with a short life expectancy that can’t be the right way to raise a “normal” child. But Rapp ends her essay with her belief that she has learned something valuable for all parents, “Parenting, I’ve come to understand, is about loving my child today. Now. In fact, for any parent, anywhere, that’s all there is.”

Her message for me as a parent is that living for the future is an illusion: “parents who, particularly in this country, are expected to be superhuman, to raise children who outpace all their peers, don’t want to see what we see. The long truth about their children, about themselves: that none of it is forever.”

Many spiritual teachers tell us of the power of now, to be mindful, to be attentive. Good parenting may involve no more, and no less, than respecting and listening to your child. Those of us whose children do not have terminal illnesses live with the belief that our children have a future for which we must prepare them. But the reality is that this is not always true. Some of our children are taken from us long before we have planned. And for those who are not, parenting for the future may rob us of the presence required to see and listen to our children in a way that enables us to respond to them in the present moment.

In the past few years two of my friends have lost their young children; one in a car accident and the other to cancer. They both remind me to keep trying to discover a way of parenting that leads to deep connection, empathy, and love for both our children and ourselves. Mindful parenting is moment-to-moment awareness and appreciation of "everyday blessings." I try to balance the need to prepare children for a future, while reminding myself to never neglect living fully and loving them each day.


How Can the "Slender Man" Girls Be Competent to Stand Trial?


Yesterday, the two middle-school girls who allegedly tried to stab a classmate to death to appease a fictional online character known as Slender Man were ruled competent to stand trial.  

Waukesha County Circuit Judge Michael Bohren found Anissa Weier, 13, competent to stand trial after a three-hour hearing yesterday morning, despite differing conclusions from mental health experts. The experts did all agree that Weier apparently understood the seriousness of the charges. The other defendant, Morgan Geyser, 12, had been ruled incompetent back in August, and has been receiving psychiatric treatment ever since. Now the state psychiatrist in charge has announced that Geyser has regained competency, despite being schizophrenic. Although Geyser's attorney originally objected to the state psychiatrist's finding, he dropped his objection before the court yesterday, agreeing that Geyser is competent to stand trial.

This development has engendered much confusion on Facebook, Reddit, Twitter, and other social media websites over (1) how someone who is schizophrenic could be competent to stand trial; (2) how girls who are "crazy" enough to attempt murder to please an online meme could be considered competent; and (3) whether this ruling has a relationship with a "not guilty by reason of insanity" defense.  

To answer these questions, the first thing to understand is that legally, competency is not the same as sanity, mental soundness, or mental health. Competency is determined solely by whether the defendant can understand the nature and consequences of the criminal proceedings against her. Therefore, a schizophrenic defendant can be completely competent as long as she understands what a trial is, what a court is, that she is being tried for a crime, etc., even if she believes, at the same time, that Slender Man is real. One can be very mentally disturbed and still be completely competent at the same time. For more details, see my previous article, "The Difference Between Competency and Sanity."

The second thing to understand is that a competency hearing is completely independent from a finding of "not guilty by reason of insanity." Competency means that the court will proceed with trial, during which, "not guilty by reason of insanity" might be a defense used by the girls' attorneys. But do not be confused: there has been no ruling so far regarding "insanity." 

In sum, as long as the girls understand that they have been accused of a crime and are being tried in criminal court, then they are competent to stand trial — no matter what else they might believe. 

_______________________

 

Disclaimer: This article is for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this blog or any of the e-mail links contained within the site do not create an attorney-client relationship between the author and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of any law firm or Psychology Today.

Photo: Wikimedia Commons

Is Language an Instinct?


In my recent book, The Language Myth, I investigate one of the dominant themes that has preoccupied the study of language for the last 50 years or so: whether the rudiments of the human capacity for grammar—central to language—are innate. This idea originated with the research of the American linguist and philosopher Noam Chomsky, beginning in the 1950s and gathering momentum from the 1960s onwards. The idea, in essence, is that human infants are born equipped with a species-specific Universal Grammar—a genetic pre-specification for grammatical knowledge that ‘switches on’ at an early point in the process of acquiring their mother tongue; and this being the case, it takes much of the pain out of language learning. From this perspective, human infants acquire language because they come with a hard-wired knowledge of aspects of grammar—although there is no meaningful consensus on what these aspects might amount to, even after over 40 years of looking. This enables a child, so the party line claims, to ‘pick up’ their native language. I presented a very partial, thumbnail sketch of just some of the relevant issues in a short popular science essay, published in Aeon magazine, here. And I have discussed the issues further in a full-length radio interview, available here.

In a series of recent posts, summarised here, a number of distinguished linguists, who broadly adhere to Chomsky’s proposition that there is an innate Universal Grammar, suggest that I have misrepresented and/or misunderstood the claim(s) associated with the research programme surrounding this hypothesis; and, in three specific cases that they draw attention to, that I have supported my arguments using findings which they claim have been refuted—they appear, at least in one case, when discussing what is known in the jargon as Specific Language Impairment, to be referring to the short Aeon essay rather than the fuller discussion in the book.

The Language Myth is written for a general audience—not specifically professional linguists—and takes the form of an evidence-based rebuttal of aspects of the world-view developed in the popular, best-selling books written by Professor Steven Pinker of Harvard University. Indeed, Pinker’s first popular book, The Language Instinct, published back in 1994, provides my book with its title, albeit with a twist: The Language Myth plays on Pinker’s book title, which I cast as the eponymous ‘language myth’. Claiming language to be an instinct is self-evidently a myth, as first pointed out by psychologist Michael Tomasello in 1995—see his book review here.

But importantly, The Language Myth directly takes on what I see as the larger theoretical and ideological world-view of what I have elsewhere dubbed ‘rationalist’ language science. While my target is the presentation in Pinker’s various books, it necessarily encompasses more than just the research programme initiated by Chomsky and his co-workers.

It also addresses fundamental issues and questions in cognitive science more generally, and the range of Anglo-American linguists, psychologists and philosophers of the second half of the twentieth century who helped shape it. For instance, I consider the nature of concepts, our ‘building-blocks’ of thought—and whether these might be innate, in some meaningful sense; the relationship between language and the communication systems of other species; whether language, and the mind more generally, might consist of distinct, and enshrined neurological systems—sometimes referred to as ‘modules’—which evolved independently of one another, for a specific mental function; whether the human mind has its own innate mental operating system—sometimes referred to as ‘Mentalese’, or our Language of Thought; and whether language can, in some shape or form, influence habitual patterns of thought—sometimes referred to as the Principle of Linguistic Relativity, famously proposed by Benjamin Lee Whorf (and not to be confused with the straw-man argument for linguistic determinism—the idea that thought is not possible without language; thought clearly is possible without language, as we know from research on pre-linguistic infants, adults who have suffered language loss—known as ‘aphasia’—as well as studies of other species, which often have sophisticated conceptual capacities in the absence of language; Whorf explicitly argued against linguistic determinism).

The rationalist world-view boils down to the claim that the linguistic and cognitive capacities of humans must ultimately, and at least in outline, be biologically pre-programmed: that there’s no other way, ultimately, to account for what appears to be unique to our species. In The Language Myth, I argue that there are six component ‘sub-myths’ that make up, and mutually inform and sustain this particular stance. I dub them ‘myths’, because they were proposed, in most cases, before any real evidence for or against was available. And since evidence has become available, most objective commentators would be hard-pressed to say that any of these ‘myths’ have much in the way of clear-cut evidence to support them—I take a slightly stronger position, of course; my assessment is that there is almost no credible evidence. So, here are the six:

Myth #1: Human language is unrelated to animal communication systems.
The myth maintains that language is the preserve of humans, and humans alone; it cannot be compared to anything found amongst non-humans, and is unrelated to any non-human communicative capability. And the myth reinforces a view that there is an immense divide that separates human language from the communicative systems of other species. And more generally, it separates humans from all other species. But recent findings on the way other species communicate, from apes to whales, from vervets to starlings, increasingly suggest that such a view may overstate the divide that separates human language and non-human communicative systems. Indeed, many of the characteristics exhibited by human language are found, to varying degrees, across a broad spectrum of animal communication systems. In point of fact, we can learn more about human language, and what makes it special, by seeking to understand how it relates to and is derived from the communication systems of other species. This suggests that although human language is qualitatively different, it is related to other non-human communication systems.

Myth #2: There are absolute language universals.
Rationalist linguistics proposes that human babies enter the world pre-equipped to learn language. Language emerges effortlessly and automatically. And this is because we are all born with a Universal Grammar: a pre-specification for certain aspects of grammar; whatever the ultimate form of these putative ‘universals’ might be—a universal being a feature of grammar that is, at least in principle, capable of being shared by all languages. Moreover, as all languages are assumed to derive from this Universal Grammar, the study of a single language can reveal its design—an explicit claim made by Chomsky in his published writing. In other words, despite having different sound systems and vocabularies, all languages are basically like English. Hence, a theoretical linguist, aiming to study this innate Universal Grammar, doesn’t, in fact, need to learn or study any of the exotic languages out there—we need only focus on English, which contains the answers to how all other languages work. But like the myth that language is unrelated to animal forms of communication, the myth of language universals is contradicted by the evidence. I argue, in the book, that language emerges and diversifies in and during specific instances of language use.

Myth #3: Language is innate.
No one disputes that human children come into the world biologically prepared for language—from speech production apparatus, to information processing capacity, to memory storage, we are neurobiologically equipped to acquire spoken or signed language in a way no other species is. But the issue under the microscope is this: the rationalist linguistics world-view proposes that a special kind of knowledge—grammatical knowledge—must be present at birth. Linguistic knowledge—a Universal Grammar that all humans are born with—is hard-wired into the micro-circuitry of the human brain. The view that language is innate is, in a number of respects, highly attractive; at a stroke, it solves the problem of trying to account for how children acquire language without receiving negative feedback, from their parents and caregivers, when they make mistakes—it has been widely reported that parents, for the most part, don’t systematically correct errors children make as they acquire language. And children can and do acquire their mother tongue without correction of any sort. Moreover, children have acquired spoken language before they begin formal schooling: children are not taught spoken language, they just acquire it, seemingly automatically. But, such a strong view arguably eliminates the need for much in the way of learning—apart from the relatively trivial task of learning the words of whatever language it is we end up speaking. The fundamentals of grammar, common to all languages, are, at least in some pre-specified form, present in our brains prior to birth, so the language myth contends. But as I argue in the book, a large body of evidence now shows these specific assumptions to be incorrect.

Myth #4: Language is a distinct module of the mind.
In western thought there has been a venerable tradition in which the mind has been conceived in terms of distinct faculties. With the advent of cognitive science in the 1950s, the digital computer became the analogy of choice for the human mind. While the idea that the mind is a computer has been a central and highly influential heuristic in cognitive science, the radical proposal that the mind, like the computer, is also modular was made by philosopher of mind Jerry Fodor. In Modularity of Mind, a now classic book published in 1983 whose reverberations are felt to this day, Fodor proposed that language is the paradigmatic example of a mental module. And this view, from the perspective of rationalist linguistics, makes perfect sense. According to Fodor, a mental module is realised in dedicated neural architecture. It copes with a specific and restricted type of information, and is impervious to the workings of other modules. As a consequence, a module can be selectively impaired, resulting in the breakdown of the behaviour associated with the module. And as a module deals with a specific type of information, it will emerge at the particular point during the life cycle when it is needed. Hence, a mental module, in developmental terms, follows a characteristic schedule. The notion that the mind is modular might, on the face of it, make intuitive sense. In our everyday lives we associate component parts of artefacts with specific functions. The principle of modularity of design is both a practical and sensible approach to the manufacture not just of computers but of many, many everyday commodities, from cars to children’s toys. However, the evidence, as I argue in the book, provides very little grounds for thinking that language is a module of mind, or indeed that the mind is modular.

Myth #5: There is a universal Mentalese.
The language myth contends that meaning in natural languages, such as English, Japanese or whatever, derives, ultimately, from a universal language of thought: Mentalese. Mentalese is the mind’s internal or private language, and makes thought possible. It is universal in the sense that all humans are born with it. It is language-like, consisting of symbols, which can be combined by rules of mental syntax. Without Mentalese we could not learn the meanings of words in any given language—spoken or signed. But as I show in the book, Mentalese assumes a view of mind that is wrong-headed: it assumes that human minds are computer-like. It also suffers from a number of other difficulties, which make this supposition deeply problematic.

Myth #6: Language does not influence (habitual patterns of) thought.
While everyone accepts that language affects thought in the sense that we use language to argue, persuade, convince, seduce and so on, according to the myth, thought is, in principle, independent. The myth contends that the Principle of Linguistic Relativity—that systematic patterns in grammatical and semantic representations across languages influence corresponding differences in patterns of thought across communities—is utterly wrong. As I show in the book, not only does Pinker (like other rationalists) mischaracterise the thesis of linguistic relativity—that the language we speak influences how we habitually think, categorise and perceive the world—he is also wrong in another way. Despite Pinker’s assertion to the contrary, there is now a significant amount of scientific evidence suggesting that, in point of fact, the linguistic patterning of our native tongue has indelible and habitual consequences for how we perceive the world. Of course, the question then arises as to how significant, in terms of influencing individual and cultural world-views, one takes this evidence to be. In a recent book, The Language Hoax, its author, John McWhorter, plays down the significance of the relativistic effects of different languages on the minds of distinct communities of language users. While I disagree with McWhorter’s position—and his review of the relevant evidence is at best partial—given the sophisticated methodologies that now exist for directly and indirectly investigating brain function during routine cognitive and perceptual processing, any objective commentator would be hard-pressed to deny the relativistic influence of language on non-linguistic aspects of mental function.

Ultimately, whether or not one accepts the general argument I make in The Language Myth boils down to one’s ideological as well as one’s theoretical commitments. Academic research, like any other human endeavour, inhabits a socio-cultural niche. And ideas arise from assumptions and principles, sometimes explicitly rehearsed, sometimes not, cocooned within the institutional milieu that helps give them life and sustain them. In terms of the specifically Chomskyan element(s) of the rationalist world-view that I argue against, my view is that perhaps most damaging of all has been the insistence that the study of language can be separated into two distinct realms: ‘competence’—our internal, mental knowledge of language—and ‘performance’—the way in which we use language. Chomsky’s position is that performance arises from competence—given his assumption that fundamental aspects of competence—our Universal Grammar—are, in some sense, present at birth. Hence, competence, rather than performance, constitutes the proper object of study for language science. But I, and a great many other linguists, believe that the evidence now very clearly shows this perspective to be wrong-headed: our knowledge of language, so-called ‘competence’, in fact arises from use, from ‘performance’. And Chomsky’s logical error, as I characterise it, has held the field of (Anglo-American) linguistics back for too long.

My rationale for writing The Language Myth, and debunking the world-view presented in Pinker’s popular writing, was the following. Pinker’s popular presentation of rationalist cognitive science, at least amongst undergraduate and beginning graduate students and the informed lay audience, is arguably better known than the work of Chomsky, Fodor and the other leading lights of rationalist cognitive science. And his characterisation—whether or not one likes the analogy of language as an ‘instinct’ that Pinker coined—of language and the mind as, ultimately, biological constructions, is widely believed. Many of the standard textbooks used in the stellar universities across the English-speaking world promote Pinker’s works as essential reading. Moreover, they portray the sorts of arguments he promotes as established fact. Things are really not that clear-cut. At the very least, the (popularisation of the) rationalist world-view is on very shaky ground indeed. I, of course, didn’t write The Language Myth for committed rationalists; I don't pretend to be able to convince them—it appears, to me at least, that in the case of many such colleagues, their commitment is ideological, rather than being based on an objective and critical evaluation and appreciation of the voluminous evidence. And of course, while they may accuse me of being partial and/or prone to misunderstanding in my presentation, as I show in The Language Myth, the same accusation must then be applied to Pinker, but with far greater force!

In my next few posts, I’ll be examining some of the evidence for and against each of the component myths that make up the rationalist world-view. And in so doing, I’ll also address some of the criticisms raised by Chomskyan colleagues who have objected to my portrayal of things. Whatever one thinks on these issues, these are fascinating times in the study of language and the mind, and an exciting time to be an academic linguist. And my advice to all objective and curious-minded people is to read The Language Myth, and make your own mind up. Some representative and high-profile reviews of the book are below, to give you a flavour of what’s in store.

Book review in The New Scientist 18 Oct 2014
Book review in the Times Higher Education 13 Nov 2014

 

Bad Leaders Hold Onto Power


There are two sides to leadership.  On the positive side, great leaders can make a big difference in the world.  They can inspire others to share a vision and to work together to achieve great things.  On the negative side, there are comforts that come with leadership roles including higher salaries, respect, and other perks.  So, when someone attains a leadership role, they are reluctant to give it up. 

Unfortunately, the behaviors that people may engage in to hold onto a leadership role once they have it can undermine the effectiveness of the group.  An interesting paper by Charleen Case and Jon Maner in the December 2014 issue of the Journal of Personality and Social Psychology explored some of these behaviors. 

In one study, undergraduates were told that they were going to be put into small groups with the chance to solve puzzles for payment.  Each participant was housed in a different room, so that every participant could be led to believe that he or she was being put in a leadership role.  The participants were told they were given their leadership role because they scored well on a pre-test for the game.  Leaders were allowed to determine how the prize payment from the game was allocated to the players.

In the first round of the game, the participant solved a series of puzzles and was told their fellow teammates were doing so as well.  The leader was then given feedback that one of the other players did better than the leader in this first round.  After that, some people were told that the group was going to have a chance to interact via chat and they could decide to elect a new leader.  Other people were told that the group was going to chat, but the leaders were told that their position was secure.  A third group was told that the group would converse, and no mention of leadership was made.  The leaders were allowed to tell each team member how many chat messages they were allowed to send.

Finally, after the study, each person was assessed for his or her leadership style.  Some people are dominance-oriented leaders: they hold onto leadership by dominating others.  Other people lead in a prestige-motivated way: they gain the admiration and respect of those they lead.

The results are a bit complicated, though they make sense.  When people had a leadership strategy focused on dominating others (rather than earning their respect), they limited access to the chat for the skilled team member in the condition in which their leadership could be challenged.  So, leaders protected their position from the most threatening team member when they felt they could lose their position. 

Limiting communication among team members is generally a bad thing to do, because team members (and particularly skilled ones) could provide advice that would help others.

A second study used a similar method, except that leaders had the option of determining the location where team members would sit.  In this case, leaders with a dominating leadership style whose leadership was in jeopardy would isolate the most talented team member from everyone else.  Those with a respect-based leadership style, or dominating leaders whose position was not in jeopardy, selected seating strategies that integrated the talented group member with the other members of the team.

These studies suggest that people who are prone to protect their power by dominating others will engage in behaviors that promote their own interests over those of the team in cases where their power is in jeopardy.  A limitation of these studies is that the participants were all college students who probably do not have a lot of leadership experience.  That said, these tendencies are likely to influence even more experienced leaders, and so they reveal a tendency that leaders need to overcome to ensure that they act in the best interests of their team. 

Follow me on Twitter

And on Facebook and on Google+.

Check out my new book Smart Change.

And my books Smart Thinking and Habits of Leadership

Listen to my radio show on KUT radio in Austin Two Guys on Your Head and follow 2GoYH on Twitter and on Facebook.

Are Recommendations on TV Medical Talk Shows Valid?


There are many contexts wherein consumers must establish the veracity of specific information.  Take for example advertising claims.  Is the communicated information credible and trustworthy? The ease with which a consumer might gauge the veracity of advertising claims depends on whether they are considered search claims, experience claims, or credence claims (see for example Ford, Smith, & Swasy, 1988).  Search claims are verifiable prior to a product’s purchase; experience claims are verifiable only after the product is bought and used; and credence claims are difficult if not at times impossible to verify.  Irrespective of whether the advertised information is verifiable or not, it is expected that advertisers will engage in truthful advertising (in the American context, this is regulated by the Federal Trade Commission).  That said, I have written several articles about consumers’ proclivity to be duped by a wide range of quack products and services including the Q-Ray bracelet (see also my YouTube clip THE SAAD TRUTH_14), astrology, therapeutic touch, and coffee tasseography (“reading” one’s future based on coffee stains left on the inside of one’s cup; see also my YouTube clip THE SAAD TRUTH_12).

Protecting consumers from faulty, incorrect, or deceptive information arises in many other contexts beyond advertising.  Take for example the mental health industry.  Patients are generally under the misguided impression that all therapeutic approaches have been fully vetted scientifically, while the reality is much more problematic. To those interested in knowing more about this matter, I highly recommend House of Cards: Psychology and Psychotherapy Built on Myth by Robyn Dawes.  While it came out in 1994, many of its points are just as valid today.  While many therapists are purveyors of utter nonsense, there is one professional group that serves as the sultans of BS: celebrities.  By virtue of their fame and fortune, they feel comfortable dispensing “expert” opinions on endless complex topics, if only because of their grandiosity and narcissism (see my earlier Psychology Today article wherein I discuss how celebrities’ grandiosity and narcissism permit them to engage in such behaviors).  A classic example of the harmful effects of this phenomenon is the Jenny McCarthy debacle relating to the MMR vaccine-autism link (see my earlier Psychology Today article on this issue here). McCarthy felt that she had sufficient medical and scientific knowledge to offer a clear and specific medical recommendation, namely that parents should refrain from having their kids receive the MMR vaccine.  Her bewildering arrogance has yielded tangible tragic consequences, for she has convinced innumerable people to follow her advice.  Screw the recommendations of the National Institutes of Health…I am Jenny McCarthy!

While one should be wary of medical advice dispensed by celebrities, what about advice dispensed by celebrity doctors? How veridical and scientifically substantiated are their recommendations?  Christina Korownyk and her colleagues authored a recent study in The BMJ (stands for The British Medical Journal) wherein they examined this exact issue by analyzing the recommendations made on two television medical talk shows: The Dr. Oz Show and The Doctors.  The researchers taped 78 and 79 episodes of the two shows respectively (from January to May 2013).  Forty episodes from each show were then randomly selected and all of the recommendations were noted.  Subsequently, 80 recommendations from each show were randomly chosen for further analysis.  Specifically, four of the researchers searched the scientific literature to establish the veracity of each of the given recommendations: they gauged the evidentiary support (or lack thereof) for a specific piece of advice and the extent to which it was believable.  Here are the key results:

For The Dr. Oz Show:

1) The percentage of recommendations that yielded evidentiary support, contrary support, or no evidence was 46%, 15%, and 39% respectively.

2) 33% of all recommendations were categorized as believable or somewhat believable.

For The Doctors:

3) The percentage of recommendations that yielded evidentiary support, contrary support, or no evidence was 63%, 14%, and 24% respectively.

4) 53% of all recommendations were categorized as believable or somewhat believable.

Bottom line: Roughly one-third (The Dr. Oz Show) and one-half (The Doctors) of the recommendations dispensed on these two hugely popular medical TV shows are believable.  This is quite an astonishing set of findings and one that reminds us all of the following edict: caveat emptor (buyer beware), even in the context of the consumption of information!

Please consider subscribing to my YouTube channel, liking my Facebook page, and following me on Twitter (@GadSaad).

 

Source for Image:

http://bit.ly/1r3h1tE

Freedom from Harmful, Negative Thinking


What’s a belief? It’s something that you accept as true. However, some realities differ from what you believe. At one time most people on the planet believed that the world was flat. Some believed that ships that venture outside of the sight of land would fall off the edge of the earth and into the waiting mouths of terrible monsters. Today, some believe that they can be safe and secure only by acting perfect. That’s a formula for insecurity.

Let’s take a cook’s tour into the world of beliefs. Perhaps you’ll find a way to discover and neutralize false beliefs that inhibit you. Perhaps you’ll find a way to build on a helpful belief system that feels emotionally freeing.

Can Two People See Things Differently?

Because we don’t always share the same beliefs, one person’s reality about a situation can differ from that of another:

  • A socially-oriented person seeks opportunities to converse with others--even strangers in a grocery store. This person assumes that people are interested in carrying on casual conversations.

  • A socially-anxious person pretends not to notice an acquaintance in a grocery store.  This person feels apprehensive about rejection, and expects it.

Whose belief is right? Maybe a better question is which belief is more likely to be helpful and which belief is more likely to be harmful?

Although the grocery store situation is the same for both the socially-oriented and socially-anxious person, each has a significantly different set of beliefs about connecting with people in the store.

We’re not sure about the socially-oriented person’s beliefs. But it is a relatively safe bet to say that beliefs that lead to social anxiety are likely to do more harm than good.

Your Beliefs Are Guides

Dr. Albert Ellis, who pioneered rational emotive behavior therapy, proposed that we live our lives in three main ways: cognitively, emotively, and behaviorally.  These factors blend together. Each influences the other. That means,

  • How you think affects how you feel and what you do.

  • What you do can result in changes in your thinking and emotions.

  • How you feel can trigger thoughts that fit with how you feel and extend into actions that fit your emotions and thinking.

  • Changes in one area will, in one degree or other, affect what happens in the other areas.

Beliefs shape perspective. Some erroneous perspectives can get you into emotional hot water. Ellis preferred the word “irrational” to describe these harmful beliefs.

How do you know if a belief is irrational?  That’s a controversial issue. Here is one angle. A belief is irrational if it:

  • Interferes with the achievement of your constructive goals and interests. (You want to complete a degree but procrastinate on applying.)

  • Is unreasonable, unrealistic, and illogical in the context where it occurs.  (You think that society has it in for you, and you expect to be disadvantaged.)

  • Is grounded in false assumptions and overgeneralizations. (You believe that you have a fatal disease that your physicians have yet to detect.)

  • Negatively affects your relationships.  (You think that others should adore you for whatever you do, and you demand it.)

If you are irrationally getting in your own way of achieving your most meaningful and attainable goals, there are alternative ways of viewing a situation that better fit the facts and reality. Finding those alternatives, and believing them, is part of the solution. But first, you may have to map a cognitive, emotive, behavioral irrational pattern. For example, if you fear failure, you may procrastinate excessively and use procrastination as an excuse for falling short of what you can capably do. However, to develop this type of analysis, you may first have to recognize when you are engaged in irrational thinking.

Recognizing Irrational Thinking

If your belief interferes with seeking treatment for a disease, it’s irrational. However, not all irrational beliefs are harmful. If you believe that crickets chirp chants they learned from monks, there is no proof for this.  Yet, this belief may not cause you any meaningful harm.

How do you recognize an irrational belief that feels real and true but is actually false and harmful? Recognizing negative patterns, by their consequences, is a good first step. Here are three irrational thoughts, examples of consequences, and an alternative way of thinking:

  • You feel anxious and depressed. You view yourself as powerless to change. You do nothing meaningful to change. Is it possible that powerless thinking is a detour from taking steps that can make a positive difference?

  • You believe that people don’t like you—even people you’ve yet to meet. Believe this, and you’ve burdened yourself with a painful assumption that you can’t prove. Nevertheless, you actively avoid social situations. Is it possible to give others the benefit of the doubt?

  • You can’t stand not knowing what will happen next in your life. Believe that you must have certainty now, and you’ll suffer from the anxiety of uncertainty. You act as if you were risk-averse and you play it too safe. Is it possible to accept that uncertainty is a normal part of life and that you’ll cross many unexpected bridges?

Changing harmful irrational belief systems—and their accompanying emotions and behaviors--may be simpler than you think.

Changing Irrational Belief Systems

If irrational beliefs can negatively affect how you feel and what you do, can rational beliefs reverse this effect? It depends on the belief.

Stanford University professor Albert Bandura describes self-efficacy as the belief that you can organize, coordinate, and regulate your thoughts, feelings, and behaviors to achieve meaningful goals. Bandura’s self-efficacy theory enjoys strong research support. 

Like most things in life, how you execute a self-efficacy solution makes a difference. For example, consider using self-efficacy as part of a psychological homework assignment. Pick a constructive goal, such as defeating powerless thinking. Work at executing a self-efficacy belief. You may show yourself that you are not powerless. For example, when you believe you are powerless to act, and you do act following a self-efficacy belief, you have an incongruity to resolve: either you are totally helpless, or you can take corrective actions. The fact that you acted is evidence for the second belief and against the first.

Let’s reverse our sample criteria for an irrational belief to create a rational, action-oriented perspective. For example, does your belief:

  • Support the achievement of constructive goals and interests?

  • Sound reasonable, realistic, and logical in the context where it occurs?  

  • Fit with facts or testable assumptions?

  • Support positive relationships with significant others in your life?

If your belief reflects the above, great!  If not, then what can you do to substitute a rational perspective for an irrational one?

For more examples and exercises on combatting irrational belief systems and developing rational ones, click on The Cognitive Behavioral Workbook for Anxiety (Second Edition).

Photo: A Crack Between Worlds by Dale Jarvis

This blog is part of a series to celebrate the 100th and 101st anniversaries of Dr. Albert Ellis’ birth. Ellis is the founder of rational emotive behavior therapy and the grandfather of cognitive-behavior therapy.

Albert Ellis Revisited (Carlson & Knaus 2013) is the Albert Ellis Tribute Book Series centennial book.  The publisher, Routledge, offers a 20% discount on the book. Control click on this link: Albert Ellis Revisited. Type the code Ellis for the discount. The book qualifies for free shipping and handling. Bill Knaus’ royalties from this book go directly to the Denan Project  charity. When you buy the book, you are helping yourself by learning ways to live life fully, and you are helping bring irrigation, crops, and health care to destitute areas of the world.

For more information on rational emotive behavior therapy, click on Albert Ellis’ official website: Albert Ellis Network:  http://rebtnetwork.org/

For other articles in this centennial (and beyond) Albert Ellis tribute blog series, copy and paste any of the links below into your browser's address bar:

You Can’t Always Get What You Want: http://www.psychologytoday.com/blog/science-and-sensibility/201411/you-can-t-always-get-what-you-want

Do One Thing and Stop Procrastinating: http://www.psychologytoday.com/blog/science-and-sensibility/20141...

Steps to Overcome Public Speaking Anxiety: http://www.psychologytoday.com/blog/science-and-sensibility/20140...

When It Comes to Love and Romance, What's Fair? What's Not?: http://www.psychologytoday.com/blog/science-and-sensibility/20140...

Three Core Anxieties and How to Calm Them: http://www.psychologytoday.com/blog/science-and-sensibility/20131...

12 Key Ideas for Self-Liberation: http://www.psychologytoday.com/blog/science-and-sensibility/201406/12-key-ideas-self-liberation

Ten Commandments to Stop Quick Ejaculation: http://www.psychologytoday.com/blog/science-and-sensibility/20131...

13 Tips to Make Self-Help Therapy Work for You: http://www.psychologytoday.com/blog/science-and-sensibility/20140...

Escape the Guilt Trap: http://www.psychologytoday.com/blog/science-and-sensibility/201312/three-core-anxieties-and-how-calm-them

5 Mental Traps Relationships Can't Escape: https://my.psychologytoday.com/blog/science-and-sensibility/20140...

Six Calming Tips for Parenting Teens: http://www.psychologytoday.com/blog/science-and-sensibility/20141...

 

Black People Have a Right to Be Angry


Those in power don’t care if thousands of people disagree with the Brown and Garner grand jury decisions as long as white supremacy remains intact. Peaceful protests mean that rich white people will continue to live comfortably. Marching and lying down in the streets does not change the system. Change will occur when the lives of white people are disrupted. If the rioters had destroyed affluent white neighborhoods, the “conversation on race” might have led to some action. When black people destroy their own communities, those in power get their goals met without having to do the dirty work themselves. The changes I’m recommending do not require violence. A psychological shift in the mentality of the majority is needed in order for lasting change to occur.

 

The President can’t speak his mind because he’s a pawn in a system run by white people. He’s smart enough to know that the protests were about more than two trials. African Americans lag behind whites on every quality of life indicator and have been dealing with oppression since the country’s inception. If the President admits it, white people won’t vote for his party in the next election. Worse yet, he realizes that black leaders who call for change, and even white leaders who seek to balance the power structure risk losing their lives. Although the costs are high, it would be nice if political decision-making could for once be motivated by humanity rather than greed.

 

Black people have a right to be angry and frustrated. Although white people think racism is a thing of the past, ethnic minorities (African, Latin, and Native Americans) disproportionately live in poverty. On average, they earn 20-35% less income than whites, receive an inferior education, are more likely to occupy dangerous jobs, and live in polluted, run-down neighborhoods. They have worse health outcomes, higher incidence of disease, and they die younger. Ethnic minorities know they live in a racist system but lack the power to change it on their own.

 

It’s time for white people to admit that they benefit from racism. They have benefitted in many ways throughout history such as profiting from minority exploitation, receiving a superior education, being allowed to purchase homes, securing loans at lower interest rates, being able to vote, and being granted citizenship when other racial groups could not. They haven’t had their families torn apart through slavery, unfair imprisonment, deportation, violence, or murder at the same rates as blacks, if at all.

 

There are too many instances of people being killed when they speak about racism. Everyone knows the famous examples such as Abraham Lincoln, MLK, and Malcolm X, but community activists are targets too. A couple of weeks ago, after watching a documentary called “If These Halls Could Talk,” I learned that a California State University student, who had openly discussed racism in the film, was murdered. Certainly there are thousands more examples that haven’t been covered by the national or even local news. I learned about the CSU student because my colleague had heard about the incident through a friend.

 

Overt discrimination, such as Daniel Pantaleo killing Eric Garner, is not as common as the implicit mechanisms used by a majority of Americans, such as doing nothing. Although peaceful protesting may offer psychological comfort to whites, more is needed to effect change. Establishing an equitable system will require white people to give up the privileges they’ve been granted simply by being born white. Structural pluralism must be used to promote ethnic minorities into positions of power. Each person, irrespective of race, should recognize his or her role in maintaining the current system and commit to changing it.

 

Malcolm X said, “I believe that there will ultimately be a clash between the oppressed and those that do the oppressing. I believe that there will be a clash between those who want freedom, justice and equality for everyone and those who want to continue the systems of exploitation." It is important to assess which side you are on—equality or oppression—and if equality, to take steps to make it happen. My career affords opportunities to address racism, such as writing this post. I also teach a university course in which I discuss oppression and injustice and encourage my students to effect change through voting, running for office, mobilizing their community members, blogging, and joining clubs to fight racism. Given your position in society, what can you do to make it happen?

 


Writing Tip #5: Hearing Voices


I have been an FBI profiler, a forensic scientist, a law professor who digs up the dead, and a tattooed biker from Kentucky, and yet I’m none of these things. I’m a co-author and biographer, and the trick to making this work is to create an authentic voice.

To accomplish this – at least for me – it takes intense immersion as I try to experience my day-to-day world through someone else’s perspective. I read what they read, watch what they watch, listen to music they like, meet people they know, and visit places that are meaningful to them. (For fiction, this means total immersion in your character.) 

It’s fun, even exciting, but it’s all done in the interest of focus and voice. If I want readers to feel close to the people (or characters) I’m writing with or about, I must get close to them myself. While this intimate art can throw off your personal sense of balance, if done well you can fully tell a story through a voice not your own.

Let’s consider this notion of a writer’s voice, because it’s foundational to fiction, narrative nonfiction, and even certain technical pieces. Opinion columnists rely on a distinct voice, as do movie and book reviewers. Bloggers certainly need it if they want to maintain interest, and even how-to manuals benefit from a distinct and colorful attitude. So do memoirs, autobiographies and biographies.

Getting the voice right matters. It defines how characters, real or imagined, think and speak.

Voice conveys attitude, motivation, and credibility, providing the tone through which character and setting are rendered. If you have multiple points of view, as I did with the duo-memoir of Into the Devil’s Den, you work doubly hard to become both people. (The same holds true for multiple points of view in fiction.)

You learn their belief systems, their typical word choices, their cultural background, the parameters of their experience and education, and even how they use words in a sentence.

Ultimately, it’s the attitude that makes each voice distinct. 

Into the Devil's Den is a good example. The FBI, just recovering from the 1995 Oklahoma City bombing, was expanding its domestic terrorism program and needed informants to infiltrate the Aryan Nations, the most dangerous white supremacy group in the country. Special Agent Tym Burkey worked out of Ohio, where a particularly fiendish AN member, Ray Redfeairn, was using his pulpit to connect with other white power groups.

Burkey needed someone who could win over these paranoid militants, and circumstances brought him together with Dave Hall, a tattooed, 350-pound, six-foot-four former biker with a photographic memory and a firm sense of decency. Hall agreed to take the job.

As he penetrated this violent society of hate mongers, he looked to Burkey to watch his back, while Burkey prayed that Hall would not be seduced by Redfeairn’s manipulations. Neither quite knew what to expect from the other, so to convey this tension I had the story unfold through their shifting perspectives and the way that their partnership evolved into an unlikely friendship. In the process, they helped stop another major bombing and an assassination. 

Hall developed innovative strategies to maintain his role and avoid being “erased,” even as the work battered his health and cost him relationships. He learned the specialized vocabulary, gestures, and mannerisms expected of insiders, and had to deal with suspicious members who tested his loyalty.  Because he was so good at it, he earned several promotions, which gave him unprecedented access to the top brass.

Here’s where the need for distinct voices arises: what Burkey relates about the group that Hall infiltrates heightens the suspense, because the reader gets privileged access to information, and awareness of danger, that Hall does not have. The necessity of working blind spices Hall’s side of the tale with a heightened anxiety that only Burkey’s friendship can assuage. Burkey just hopes he can keep Hall alive.

The task for me was to take what Hall had written as a daily journal and shape it into a suspenseful story with a clear narrative structure. I also interviewed Burkey, so that, in strategic places, I could slice in his perspective to advance the story without impeding the pace. This meant getting Burkey to talk about his feelings, too, because he had to grow beyond his role as an agent and come alive as a person. 

Hall’s was the easy voice, because he wrote the way he talked, in a "good ol’ boy" manner that was effortless to absorb. Burkey’s voice was more formal and required more attention, although he was very easy to talk to as well. The great thing about these two was how distinct their voices were. My job was to preserve that quality, so I made each a foil for the other whenever I could. Response and reaction were the key interactions.

There’s one more angle on the mastery of voice I want to mention: protecting the voice through the editorial process. Different editors wanted to change certain things, and sometimes I accepted this, but often I had to call Burkey or Hall, because I sensed the request violated who they were. Putting words into Burkey’s mouth that he’d never say, for example, or eliminating a peculiar phrase that Hall naturally used would have made them different from who they were. I had to fight for the integrity of the narrative.

Sometimes I lost, but mostly I used the writer’s trusty friend, “stet” (leave it alone!), and got it through.

There’s even a chapter that has made grown men cry (and me, too). For me, it was another step in the art of crafting voice.

 

A NEW Paradigm to Fix the World - A Failure to Consider


In the iconic movie Cool Hand Luke, the prison captain played by Strother Martin utters, on more than one occasion, the famous line, “What we’ve got here is failure to communicate.”

That situation is every bit as present today as it was then. And the reason it continues, the thing that underlies a failure to communicate, is a failure to consider.

Truly considering another person’s POV means comprehending it, understanding it, and, most importantly, “feeling” it from their side. I think few would disagree with psychoanalyst Wilfred Bion’s suggestion for accomplishing the best possible communication, which is to “listen without memory or desire.” By that he meant listening without a past or future personal agenda you are trying to plug someone into, and without a POV from which you listen selectively in order to confirm it.

To improve any relationship, deeply considering someone else’s POV is the very thing that we most need to do, but few want to do.

We all know that we should listen to and consider what others are saying (as we would have them do for us), so why is it that most people resist it?

The simple reason is that as people become more and more specialized in their thinking, they become more steadfast in their resulting conclusions and beliefs, and more hardwired in the filter through which they view the world.

When you’re a hammer, the world’s a nail. When you’re China, America’s a bully. When you’re America, you believe that you’re right, which is why you believe the entire world should speak English.

Here’s the real snag. The more specialized and siloed your POV, the narrower the area in which you feel competent, confident, and in control. This may explain why, in many public companies, the Board of Directors doesn’t even know the name of the CTO. Those Directors don’t want to understand how or why technology works; they just want it to work. CTOs who make the mistake of talking in technological terms beyond the comprehension of business executives can cause those executives, especially ones with hubris, to feel intimidated and stupid, which they don’t take kindly to. That was exactly the case in the 1990s when Jack Welch famously said, “I avoided the Internet because I didn’t know how to type.” The result is the marginalizing of people whose POV most causes you to lose your sense of competence, confidence, and control.

Another reason we avoid considering another person’s POV, and especially their feelings, is the belief that if we did so, and then saw their beliefs as valid, it might cause us to loosen our grip on our position or weaken it (after which we’d get thrown under the bus by a superior when we returned from a negotiation, railing that we had rolled over), making us vulnerable to the other side taking control over us.

Ironically, anyone who has had the good fortune of seeing how useless it is to remain entrenched in a non-working POV, and of then truly empathizing with the other side (a la Ronald Reagan’s “Call me Ron” olive branch to Mikhail Gorbachev, which may have foreshadowed the ending of the Cold War), usually realizes that it opens doors much more than it opens you to an assault. That is because bared teeth beget bared teeth, while bared necks beget bared necks.

There is actually a neurological basis for this phenomenon. Empathy, with its true desire to listen to, understand, and feel another person’s POV, is a sensory function, whereas anger, usually in retaliation for some perceived injury, is a motor function. In general you can’t do both at the same time; in common parlance, you can’t walk in someone else’s shoes and step on their toes at the same time.

Interestingly, there appears to be at least one area of our brain where sensory and motor functions occur almost simultaneously: the part of the brain containing what are referred to as mirror neurons. Mirror neurons were first discovered in the 1990s in macaque monkeys and were dubbed “monkey see, monkey do” neurons, because they seemed to be involved when monkeys imitated other monkeys or even other primates. Further study has revealed that they are also found in human beings, where their functioning seems related to, or possibly causative in, not only imitation but also learning and empathy. They are responsible when we wince upon seeing, or even imagining, someone cutting themselves on a piece of paper, or when we imagine the sound of nails on a chalkboard.

Recent research has also implicated improperly functioning mirror neurons in autism and autistic spectrum disorders such as Asperger’s syndrome, in which affected individuals don’t seem able to pick up on social cues and mirror other people.

My hope is that those of you who have not already bailed on this blog will consider it, add your comments, criticisms, and refinements, and feel free to rip it apart as you see fit. If you do, however, please offer up your solutions to the stalemated and too often violently retaliatory world we live in.

Michael Dowd Exclusive: The Future's Calling Us to Greatness



Michael Dowd is a bestselling evolutionary theologian and evangelist for an honorable relationship to the future. His book Thank God for Evolution was endorsed by six Nobel Prize-winning scientists, noted skeptics, and religious leaders across the spectrum. He's been featured in The New York Times, LA Times, The Wall Street Journal, Washington Post, Newsweek, Discover, and on CNN, ABC News, and Fox News. He and his wife, science writer Connie Barlow, have traveled the country sharing The Great Story with thousands of people. Together they've spoken to more than 2,000 groups across North America.

When I personally saw Connie and Michael present "The Great Story" almost a decade ago, I was powerfully moved. Here was an epic story that united what we know about the science of our universe and Life on Earth with a reality-based, paradigm-shifting vision of humanity's place on Earth and in the cosmos. That story helps frame the central question of our time: How can we move forward as a species, at this critical juncture in history, to a place of global security, justice, and harmony for all?

If you haven't come across Michael and Connie's work before, you're in for a potentially life-changing experience. If you have, then you already know the epic potential for human capacity their legacy invokes. [For more on this work and others like it, definitely visit The Big History Project and Michael's review of it.]

When I learned recently that Michael had turned his attention to gathering the voices of many others working fiercely to enact the beneficial change (the harmony!) the world needs via the marriage of science, sustainability, and inspiration, I was thrilled. Here were the exact voices and ideas I've been eager to spotlight at Mothering Nature. Michael's new series of Skype interviews with dozens of the world's most respected leaders and visionaries is called The Future Is Calling Us to Greatness. It's part of the prescription the world requires for systemic, cultural healing and for the immediate changes we need to address the climate crisis, its underlying causes and symptoms, and everything inter-related to these. Michael kindly agreed to let Mothering Nature have a sneak peek at some of those interviews, as well as this exclusive interview with him about why he put together this symposium.

The Future Is Calling Us to Greatness speakers include: 

This is an unusual mix of people, but they are among the most important change agents of our time. Central to many of these discussions are the interlinkages among the world's greatest problems: climate change, poverty, the disparity between rich and poor, peak oil, sexism, racism, genocide, our tortured relationship with animals and the sixth extinction, and the other symptoms of a highly dysfunctional cultural system. People are now starting to see that when we get to the core of these inter-related problems, the solutions are widespread and systemically restorative. That is why Michael told me in a phone call:

"Nothing has ever fired me up more than doing these interviews. I came away from many of them with tears of hope and joy. And believe me, I know the dire nature of these problems! These people and ideas give me hope and the energy to act in the face of all the scary stuff."

Join me now to learn more from Michael about this incredible series, how it came about, and why it's so vital to today's world. Then, in the weeks ahead, look for spotlights here at Mothering Nature on some of these powerful interviews before they're released on January 26 (FREE for two weeks). You may decide there's no more important way to start your 2015: for yourself, your family and friends, and, if you are a counselor or healer, for your clients. Likewise, if you're concerned or overwhelmed about Life on Earth, species extinction, climate change, and the other ills of our current culture, and you want to know how to contribute to solving the greatest moral crisis of our time, then this series is for you.

Greetings, Michael. I’m thrilled to welcome you to Mothering Nature. So much of your life’s work dovetails with my mission: to help unleash an epic force for harmony on Earth. Could you share with us a little about your background and how you work to unleash that force for harmony?

Sure, Rachel. I’ve been a minister for 25 years and I have always had a passion for where science, inspiration, and sustainability intersect. I pastored three United Church of Christ congregations from the mid-1980s to mid-1990s. I then spent several years working at the cutting edge of eco-theology, environmental sustainability, and community organizing. And since April 2002, my wife, noted science writer and evolutionary educator Connie Barlow, and I have crisscrossed North America speaking in churches and colleges across the continent. We've spoken to some 2,000 secular and religious groups over the past twelve years—always focusing on what we call “the sacred side of science and an honorable relationship to the future.”

Two years ago climate change went from being a back-burner issue for us to being front and center. Our climate change “come to Jesus moment" happened when we watched David Roberts’ 2012 TEDx talk, Climate Change Is Simple. Now, and for the foreseeable future, everything I do, everything I teach and preach, and pretty much everything I make time for revolves around whether or not it contributes to a healthy future.

My great mentor, cultural historian and acclaimed “geologian” Thomas Berry, famously wrote, "As a species, our predicament and our way into the future can be summarized in three sentences: (1) Throughout the 20th and early 21st centuries, the glory of the human has become the desolation of the Earth. (2) The desolation of the Earth is becoming the great shame of the human. (3) Therefore, all programs, policies, activities, and institutions must henceforth be judged primarily by the extent to which they inhibit, ignore, or foster a mutually enhancing human-Earth relationship." So I guess you could say that that's my mission in a nutshell: to help make sure that my Christian tradition’s sense of mission, theology, doctrine, etc., is pro-future rather than anti-future. To my mind, it all boils down to that distinction.

Part of the reason we’re in touch is because you’ve compiled a series of epic and transformative interviews called “The Future is Calling Us to Greatness” with some of the world’s most highly respected scientists, visionaries, and healers. These interviews yield exactly the sort of emergent, paradigm-shifting, and fiercely powerful support and information that Mothering Nature spotlights. They call forth and model the “epic force for harmony” that today’s world requires.

Please tell us what led you to do these interviews. Why are they so vital in today’s world? Why have you said this is “one of the most important projects of your life”?

Sure, Rachel, I’d be happy to. Throughout 2014, Connie and I have been teaching and preaching along the route of The Great March for Climate Action. We've addressed more than 100 groups this year, mostly churches but a few college and university classrooms as well, beginning in San Diego in late February and finishing up in Washington, DC, in early November. (There were roughly 50 people who marched the entire 3,000-mile journey. Sometimes we were a week or two ahead of them, sometimes a week or two behind them, and sometimes speaking at rallies along the way.)

The title of my main evening program throughout the year—a very bold and provocative multimedia program—was the same as this free online Symposium: “The Future is Calling Us to Greatness.”

(NOTE: Just a few days ago, Michael and Connie uploaded this hour-long visually stunning Keynote program to YouTube, which can be seen HERE. You can get a sense of how prophetic it is by the alternate title, “Reality Is Lord! A Scientific View of God and Why This Matters on a Rapidly Warming Planet.”)

Throughout 2014 I've also been interviewing many of the world's top experts in climate change, sustainability, and how to hold some really terrifying things in mind and heart without going crazy or wallowing in despair. How can we stay positive and engaged, and work together with people of very different backgrounds and beliefs, in the service of a just and healthy future for all?

So, yes, human psychology has been central to this project all along. Indeed, I asked the vast majority of my guests, if not all of them: “How do you stay inspired to act in the face of global scale challenges such as climate change, peak oil, the growing gap between the rich and poor, and so forth? What is it that helps you wake up each day motivated to do the vital work that you're doing?” The responses I got were deeply moving, as I’m sure you can imagine.

Are these interviews relevant to the world’s healers and psychologists? Why? Do you think there is anything more important today than providing professional healers with a powerful—unified but diverse—vision of how to act when the future calls us all to greatness? Why or why not?

I consider these interviews to be profoundly relevant to healers and psychologists of all stripes. Whatever our respective backgrounds and beliefs, we can all agree on the absolute necessity of fostering a just and thriving future for humanity and the larger body of life of which we are part. How to stay inspired and on purpose in the face of enormous obstacles, political, economic, religious, and otherwise, is something that all psychologists, psychiatrists, and others in the healing and helping professions should know how to do from personal experience, because they themselves will soon be called on to assist others in that same process.

When one looks at the world and human history through what I call “deep-time eyes,” it's not hard to imagine that our ancestors are cheering us on and the future is calling us to greatness. I would go so far, in fact, as to suggest that this may be one of the most useful and empowering beliefs a person can hold in the 21st century. Having had a 45-minute to hour-long one-on-one conversation with each of these amazing thought leaders and activists, I can confidently say that every one of them would agree with that sentiment.

Are you aware that some in the research community (including the American Psychological Association; see Further Resources below) have documented a dire need in the world for psychologists and counselors to adequately support people facing emergent psycho-social challenges related to climate change? How can the interviews you recorded help these practitioners and their clients?

No, Rachel, I was not aware of that research, but it doesn't surprise me at all! And frankly, I can't imagine any preparation more useful, more timely, and more important than psychologists and therapists carefully attending to these interviews and taking thorough notes, or, perhaps better yet, marking up the transcripts.

What was one of the more memorable moments you had as you did these interviews?

I asked almost everyone this question: "If you could have dinner with anyone in the world, who would it be?" And people had so many interesting and thoughtful answers. But it was Bill McKibben who knocked me out of my chair. He said, "I'd really like to have dinner with my wife, my kid, and my dog."

What inspires you the most about these offerings? If you could tell professional psychologists (and the rest of us, for that matter!) one thing about why listening in to these interviews will help change the world for the better, what would it be?

What inspires me the most about these conversations is simply this: no matter what our diverse backgrounds, beliefs, philosophies, theologies, political orientations or whatever, and no matter what our respective fields of expertise may be, we can all find a way to contribute powerfully to the most urgent moral crisis of our time.

Of all the things that I've accomplished over the course of my career popularizing the inspiring side of science and an honorable relationship to the future, nothing has inspired me more than these 55 interviews. And as anyone who takes the time to watch or listen to even a few of them will discover, this is not hyperbole.

Thank you so very much for talking with us, Michael. It's an honor to learn more about this series and to bring these powerful thinkers and change agents to the Mothering Nature audience and beyond.

Visit The Future Is Calling Us to Greatness and be sure to watch Michael's short video sharing more about why he (and I) are so excited about this symposium. Then see where your heart takes you... And don't forget to tune in to Mothering Nature as we spotlight some of these interviews in the weeks ahead with a sneak peek at the material.

 

Further Resources

Join the Mothering Nature conversation on Facebook and Twitter

Note: I follow the policy of the LA Times and Popular Science (as further explained at CBSNews.com) and refuse climate-change denial comments. 

Psychologists’ Involvement in Torture and the APA


This blog curates the voices of the Division of Psychoanalysis (39) of the American Psychological Association. Frank Summers, PhD, ABPP, president of Division 39, submits this article.

---

Two recent events have once again raised the distressing issue of psychologists’ involvement in the Bush Administration torture program and the role of the American Psychological Association in it. A New York Times reporter, James Risen, in his new book, Pay Any Price: Greed, Power, and Endless War, reveals new information on the APA’s conduct in forming its 2005 task force on the role of psychologists in detention settings. The second, and far more publicly discussed, development is of course the recent release of the Executive Summary of the report of the Senate Select Committee on Intelligence. Making the Executive Summary public is the first admission by the US government that it has conducted a policy of torture in detention centers around the globe. The report identifies two psychologists who contracted with the CIA to use torture techniques to extract information from detainees. Although the psychologists appear under pseudonyms, they are known to be James Mitchell and Bruce Jessen.

The release of the Executive Summary is a step toward transparency in an area of US foreign policy that the government has tried to hide from the American people since its inception in 2002.  Furthermore, it renewed national attention on our policy of torture, one of the dark episodes of US international relations.  For those two reasons, making the summary public is a welcome shift in government policy.

However, there are potentially negative consequences of the report’s release that, although little discussed, need to be a focal point for any effort to uncover the truth about psychologists’ participation in the torture program. While the systematic use of torture at Guantanamo is documented in the published summary of the Senate committee report, the role of psychologists other than Mitchell and Jessen is not mentioned. Although it is possible that other psychologists are identified in the classified report itself, which is three times as long as the executive summary, the public has been made aware only of Mitchell and Jessen, owners of a Washington-based consulting firm. They contracted with the CIA for $181 million to conduct "enhanced interrogations," $81 million of which was disbursed to their company before the contract was pulled. While Mitchell and Jessen deserve all the public scrutiny and criticism directed at them for teaching, promoting, and using illegal and unethical torture techniques in their so-called "interrogation methods," the exclusive spotlight on them has the potentially deleterious consequence of diverting attention from the widespread participation of psychologists in consulting on, and in at least one case participating in, torture at Guantanamo and other detention sites.

Disclosures of the involvement of psychologists date back to 2004, with an article by Neil Lewis in the New York Times reporting that the International Committee of the Red Cross had found that American-trained psychologists and psychiatrists were consulting on the implementation of torture at Guantanamo Bay. The responses of the two APAs, the American Psychiatric Association and the American Psychological Association (APA), are revealing. Dr. Stephen Sharfstein, then president of the psychiatric association, stated unequivocally that there is no role for psychiatrists in detention centers, and he condemned any use of psychiatry in the Bush administration “enhanced interrogation” program. By contrast, Dr. Stephen Behnke, the head of the APA ethics office, made no such definitive statement. Dr. Behnke said there were many vague areas, such as how much light detainees received was “too much,” and how much heat or cold was “too much.” To this day, the APA is the only relevant professional association that has not taken a definitive stand against its members taking part in interrogations in detention centers.

Meanwhile, evidence of psychologists’ involvement in abusive behavior continued to mount. Time magazine (Zagorin, 2006) published a detailed log of the interrogation of Mohammed al-Qahtani, who was suspected of being the “twentieth hijacker.” The 84-page log, covering a 50-day period in the winter of 2002–03, showed that the interrogators used extreme sleep deprivation, exposure to cold, prolonged standing, denial of bathroom breaks, and a variety of psychological manipulations, all of which are torture under international law, in an effort to extract information from al-Qahtani. The FBI reported that al-Qahtani hallucinated and talked to non-existent people, behavior consistent with exposure to extreme psychological stress. The log referred to a “Dr. L.” who both consulted on and was present at the interrogation. Dr. Steven Miles (2006) and two bioethicists, Jonathan Marks and Gregg Bloche (2005), identified “Dr. L.” as Major John Leso, a counseling psychologist.

The 2004 Office of Inspector General's report, released during the first term of the Obama administration, revealed that psychologists taught reverse-engineered SERE (Survival, Evasion, Resistance, Escape) techniques at a conference at Fort Bragg in September 2003 to JTF-170 personnel, the staff at Guantanamo. SERE techniques were originally devised during the Korean War to help our soldiers resist torture in case they were captured. The 2004 OIG report states that the purpose of the conference was to teach JTF-170 personnel to "reverse engineer" SERE tactics, that is, to deploy at Guantanamo the very torture techniques that the SERE program was designed to help our soldiers resist. It was shortly after this conference that those torture techniques were deployed on a regular basis at Guantanamo. The instructors at the Fort Bragg conference were psychologists, and the consultants who advised on how to use reverse-engineered SERE techniques at Guantanamo were psychologists. While Mitchell consulted at Guantanamo, other psychologists did so as well. Whereas Mitchell was an independent contractor, the other psychologists were military officers deployed at Guantanamo.

New detainees were routinely put in isolation for two to four weeks, a period that constitutes torture under international law. Psychologists were the primary professional consultants on the management of prisoners. In a Pentagon conference call with reporters, Dr. William Winkenwerder, assistant secretary of defense for health affairs, “made it clear that the Defense Department had come to rely more heavily on psychologists at Guantanamo than psychiatrists” (Risen, p. 195). Winkenwerder went on to say that the American Psychological Association supports the role of psychologists in interrogations in a way that the American Psychiatric Association does not.

The United Nations Commission on Human Rights (2006) found: (1) widespread abuses of detainees “amounting to torture” in violation of the Geneva Convention; and (2) the systematic breaching of professional ethics by health care professionals who, it concluded, have been “complicit in abusive treatment of detainees detrimental to their health” (p. 33). The UN Commission was unambiguous in its statement that health care professionals who use their expertise to assist in ways that may adversely affect the physical or mental health of detainees are violating professional ethics. Despite these findings, the APA stated definitively, without conducting even a minimal investigation, that no psychologists were involved. The president of the association at the time, Dr. Gerald Koocher (2006), wrote a scathing editorial condemning anyone who suggested anything to the contrary.

In what has to be one of the darkest moments of professional psychology, another psychologist ordered the torture of a child, Mohammed Jawad, who was 12 years old when incarcerated. Rather than testify before the second war crimes tribunal at Guantanamo, the psychologist took the Fifth Amendment. Jawad was imprisoned for allegedly throwing a grenade at US troops after his home was destroyed in a military battle. He was never charged, and after a six-year imprisonment he was released.

In brief, information from a variety of sources has documented the fact that psychologists played a primary role in the Bush Administration torture program (e.g., OIG, 2004; Bloche & Marks, 2005; McCoy, 2006; Sands, 2008; Miles, 2006). Nonetheless, the response from the APA was to deny that any psychologists were participants in torture until the evidence was incontrovertible, and even then to admit only grudgingly that “a few” psychologists participated. The APA has never issued an apology for its previous denials, nor for its denunciation of those who had correctly charged that psychologists were active participants. To this day the APA has never admitted the primary role psychologists played in the Bush administration torture program.

The fact that the Executive Summary of the committee report identifies only Mitchell and Jessen has actually been beneficial to the APA leadership, which points out that Mitchell and Jessen are not APA members and that it therefore cannot take any action against them. While this is true, Dr. Behnke has claimed that the APA has no connection whatsoever with Mitchell and Jessen. That statement is belied by the fact that James Mitchell was invited to “invitation only” meetings held by the APA, at times with other organizations. More importantly, as we have seen, Mitchell and Jessen were hardly the only psychologists to participate in the Bush administration torture program, and the spotlight on them has deflected attention from the fact that the APA has done nothing about the psychologists who are APA members and for whom the evidence of involvement is substantial in some cases and incontrovertible in others.

The APA has shown no concern about the APA members for whom there is evidence of participation. In fact, several ethics complaints were filed against the aforementioned Dr. Leso. The APA took approximately seven years and then announced that it would not only take no action on the complaints, it would not even bring them to the full Ethics Committee. The letter refusing to investigate the case, under the signature of Dr. Behnke, stated that he did not believe there was sufficient evidence to bring the matter before the full committee. Again, it must be emphasized that three bioethicists, conducting their own investigations, identified Dr. Leso as the “Dr. L.” who participated in the torture of Mohammed al-Qahtani. Neither Leso nor anyone else has ever denied the identification of “Dr. L.” as Leso.

And that brings us to the second major recent development: James Risen’s revelations that the APA compromised its integrity by giving the military control over the membership and direction of its key task force on psychologists and national security, the Psychological Ethics and National Security (PENS) task force. The task force’s report gave cover to psychologists who participated in the Bush administration torture program by concluding that it was ethical for psychologists to participate in the interrogation process in detention settings. Risen found emails from the computer of Scott Gerwehr, who was copied on the communications between the APA, the CIA, and the Pentagon, that showed the great extent to which the APA’s actions and policies on national security were a function of the wishes of the Pentagon and CIA.

Risen found that after the Abu Ghraib revelations, the APA convened secret meetings with the Pentagon and the CIA on how to cope with the public outcry over the photos of Americans happily torturing and humiliating detainees. Risen notes that the APA looked to the national security apparatus for guidance and consultation rather than to its own members. When the APA decided to convene the PENS task force, it appointed military employees to six of the nine voting spots, some of whom had participated in the application of torture techniques at Guantanamo. The makeup of the task force was a clear conflict of interest that has been known for many years. It was also known that although the purpose of the task force was supposed to be to study the ethical issues involved in psychologists’ work in national security settings, Russ Newman, head of the practice directorate and husband of a military psychologist at Guantanamo, in fact told the group that they had to “put out the fires of controversy.”

What has not been revealed until now is that emails from Geoffrey Mumford, head of the APA science directorate, to Kirk Hubbard, the chief behavioral scientist at the CIA, thanked Hubbard for influencing the PENS process and assured Hubbard that his views were “well represented by very carefully selected task force members” (Risen, p. 200). These emails are the smoking gun that shows why the PENS task force was so heavily weighted with those who had a clear conflict of interest. The APA had made sure that the Pentagon’s interests were served by what was supposed to be an impartial investigation of ethical issues. Instead, behavioral scientists from within the national security apparatus of the government played a large role in the composition and shaping of the PENS task force even before it was constituted. APA officials worked behind the scenes with Pentagon officials to determine the direction of the task force so that it would support psychologists’ involvement in the Bush administration “interrogation” program.

In conclusion, I wish to underline the fact that the APA’s behavior over the past ten years has been to protect the right of psychologists to work in illegal detention centers. In 2008, however, a referendum was passed by the membership of the APA that prohibits psychologists from working in such settings unless they are working for the specific benefit of the detainee or for an independent third party. That is official APA policy, but the organization has done precious little to implement it. With the revelations from Risen’s book we now have new clarity on the APA’s agenda: it has been working hand in glove with the Pentagon and the CIA while adopting a policy of benign neglect toward the wishes of its own members.

 

References

Bloche, G. and Marks, J. (2005). Doctors and Interrogators at Guantanamo Bay. The New England Journal of Medicine, 353(1), 6–8.

Koocher, G. (2006). President’s Column. APA Monitor, 37(2), 5.

McCoy, A. (2006). A Question of Torture: CIA Interrogation from the Cold War to the War on Terror. New York: Metropolitan.

Miles, S. (2006).  Oath Betrayed.  New York: Random House.

Office of Inspector General (2004). Report to the US Congress 2004. Washington, DC: US Government.

Risen, J. (2014). Pay Any Price: Greed, Power, and Endless War. New York: Houghton Mifflin Harcourt.

Sands, P. (2008). The Torture Team. New York: Palgrave Macmillan.

United Nations Commission on Human Rights (2006, February 15). The Situation of Detainees at Guantanamo Bay.

Zagorin, A. (2006, March 13). One Life Inside Gitmo. Time.

Zimbardo, P. (2006). Comments on the PENS Task Force Report.

Out of the Mouth (and Hands) of Babes


In 1977, the government of Nicaragua established the nation’s first school for the deaf. Twenty-five deaf children came to the capital, Managua, to enroll in the residential school. None of them could speak, and none of them had ever learned sign language.

The teachers didn’t know sign language either. Their attempts at teaching the children to read lips and to speak Spanish met with little success. Instead, the children invented their own sign language, which they used with each other as a secret code the adults couldn’t understand.

At first, the sign language was simple, with just a few hundred words and little grammar. But as each new cohort entered the school and learned the language from their peers, they increased the complexity of the grammar and added new words to the vocabulary. In about two decades—roughly the span of a human generation—this secret sign code developed into Nicaraguan Sign Language, a fully formed language that’s just as expressive as Spanish or English.

Girls using sign language

A similar process occurred in the case of Israeli Sign Language, which arose less than a century ago. Four generations of signers live in the Israeli deaf community today, totaling around ten thousand people. The oldest members of the community use a simpler form of the sign language, while younger members employ a much greater degree of grammatical and lexical complexity. By comparing the signing of younger and older members, we can see how the language has evolved across generations.

New spoken languages can likewise develop from scratch over the course of a generation. One such example is Hawaiian Pidgin. In the nineteenth century, English-speaking plantation owners brought in workers from East and Southeast Asia. To communicate, the workers developed a simple language known as a pidgin. Words were borrowed from English, Hawaiian, Japanese, and other languages.

The children of these workers developed their parents’ speech into a language they called “Pidgin.” Despite its name, Hawaiian Pidgin is in fact a creole, a fully formed language that has grown out of a pidgin. It’s still widely spoken in Hawaii today, and if you’d like to hear what it sounds like, click on this link to Kathy Collins’s open letter “Dear Prezadent Obama” in Maui Magazine.

In recent years, parents and teachers alike have bemoaned the common use of texting abbreviations by young people. The worry is that young people won’t learn how to spell properly if they get in the habit of writing CUL8R and W84M instead of “see you later” and “wait for me.” In fact, research shows that skill in the use of expressions such as these in text messages is positively correlated with reading and writing skills in the classroom. This is because an important part of being a skillful user of a language is knowing how to adjust style and formality to fit the situation.

All languages are in constant flux, as new words and constructions are introduced while old ones fade from collective memory. The elders lament how the younger generation is destroying the language. Yet they forget that, in their youth, they were also linguistic trendsetters, much to the chagrin of their parents.

Language innovation is never driven by the grownups, who are set in their ways and resistant to change. Rather, it’s the younger generation—still learning the language—that makes or reshapes it to fit their needs. Whether it’s creating a sign language from scratch or developing an efficient code for text messaging, it’s always the children who are the architects of language.

 

References

Aronoff, M., Meir, I., & Sandler, W. (2005). The paradox of sign language morphology. Language, 81, 301–344.

Meir, I., Sandler, W., Padden, C., & Aronoff, M. (2010). Emerging sign languages. In M. Marschark & P. E. Spencer (Eds.), Oxford Handbook of Deaf Studies, Language, and Education, Vol. 2. Oxford: Oxford University Press.

Plester, B., Wood, C., & Joshi, P. (2009). Exploring the relationship between children’s knowledge of text message abbreviations and school literacy outcomes. British Journal of Developmental Psychology, 27, 145–161.

Senghas, A., & Coppola, M. (2001). Children creating language: How Nicaraguan Sign Language acquired a spatial grammar. Psychological Science, 12, 323–328.

Senghas, R. J., & Monaghan, L. (2002). Signs of their times: Deaf communities and the culture of language. Annual Review of Anthropology, 31, 69–97.

 

David Ludden is the author of The Psychology of Language: An Integrated Approach (SAGE Publications).

The Wisdom of Science: Questions, Guesses, and Predictions


First you guess. Don't laugh, this is the most important step. Then you compute the consequences. Compare the consequences to experience. If it disagrees with experience, the guess is wrong. In that simple statement is the key to science. It doesn't matter how beautiful your guess is or how smart you are or what your name is. If it disagrees with experience, it's wrong. That's all there is to it.

– Richard Feynman (1918–1988), Nobel-laureate American physicist

As to science itself, it can only grow.

– Galileo Galilei, writing in 1632

What would the world look like if I rode on a beam of light?

Is the universe friendly?

– Albert Einstein (1879–1955), who created the theory of relativity to answer his first question.

What is Life?

– Title of a path-breaking book, published in 1944 by Nobel-laureate physicist Erwin Schrödinger, which helped spawn the field of molecular biology.

In the future, as in the past, the great ideas must be simplifying ideas.

– André Weil, French mathematician

Every sentence I utter should be regarded by you not as an assertion, but as a question.

– Niels Bohr (1885–1962), who began lectures with this warning.

The catastrophe of the atomic bombs which shook men out of cities and businesses and economic relations, shook them also out of their old-established habits of thought, and out of the lightly held beliefs and prejudices that came down to them from the past.

– H. G. Wells (1866–1946), British futurist, writing in 1914!

I believe that in about fifty years it will be possible to program computers to [imitate human beings] so well that an average interrogator will not have more than 70% chance of making the right identification [i.e., distinguishing between a human and a computer interlocutor] after five minutes of questioning. I believe that at the end of the century one will be able to speak of machines thinking without expecting to be contradicted.

– Alan Turing (1912–1954), English mathematician and computer pioneer, writing in 1950.

How is it that I am a collection of a hundred billion nerve cells, yet I think and act as one?

– Rodolfo Llinás (1934–   ), Colombian neuroscientist

Will we, who have the knowledge of many ways, leave our children free to choose among them?

– Margaret Mead (1901–1978), American anthropologist. From the closing lines of her 1928 classic Coming of Age in Samoa.

We shall, sooner or later, arrive at a mechanical equivalent of consciousness.

– Thomas Henry Huxley (1825–1895), English biologist

Some recent work ... leads me to expect that the element uranium may be turned into a new and important source of energy in the immediate future. Certain aspects of the situation ... seem to call for watchfulness and, if necessary, quick action on the part of the Administration. ... [I]t may become possible to set up a nuclear chain reaction in a large mass of uranium, by which vast amounts of power ... would be generated. ... This new phenomenon would also lead to the construction of ... extremely powerful bombs of a new type....

– Albert Einstein, from a letter sent to President Roosevelt in August 1939, drafted for him by Hungarian-born physicist Leo Szilard. It led to the Manhattan project, the World War II effort to construct an atomic bomb.

On July 16, 1945, I was out in the desert in New Mexico ... awaiting the tests of the first large-scale release of atomic energy. The site chosen had been named "Journey of Death" hundreds of years ago. Nine miles away, there was a tower about one hundred feet high. On top of that tower was a little shack. In that shack was a bomb. This particular morning it was set to go off. The announcer said, "Thirty seconds"—"ten seconds"—and we were lying there, very tense, in the early dawn, and there were just a few streaks of gold in the east. Those were the longest ten seconds I ever experienced. Suddenly, there was an enormous flash of light, the brightest light I have ever seen. It blasted; it pounced; it bored its way into you. It was a vision which was seen with more than the eye. It was seen to last forever. You would wish it would stop. Finally it was over, and we looked toward the place where the bomb had been; there was an enormous ball of fire which grew and grew and it rolled as it grew; it went up into the air, in yellow flashes and into scarlet and green. It looked menacing. It seemed to come towards one.

A new thing had just been born; a new control; a new understanding which man had acquired over nature. That was the scientific opening of the atomic age.

The most serious thing about the atomic bomb, even today, is its amazing cheapness. It costs less than one-tenth as much money to destroy a square mile by atomic bombing as it does by ordinary bombing. Even the poor nation can afford to be destructive.

– I. I. Rabi (1898–1988), American Nobel-laureate physicist and scientific statesman. He won the betting pool, held by scientists witnessing the first atomic test, for the most accurate prediction of the energy released by the bomb. His implied prediction regarding nuclear proliferation looks equally prescient.

People will build a rocket ship which can fly so fast and go up so high, that it then stays up all by itself and circles the Earth like a moon. From such a flying apparatus one can then oversee all parts of the Earth as on a map.

– A note made in 1946 by Kurt Gödel (1906–1978) on a conversation with Albert Einstein during one of their daily walks in Princeton. Many mathematicians view Gödel's results in mathematical logic much as physicists do Einstein's in physics: as unsurpassed in their fundamental importance. Newton had foreseen artificial satellites as a theoretical possibility about three centuries earlier; with the wartime development of rockets, satellites would soon be a reality. The childlike awe in this conversation between two of the most original scientific minds of all time is itself noteworthy.

I ask you to look both ways. For the road to a knowledge of the stars leads through the atom; and important knowledge of the atom has been reached through the stars.

– Arthur Stanley Eddington (1882-1944), English astronomer whose measurements provided early confirmation of Einstein's general theory of relativity. This is a prediction not of physics, but rather about the development of physics. As theories of the cosmos and the atom converge, it looks ever more prophetic.

The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted ... is exceedingly remote.

– A. A. Michelson (1852–1931), American physicist, writing in 1903. Surprisingly, there have been few generations in which such predictions haven't been made. Someone is always (erroneously) proclaiming the end of something: physics, science, history! The irony in this case is that it was precisely Michelson's experiments on light which set Einstein on the road to relativity.

If an elderly but distinguished scientist says that something is possible he is almost certainly right, but if he says that it is impossible he is very probably wrong.

– Arthur C. Clarke (1917–2008), English futurist, who originated the idea of communications satellites and authored 2001: A Space Odyssey

It is a curious situation that the sea, from which life first arose, should now be threatened by the activities of one form of that life. But the sea, though changed in a sinister way, will continue to exist; the threat is rather to life itself.

– Rachel Carson (1907–1964), American marine biologist and writer. Her writings, among them The Sea Around Us in 1951 and Silent Spring in 1962, played seminal roles in creating the environmental movement.

As long as there are sovereign nations possessing great power, war is inevitable.

All of these [research] endeavors are based on the belief that existence should have a completely harmonious structure. Today we have less ground than ever before for allowing ourselves to be forced away from this wonderful belief.

– Albert Einstein, pessimistic and optimistic

All history is the history of physics.

– Oswald Spengler (1880–1936), German author of Decline of the West. In this provocative assertion, Spengler is pointing out the link between technological innovation and political power. For example, prehistoric metallurgical discoveries ushered in the bronze and iron ages; the modern era has been dubbed the "atomic age" and the "age of information."

A butterfly's capricious flight may result in a tropical storm, not tomorrow, but one or two years down the road. This is why long-range weather forecasting is so difficult: everything, absolutely everything, must be taken into account. No perturbation can be deemed too small to have any influence.

– Ivar Ekeland (1944–   ), mathematician, explaining the difficulty of predicting certain behaviors of complex (chaotic) systems.

The machines will get good enough at dealing with complexity that they can start dealing with their own complexity, and you'll get systems that evolve.

We're building a machine that will be proud of us.

– Daniel Hillis (1956–   ), American computer scientist and co-founder of Thinking Machines Corporation

[Someday human intelligence] might be viewed as a historically interesting, albeit peripheral, special case of machine intelligence.

– Pierre Baldi (1967–    ), computer scientist

We hope to explain the entire universe in a single, simple formula that you can wear on your T-shirt.

– Leon Lederman, American Nobel-laureate physicist

From The Wisdom of Science

Twelve Reasons for Singing


Jingle…jingle…jingle. ’Tis the season…to sing. Just how many times have you sung “Jingle Bells” this season? Or does the tune spark memories from when you were a kid, bringing along that rush of holiday excitement? Are you teaching your own children, almost by osmosis?

I was interviewed recently about the psychological value of singing—and surprised even myself by all the benefits that occurred to me. So: instead of the 12 Days of Christmas, here are 12 Reasons for Singing:

1. No surprise, given my passion for the value of diaphragmatic breathing to mental health and optimal performance: the first benefit that occurs to me is that, to sing fully and completely, you need good breath support. It not only produces good sound; this kind of breathing also gives your mind and body the positive effects of more oxygen and a more complete exhalation of carbon dioxide.

2. Unless you confine yourself to singing in the shower, singing means community and interpersonal connection. Choruses come together to work on a program; they have a sense of purpose that is larger than any one individual. Each different voice type and timbre is needed—the whole is greater than the sum of the parts. The sense of community is also strengthened through the rhythm of chatting with each other, catching up during break times. Even with a different purpose, congregational choirs share these dynamics as well. When people gather together, even once a year around a piano or in full-fledged Messiah Sing-a-long mode, voices are raised together.

3. When you’re singing, it’s almost impossible to think about other things—the challenges of daily life, for example. You’re focused specifically on what you are doing. Being fully in the present moment allows us welcome distraction from other thoughts, issues, or burdens. Singing brings respite and refocus.

4. Singing offers the opportunity for mastery, for a sense of learning, growing, and accomplishing. Even if it seems painfully gradual at times, practice gets you closer to perfect. You become more aware of what’s involved in skill development, perhaps in an aspect of life different from your everyday professional life.

One aspect of this developmental knowledge is that of being able to read music. A favorite bumper sticker of mine says it all: If you can read this, thank your music teacher. Sight reading is a wonderful skill, and it improves with practice.

5. Moving more into the cognitive aspects of singing: memorization uses different aspects of our brains. It’s something we’re accustomed to doing as children; as adults, we usually find comfort in clutching our scores while staring down at those black splotches on the paper. Memorized music, though, demands that we pay attention to what we’re singing in a different way. In turn, we connect with our audience in a more intimate and direct way: there’s no black folder between the singers and the listeners.

6. Singing is really a very complex process, involving various areas of our brain: there is the linguistic aspect, which is different from vocal production, which in turn is different from the sense of musical line. Add different languages and you get yet more synapses firing, more brain elements in full use.

7. There is the emotional aspect as well: What does the music say? How does the singer convey it? What does it evoke in our memories, whether about the music itself or some totally unrelated memory? Does the music bring us to tears? To joy? To laughter? Accessing our feelings through music is especially powerful at this time of year.

8. Song is a form of communication. A lullaby soothes an infant; it communicates “you matter.”

9. Vocal music elaborates on the experience of meaning and symbolism that we encounter with poetry.

10. Since this is a blog about performing, there is also the performance aspect itself. The process of rehearsal is both critically important and valuable in itself—but is different from performance. Performing isn’t “just” about singing; it’s also about singing to others. How does the singer prepare for that interaction? What are the mental messages and self-judgments that the singer makes, both in preparation and during the performance?

11. For some people, the opportunity to be judged adds a special sparkle to performing. The popularity of various kinds of choral competitions, involving formal judging, was the basis for the TV series Glee, which further popularized competitive choral singing.

12. Finally: this blog has often noted the similarities between different types of performers, especially athletes and performing artists. One of the central differences, though, has to do with audience. I would argue that even though audiences are very much part of the athletic endeavor, an audience isn’t necessary to the performance of the sport itself. With the performing arts, however, performance in a way doesn’t exist without the presence of an audience. Perhaps this blog is relevant to you as an audience member rather than a singer. Your experience and your presence are vital and central to the singer’s life as well. This is the broader sense of community; the interaction between performer and audience creates its own dynamic.

But don’t take just my word for it. Sir Paul McCartney was asked recently whether he gets tired of singing the Beatles standards during performances. His response: “I rediscover [‘Let It Be’] for this audience. Seeing the audience reaction freshens it up every time. [It’s as if] I’m looking back at this twenty-year-old kid, rediscovering the words as I sing them again, finding new meaning, remembering when I wrote it, when we recorded it.”

And don’t “just” take Sir Paul’s word for it, either. I sing with an auditioned chorus that, annually in December, goes to a number of local seniors’ residences to “donate” a mini-concert of holiday music, CarolShare. Our audience is truly appreciative, even if sometimes impaired enough that we may not experience direct feedback. But this year, just after one mini-concert, a resident gave our director a sketch he’d made of the chorus as we were singing. Another time, the recreational coordinator thanked our point person and commented that one of the residents had come up to her and said, “Boy, are they good singers.” The point person smiled politely, thinking that while the comment was nice, it wasn’t exactly a profound statement. The recreational coordinator said, “I’m not sure you understand. The woman has Alzheimer’s; she hasn’t spoken for four months.”

Sing and be merry!

As always, if you would like further information or wish to be in contact with me, you can do so at http://www.theperformingedge

 


Should You Share Your Toxic Thoughts With Your Partner?


Sharing your toxic thoughts from time to time can be very illuminating to both you and your intimate partner. Opening up on this level can also bring you closer together. For a list of the common, relationship-destroying toxic thoughts, see my recent post, Nine Toxic Thoughts That Can Destroy Your Relationship.

By sharing your toxic thoughts, not only can you better understand yourself, but your partner can also better understand you. Taking this emotional risk frequently builds intimacy.

One caution to keep in mind: When preparing to share toxic thoughts with your partner, you must proceed with sensitivity. Stress how much you desire to get closer as a couple. Express that you are doing this to build trust and deepen your love by working through your hidden frustrations and resentments. Explain that coming up with alternative, more positive ways of viewing your partner is important for your own emotional health.

When you do disclose your toxic thoughts, first stress your partner's positive qualities. Be specific, and ask for the "green light" from your partner to discuss your toxic thoughts. Don't say, "You're a good guy, but I really get upset with you and here's why." Rather, say something like, "I value how many hours you put in and how you help out with the kids, and I also appreciate how gentle and kind you are with me. Yet it bothers me that I still find myself losing track of what you really do because I get focused on what you 'should' know when I need space. Sometimes I feel smothered, but a lot of this may be my stuff. I'd like to share with you some of what goes on in my head so that you can better understand what makes me tick, and what I'm working on so that I can think about you in healthier ways. Are you okay with me talking to you about this?"

Jasmine, as another example of courage in the emotional intimacy realm, found that sharing her thoughts with her fiancé Randy allowed her to get past a huge toxic thinking hurdle. She found herself stuck in the All or Nothing trap and doing some serious Label Slinging: "I can never give him enough. He's just a needy sponge that wants to suck me dry."

When Jasmine shared her inner angst with Randy, she was deeply encouraged by the positive results. She told me about this during a counseling session: "I was totally surprised, pleasantly surprised. Randy was incredible. He started to get defensive, but I made it clear that I was not putting him on trial. Then he really thanked me for being honest with him. He said he could sometimes sense more attention from me when he was affectionate. He was silently frustrated and having some of his own toxic thoughts. He was actually saying to himself, 'She never appreciates anything I do.'" Jasmine elaborated, "So when I shared my toxic thoughts, he told me his, and we came away from the conversation feeling excited, because I don't think a lot of couples really share on this level."

Some of my clients prefer to privately work through toxic thoughts in their own minds. Do what works best for you. The goal is not to constantly tell your partner about every thought going through your head. You certainly don't want to call your partner while she is in the middle of a meeting and say, "Honey, I want to tell you that I'm having toxic thoughts about you right now, but I think I'm working through them! It's okay; I just wanted you to know, and now you can get back to work." Again, be sensitive to your partner, and he or she will be much more likely to accept what you're saying as you strive toward truly getting to know each other and deepening your love.

 

Dr. Jeffrey Bernstein is a psychologist with over 23 years of experience specializing in child, adolescent, couples, and family therapy. He holds a Ph.D. in Counseling Psychology from the State University of New York at Albany and completed his post-doctoral internship at the University of Pennsylvania Counseling Center. He has appeared on the Today Show, Court TV as an expert advisor, CBS Eyewitness News Philadelphia, 10! Philadelphia—NBC, and public radio. Dr. Bernstein has authored four books, including the highly popular 10 Days to a Less Defiant Child (Perseus Books, 2006), 10 Days to a Less Distracted Child (Perseus, 2007), Why Can't You Read My Mind?, and Liking the Child You Love (Perseus, 2009).

 

Image credit: Pixabay, public domain

 

Better Than Happiness


It may sound strange, coming from someone who has written dozens of blog posts about happiness and taught many seminars on the subject, to say that happiness isn't necessarily all that it's cracked up to be. Or, put another way: in terms of one's overall quality of life, spirit, and degree of personal fulfillment, some things play a much more significant role than feelings of happiness. I'll get to that in a minute.

In my freshman year of college I read a book that changed my life. It was at the time the most important book I had ever read, and it remains so to this day. It's entitled Man's Search for Meaning, and it was written in 1946 by the Viennese psychiatrist and neurologist Viktor Frankl.

Frankl wrote it shortly after being liberated from a concentration camp in which he had been imprisoned for several years, and shortly after receiving news that the Nazis had executed his entire family, including his wife, who was pregnant with their first child, his brother, and both of his parents, as well as many other relatives.

What Frankl personally witnessed and experienced during his incarceration led him to a conclusion that to this day stands as one of the most succinct and profound statements ever written about the human condition: that "everything can be taken from a [person] but one thing: the last of the human freedoms—to choose one's attitude in any given set of circumstances." The circumstances in which Frankl lived during the war years were beyond horrific. His writings were not simply expressions of a theory, but were grounded in his daily self-reflection on his own experience and his observation of countless other inmates and how they did or did not manage to survive unspeakable conditions.

Frankl discovered that the primary variable influencing whether his fellow prisoners survived or perished was the degree to which they identified with a purpose larger than themselves, particularly one in which they saw themselves as contributing in some meaningful way to the quality of others' lives. He claimed that those prisoners who suffered the physical and mental cruelties of the camps and managed to survive also tended to be the ones who sought and found the wherewithal to share the little they had with others: a comforting word, a crust of bread, an act of simple kindness. Giving to others was of course not a guarantee of survival, but it was a way of sustaining a sense of purpose and meaning in the face of overwhelmingly brutal conditions. Without purpose or meaning, our life spirit diminishes and we become more vulnerable to physical and mental stressors.

While it’s natural to prefer happiness to suffering, Frankl recognized the paradox that a sense of purpose and meaning often is born out of adversity and pain, and he understood the potentially redemptive value in suffering. The recognition that there can be some good that comes out of our most painful experiences can be the central factor in the process of transforming suffering into purpose.

In the January 2013 issue of the Atlantic Monthly, in her article entitled "There's More to Life than Being Happy," Emily Esfahani Smith writes, "Research has shown that having meaning and purpose in life increases overall well-being and life satisfaction, improves mental and physical health, enhances resiliency and self-esteem, and decreases the chances of depression." She goes on to state that according to recent research, "the single-minded pursuit of happiness is ironically leaving people less happy."

Happiness is usually associated, or confused, with pleasure, which has to do with experiencing enjoyable feelings and sensations. We feel happy when a need or desire is fulfilled, when we get what we want. The researcher Kathleen Vohs claims that "Happy people get a lot of joy from receiving benefits from others, while people leading meaningful lives get a lot of joy from giving to others."

A 2011 study concluded that people who have meaning in their lives, in the form of a clearly defined purpose, rate their life satisfaction higher, even when they are feeling bad, than those without a sense of purpose.

Several years prior to writing his groundbreaking book, Viktor Frankl was already living from a deep sense of purpose that at times required him to forgo personal desires in favor of his commitment to other, purpose-driven intentions. By 1941, Austria had been occupied by the Germans for three years. Frankl knew it was just a matter of time before his parents would be taken away. He was already distinguished internationally for his contributions to the field of psychology, and he had applied for and been given a visa to America, where he and his wife would be safe from the Nazis. But as it became evident that his parents would inevitably be sent to a concentration camp, he recognized that he had to choose between giving up his visa in order to help his parents make the painful and difficult adjustment to the camps, and going to America to save himself and his wife and further pursue his career. After considerable deliberation, he understood that his deepest purpose lay in his loyalty and responsibility to his aging parents. He made the decision to put aside his individual pursuits, stay in Vienna, and dedicate his life to being in service to his parents and, later, to other inmates in the camps.

Frankl's experiences during this time formed the basis of his theoretical and clinical work, which has since profoundly impacted the quality of life of millions of people worldwide.

Viktor Frankl died in 1997 at the age of 92. He spent his post-war years continuing to embody his commitment to serve through his teaching, his writings, and many other forms of contribution to the welfare of humanity. His life served as a stunning example of one man's extraordinary capacity to find and create meaning in a life that was at times characterized by indescribable physical and emotional suffering. He was living proof of the claim that we all have the power to choose our attitude in any given set of circumstances, and that the choice we make is the determining factor in the quality of our life. While there may be times when the ability to choose to feel happy doesn't seem available to us, there is never a time in which we lack the ability to choose our attitude. Frankl's life, more so than his written words, affirms that we all possess the power to make and act on this choice. It was, beyond any shadow of a doubt, a life well-lived.

Cognitive Deficit in Bipolar Disorder


One of the more overlooked aspects of bipolar disorder is the potential for developing a degree of cognitive deficit as part of the illness. This omission reflects the reality that mainstream print media's portrayal of bipolar disorder mostly focuses upon the cycling of elevated and/or depressed moods, which are the hallmark features of the disorder.

What we typically read are descriptions of mood elevation that reflect symptoms of high energy, lessened need for sleep, feelings of euphoria, grandiosity, impulsivity, elevated libido, etc. Similarly, on the depressed end of the mood spectrum, we read descriptions of low energy, low self-esteem, feelings of sadness, loss or emptiness, suicidal ideation, pervasive pessimism, low motivation, and all the other experiences we associate with feeling depressed. Mood typically receives the bulk of our attention when it comes to descriptions and discussion of bipolar disorder; however, in my sessions with individuals living with the disorder, it’s common to hear concerns about their lessened cognitive capacities. To be more specific, I’m referring to the experience of decreased cognitive capacity relative to the period of time before any sustained bipolar mood symptoms arrived on the scene.

Examples of the kinds of deficits reported are difficulties with linguistic working memory (word retrieval), difficulties with planning, prioritizing and organizing of behavior (executive functioning), problems with retention of what’s been read or listened to, as well as the experience of mildly dulled or slowed thought processes. For some with bipolar disorder, it’s like they’ve experienced a gradual decline of brain power from their previous baseline level of function.

Before I scare too many readers, note that the key word in the preceding sentence is "some." The research literature poses a wide range of figures pertaining to cognitive deficit in bipolar disorder, with studies showing incidence rates between 15% on the low end and 60% on the high end. Granted, this broad a range doesn't tell us much. Research samples that vary widely in relation to subjects' age, symptom acuity, presence of comorbidity, and differences in prior treatment backgrounds do yield different findings.

A key conclusion supported by numerous research articles is there appears to be a positive correlation between the presence of cognitive deficit and higher acuity bipolar symptoms. This means that those with histories of more acute bipolar mood symptoms are more likely to experience aspects of cognitive deficit. There are also important findings that point to the reality that individuals whose symptoms have been well-managed over the years will be less likely to experience cognitive impairment. Those who have experienced a more difficult course of their disorder due to treatment resistant symptoms, treatment non-compliance, and/or unhealthy lifestyle choices suffer more cognitive impairments.

A salient question is whether the manifestations of cognitive deficit symptoms are mood-phase-specific, or whether they represent some degree of impairment that persists independent of cycles of mania, hypomania and depression.

Most would agree that cognition is adversely impacted when one is acutely depressed. When acutely depressed, individuals often find that the alacrity and sharpness of their cognition feel like they've been dialed down a few notches. Recall of written or spoken words can also become compromised. Consider the depressed student who's trying to complete a reading assignment the night before class. He reaches the end of the chapter and realizes he is unable to recall most of what he's just read over the last 10 to 15 pages. The same can apply to retention of material conveyed during a class lecture. The student truly attempts to track what's being said, but the material just doesn't stick.

Hypomania and mania also generate a broad range of cognitive alterations. Racing thought is a common experience during mood elevation, and the consequence of accelerated thought can again manifest as faulty memory and impaired focus. The individual’s thought content progresses so rapidly that it becomes difficult to hold onto specific thoughts or to maintain clear awareness of what he thought only a few minutes ago.

In addition to racing thoughts, an unusually large volume of thoughts can flood a person’s awareness during hypomanic/manic states. Too much happening concurrently in one’s consciousness makes it difficult to select or to prioritize effective responses. The hypomanic/manic individual may find that everything feels important, while concurrently new and even more important thoughts keep emerging. The experience is that of excessive mental activity and the consequence becomes manifest through behavioral responses that are poorly planned, prioritized and executed.

A different dilemma that sometimes comes along with mood elevation is the experience of becoming too focused. An example would be the individual who becomes locked on to an idea, a plan, or a project and continues with sustained focus far beyond what would likely occur in mid-range or even depressed mood. This sometimes yields an amazing burst of sustained focus and productivity in a short span of time. The problem is that the experience of being hyper-focused, or the loss of cognitive flexibility and adaptability, can also result in the individual’s failure to attend to important matters that really need attention.

There are multiple other examples of mood’s impact upon cognition, but at this point it should be clear that the polarities of mood elevation and depression have adverse impact upon memory, focus, thinking and planning. This should come as no surprise. In fact, it would be more surprising if mood intensity had little to no bearing upon cognition.

There seems to be a fairly broad consensus in the research literature that for some with bipolar disorder, the presence of cognitive deficit is not just a reflection of mood intensity, but an enduring element of the illness itself. The specific cognitive difficulties that present for an individual can be present during mid-range mood or even during sustained periods of remission. This is where the discussion potentially evokes anxiety for those with the disorder. I recall a young adult patient recently saying, "You mean, in addition to all of my mood craziness, I now have to worry about gradual loss of cognitive capacity?" My best answer at this point is: maybe.

My reading of the research, coupled with years of bipolar treatment experience, leads me to the conservative guesstimate that about one-third of those diagnosed with bipolar disorder will likely struggle with some decline in memory, attention, executive functioning and retention of language-based information. A third may seem high, especially if you fear you may be headed there. I encourage readers not to feel fatalistic, as the remaining two-thirds, who likely won't struggle, is also a high figure. The key reminder here is that those with a history of more acute instability are more likely to encounter some enduring cognitive difficulties, whereas those on the lower end of the acuity continuum are less likely to struggle with sustained deficits. And with all of this, there is no guarantee either way. No doubt we will find examples of individuals with bipolar disorder whose experience is inconsistent with the trends being addressed in this blog.

Let's shift now to some of the more pragmatic implications of what I've been saying.

First, how do you know if you have any enduring cognitive deficit? The key here entails determining whether any of your difficulties with memory, language recall, attention and concentration, and/or executive functioning (planning, organization and prioritization) are present during mid-range mood (when you're not up or down) and/or during a sustained period of partial remission (mood state has remained fairly stable). If neither is the case, that is, if your cognitive difficulties are present only during periods of mood intensity and then resolve once you're back to baseline, then it's safe to assume that your current status reflects cognitive issues that are mostly mood-phase-specific. This is normal for most who live with bipolar disorder.

It's also necessary to rule out the presence of neurologically based diagnoses such as Attention Deficit Disorder. If you have bipolar disorder and you're unsure about the presence of ADD, I suggest you see a professional who is knowledgeable about the overlap of these two entities. One of my previous blog posts, "Misdiagnosis of Bipolar Disorder" (February 2013), also speaks to the diagnostic distinctions between attention deficit and bipolar disorder.

If you already know that you carry both diagnoses of ADD and bipolar disorder, then you're faced with the complex task of figuring out which deficits come from which disorder, as well as what degree of overlap may exist between the two. Frankly, these are tough differential diagnostic calls to make, and doing so would require consultation with a neuropsychologist who is expert at assessing both. The good news here is that if you already know you have ADD, then you've already lived a life in which you've had to adapt to some aspects of cognitive deficit. The cognitive deficits stemming from bipolar disorder will not present you with an entirely new set of challenges beyond what you're already used to living with and adapting to.

The next issue to consider is whether any symptoms of cognitive deficit may be related to the medications you are prescribed. This too is difficult to sort out, as different people react to medications differently. Many who take one of the atypical antipsychotic medications experience some cognitive dulling from the medication. But if your use of an antipsychotic was episode-specific (prescribed during mania and discontinued once stabilization was achieved), or has been continued only on an as-needed basis, then you'll be less prone to enduring adverse effects of the medication. Conversely, if you've been taking an antipsychotic on a daily basis over extended periods of time, the risks of enduring cognitive deficit are higher. That said, I also want to strongly caution readers that taking antipsychotic medication on a daily basis does not mean that cognitive deficit symptoms are inevitable. The amount and frequency of one's dosing are important factors, as is one's susceptibility to medication side effects. Ultimately, these matters should be raised and explored with your prescribing psychiatrist.

The same issues are applicable to the use of lithium as well as most of the other commonly used mood stabilizers (anti-seizure medications). Lamictal (lamotrigine) tends to be an outlier, as it has a fairly low side-effect profile; but that's not to say it comes without any cognitive impact. It's more that, relative to the atypical antipsychotics as well as the other mood stabilizers typically used for bipolar disorder, its impact on cognitive functioning tends toward the lower end of the side-effect continuum.

Determining whether your medications may be responsible for changes in your cognition should begin with an in-depth discussion of the issues with your prescribing physician. If he or she does not know the material in sufficient depth, it would be worthwhile to get a second opinion, particularly from a psychiatric professional who specializes in the treatment of bipolar disorder.

What if you’re thinking that all the distinctions I’m referring to still seem fuzzy, and even after psychiatric consultation, you remain uncertain as to whether you suffer from bipolar-related cognitive deficit? I’d recommend you meet with a neuropsychologist who has a good grasp on the neurocognitive symptom profile associated with bipolar disorder. Undergoing a thorough neuropsychological assessment may help you to concretely identify whether you do have any enduring areas of deficit related to your bipolar disorder.

Another consideration in this discussion entails where you are with the course of your disorder. If you’re a young adult with relatively recent onset of symptoms (last few years), I imagine you may find this blog post to be concerning. That can be a good thing if it further promotes your resolve to make healthy lifestyle choices that can mitigate the destabilizing influences of your bipolar illness. Consistent sleep (7½ to 9 hrs./night), a stable sleep schedule, reliance upon a consistent daily schedule, consistent exercise, healthy diet and abstaining from psychoactive substances are all key elements which, if given sufficient priority, can make a positive difference in your capacity to manage your bipolar symptoms. The crucial implication here is that the sooner you can be successful with healthy lifestyle management the better are your chances of having a positive stabilizing impact upon your disorder.

Let's move beyond assessment and prophylaxis and suppose you're sure that bipolar disorder has left you with areas of cognitive deficit consistent with what's been discussed in this blog post. What are your options?

Unfortunately, I don't have any "fix-it" responses. Deficits brought on by abnormal brain activity (mania, acute depression, rapid cycling, etc.) are similar to mild brain injuries. They don't just self-correct. Instead, the brain learns to adapt and compensate such that the injury is no longer evident through functional impairment. But when the brain dysfunction occurs repeatedly over time, the extent of damage may not be adequately ameliorated through adaptation and compensation.

This is where acceptance becomes crucial. If you’re faced with some degree of limitation that’s not readily changeable, then you do what you can to accept what is. I know this sounds trite as well as much easier said than done. But the truth is there are some aspects of decline that we really do have to figure out how to live with and accept – all of us, bipolar or not. It certainly is the case with aging – we don’t have a lot of choice.

Is this different for the thirty-something individual with bipolar disorder who recognizes cognitive decline from the point when he or she first entered college? Yes and no.

The “no” entails the reality that the bipolar individual at age 33 may still be struggling with issues of acceptance in relation to his/her disorder, whereas the older individual who is primarily wrestling with age appropriate decline has had more experience with acceptance and adaptation.

Most of us typically do get better with acceptance and adaptation as we age. If we don’t, life gets a lot harder. And with regard to the “yes” – the bipolar individual has already had to accept and adapt to many things he or she probably didn’t anticipate before being diagnosed. The process of acceptance and adaptation has already begun earlier in the lifecycle than is the case for the majority of the population.

If there's any good news in what I'm discussing, it's the extent to which our technology-oriented culture is increasingly focused upon personal "apps" that help us manage life's complexities. Forgetfulness or planning difficulties can be lessened by the use of good scheduling apps, to-do apps and even more sophisticated project management apps. When writing and struggling to find the elusive but perfectly fitting word, you can let a good thesaurus app become your friend. If you're finding you don't always grasp the verbal content from classes or meetings, there are excellent unobtrusive, user-friendly digital recorders that can serve as your backup when your mind is drawing a blank. We're even seeing an increasing presence of apps that assist with self-monitoring of bipolar mood, energy, activity, sleep cycles, and medication use, as sketched below. And they're getting better each year.
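
For the technically curious, the core of such self-monitoring tools is simple: a dated log of mood, sleep, and medication entries that can later be reviewed on your own or with a clinician. Here is a minimal sketch in Python; the file name, rating scale, and fields are my own illustrative assumptions, not any particular app's design.

    import csv
    import datetime
    import os

    LOG_PATH = "mood_log.csv"  # hypothetical file name
    FIELDS = ["date", "mood_1_to_10", "hours_slept", "took_meds", "notes"]

    def log_entry(mood, hours_slept, took_meds, notes=""):
        """Append one day's self-monitoring entry to the CSV log."""
        is_new_file = not os.path.exists(LOG_PATH)
        with open(LOG_PATH, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if is_new_file:
                writer.writeheader()  # write the column names once
            writer.writerow({
                "date": datetime.date.today().isoformat(),
                "mood_1_to_10": mood,
                "hours_slept": hours_slept,
                "took_meds": took_meds,
                "notes": notes,
            })

    # Example: a stable day with a full night's sleep.
    log_entry(mood=6, hours_slept=8.0, took_meds=True, notes="steady day")

The value isn't in the code itself but in the habit it supports: a consistent daily record makes gradual changes in mood, sleep, and cognition visible long before memory alone would catch them.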

Now, do you really want to have to rely on technology to compensate for internal deficits? Of course not; you’d rather be on top of things. But that doesn’t mean it’s not a good strategy when “being on top” seems out of reach.

Sometimes the process of accepting decline may even necessitate significant life changes, such as shifting employment roles or altering long-term career goals that require a higher level of functioning than an individual's current capacities allow. If one doesn't become a physician, a lawyer, a systems engineer or an air-traffic controller, it doesn't mean there aren't other viable options that can provide a high degree of satisfaction. Even if one needs to step away from his or her high-level employment position and shift toward something more manageable, it is doable. The difficult part of this kind of downshift is being able to make the adjustment while not perceiving it as failure. Cognitive deficit stemming from bipolar disorder is no more your fault than impairment brought about by brain trauma. You don't want it, you didn't choose it and you can't make it go away. That said, when a shift in your life's activities represents a healthy adaptive choice, the new endeavor can still be an integral component of an overall picture of satisfaction and fulfillment.

Last, I encourage you not to lose sight of the fact that the science of bipolar treatment is ongoing and evolving. Research is currently being conducted on cognitive remediation approaches for bipolar disorder. Newer intervention strategies are always in the research pipeline. And even in the absence of dazzling research outcomes, there's the old adage that tells us "we grow wiser as we grow older." It's true. Maturation across the lifespan provides us with more potential for growth and healthy adaptation than most would ever imagine.

- - - - - - - - - - - - - - - - - - - - - - - -

Russ Federman, Ph.D., ABPP is in private practice in Charlottesville, VA (www.RussFederman.com). He is co-author of Facing Bipolar: The Young Adult's Guide to Dealing with Bipolar Disorder (New Harbinger Publications). www.BipolarYoungAdult.com

Hack-o-Panic: Not Just for Sony Execs Anymore


Image: Kim Jong Un visits a video media company
While North Korea denies masterminding the embarrassing attack on Sony Pictures, there's no doubt that the movie company has sustained colossal damage to its finances and reputation. What may be less obvious is the psychological impact on people who have been the victims of hackers.

We know that cyberbullying has led to many suicides, often of vulnerable teenagers who were lured into sending explicit photos of their bodies. What we see in the Sony exposé are revealing glimpses into the minds of Hollywood bigwigs. These snapshots are not very attractive.

There's no need to repeat the juicier leaked emails here because they've been widely circulated online and even in the mainstream media. There's also good legal reason not to rehash them since Sony Pictures, in a desperate move, has been threatening legal action against media outlets that continue to use what is clearly stolen information.

In an interesting blog post (http://psyopregiment.blogspot.ca/2011/06/psychological-impact-of-...), retired senior US Army officer Lawrence Dietz likens the aftermath of being hacked to having your home burgled: "There is an irrational and emotional sense of personal violation even though the crime itself is non-violent and material possessions can always be replaced."

In fact, Dietz understates the case: while your stolen TV set can be repurchased with insurance money, how do you un-say something inappropriate that you put in an email?

There used to be an "unsend" function in standalone corporate email systems like IBM's PROFS. It allowed you to retract an ill-considered email if the recipient had not yet read it. And there's a little-known feature in Microsoft Outlook that, in some circumstances, still allows you to recall a message and keep someone from reading it. However, it's best to assume that once you fire off an email, it's irrevocably on its way to the recipient, and, as we are starting to realize, perhaps to the front page of the newspaper as well.

As I explain in Technocreep (www.technocreep.com), the reigning Miss Teen USA, Cassidy Wolf, was the victim of a digital intruder who installed the Blackshades creepware on her computer. At will, he could turn on the camera on her laptop computer, which she often kept in her bedroom. Wolf did the right thing, and told her mother, who went to the police. The perpetrator was convicted and jailed.

Wolf described the terror she felt as a result of this invasion of her private space in a newspaper article (http://www.dailymail.co.uk/news/article-2638874/More-90-people-nabbed-creepware-hacker-sting-victim-Miss-Teen-USA-describes-terror-watched-webcam-YEAR.html), saying: "You would never think somebody would be watching you in your room and this guy had been. The thought of that just gave me nightmares."

Teen beauty queens and Sony executives are not the first to have regrets over their digital footprints. President Ronald Reagan and Lt. Col. Oliver North undoubtedly regretted some of the things they said in emails, as did Bill Gates. A January 5, 1996 electronic memo from him was introduced as damning evidence in Microsoft’s antitrust trial.

In Technocreep I quote a drunken rant from then-Harvard student Mark Zuckerberg (which was introduced in a court case and hence is available). In it, he shows great disrespect for his future "users," writing, "I almost want to put some of these faces next to pictures of farm animals and have people vote on which is more attractive."

I think it's fair to say that every single one of us has things in our email archive that we'd regret getting out to the public. While it takes some effort, it might be wise to delete things that you no longer need, or at least move them to an archive file which you keep on a USB stick in a locked drawer. You may need that Human Resources file from 2004 someday, but you're a lot safer if it's not spinning online. Of course, it probably is archived at some government agency, IT backup, etc. Still, why make it easier for the bad guys to find your dirt? And we all have digital dirt. A rough sketch of what that kind of housekeeping can look like follows below.
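
For readers comfortable with a little scripting, here is a minimal sketch of the archive-and-delete idea, assuming your mail lives in a local mbox-format file (the paths and the ten-year cutoff are my own illustrative choices, not a recommendation for any particular mail system). It copies old messages to an archive file you could then move to that USB stick, and removes them from the live mailbox.

    import mailbox
    from datetime import datetime, timedelta, timezone
    from email.utils import parsedate_to_datetime

    LIVE = "inbox.mbox"        # hypothetical live mailbox
    ARCHIVE = "old_mail.mbox"  # hypothetical offline archive
    CUTOFF = datetime.now(timezone.utc) - timedelta(days=10 * 365)

    live = mailbox.mbox(LIVE)
    archive = mailbox.mbox(ARCHIVE)

    live.lock()
    try:
        for key in list(live.keys()):
            msg = live[key]
            try:
                sent = parsedate_to_datetime(msg["Date"])
            except (TypeError, ValueError):
                continue  # missing or unparseable date: leave the message alone
            if sent.tzinfo is None:
                sent = sent.replace(tzinfo=timezone.utc)  # assume UTC if untagged
            if sent < CUTOFF:
                archive.add(msg)   # copy to the offline archive...
                live.remove(key)   # ...then drop it from the live mailbox
        archive.flush()
        live.flush()
    finally:
        live.unlock()

As noted above, this only shrinks your online footprint; copies may well survive in backups and on servers you don't control, so treat it as risk reduction, not erasure.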

In a bizarre move, a prominent German politician is advocating a move back to manual typewriters to thwart electronic snoopery: http://www.hotforsecurity.com/blog/are-typewriters-really-the-way-to-stop-cyberspying-germany-seems-to-think-so-9580.html

Switching to manual typewriters would simply raise other types of security concerns: Who has access to the used ribbons? What about grabbing information from the typewriter roller?

Also, having a single copy of something is pretty rare nowadays, and the minute those carefully typed pages are run through a modern photocopier they become electronic. In fact, malefactors have even stolen information from the memory banks of copying machines: http://www.thestar.com/news/gta/2010/03/18/hightech_copy_machines...

Put that photocopier on a network, or the Internet, and, well, you're back to square one, except with spelling errors, because who can really type flawlessly on a manual machine?

Advice to Hollywood high rollers, and others: A much better plan is to simply keep your nasty thoughts to yourself, or express them face to face over lunch at Sardi's. Just make sure that everybody's cellphone is turned off, because, again as explained in Technocreep, there are plenty of ways for the bad guys to use those things to eavesdrop on and record your conversations.

My Fear of Abandonment


After my mother passed away in 2002, I adopted a two-year-old cat from a shelter. I wanted an older cat; I felt bad for them because they never got adopted. Everyone wanted kittens. I named her Zoe. After two years I thought she might like some company, so I returned to the same shelter and adopted another two-year-old cat. I named her Lucy.

They didn't always get along as I had hoped. Zoe weighed about thirteen pounds and Lucy weighed about eight at her heaviest. Sometimes I'd hear the most God-awful screeching, and then I'd see Zoe almost riding Lucy as this strange arrangement of cats dashed into the living room. I'd have to get the squirt bottle and aim the stream of water at both of them to break it up.

One day in 2010, Lucy crawled into my closet and nestled in among my shoes. She refused to come out to eat, drink or use the litter box. She had always been the first one to jump on my lap when I sat down on the couch or to jump on the bed and get under the covers with me when I crawled into bed at night. I knew something was terribly wrong. The vet felt a mass in her belly and tests revealed that she had pancreatic cancer, the same illness that my mother died from. I put her to sleep the next day. She was suffering beyond what was humane.

I never got another cat. Zoe and I have been alone these past four years. She likes being queen of the roost and having all the attention. For the first year after Lucy was gone, Zoe occasionally wandered into the closet and sat down. I think she was looking for her sister.

Zoe is fourteen now, and I know that cats can live a long time past that. But Zoe and I have gotten closer since my suicide attempt and subsequent hospitalizations this past winter. When I came home from the hospital the second time (I was gone a week and had a cat sitter come to my home twice a day to feed her and clean the litter box), she expressed her anger at me by biting my calves — hard — below my yoga pants as I lay down to go to sleep the first night.

Now we have a bedtime ritual: as soon as I start turning off the lights in the living room, she runs into the bedroom, jumps up on the bed and waits for me to get in. We lie down together, her body pressed up against mine. As I drift off, she gets up and goes to sleep in her own bed. Yet invariably, when I wake up in the morning, or when she wakes me, impatient to be fed, I find that she has come back to sleep beside me sometime during the night.

I love her. I have no children, and I live alone. She greets me each evening as I come through the door, tired from a long day at work and a long commute home. I talk to her as I put food in her bowl, and then, when she has had her fill, I pick her up and hug her and give her little kisses.

I can't imagine coming home to an empty apartment when she is finally gone. I had a scare with her last month: she ate very little for several days, and I made an appointment with the vet. But she was drinking and using the litter box, and she was active, not lethargic. Then her appetite returned to normal and I cancelled the vet appointment. She's been fine since.

But one night, when she wasn't eating, I cried myself to sleep. Thoughts filled my mind that this was the beginning of the end, that she was terminally ill and that's why she wasn't eating. That I was going to lose her.

How do these little creatures become so important to us, so crucial to our physical and mental health and well-being? I don't know. They steal their way into our hearts, whisker by whisker, and take hold of our souls with a steel grip that is never loosened.

Zoe, have I said thank you?

 
