Channel: Psychology Today

Young Men and Campus Violence


A great deal is being written and published in the mainstream press about males and sexual assault, especially young men on university campuses. I will leave it to the sociologists to explore the patterns of intimate partner violence, including the incidence of female-on-male sexualized aggression as compared to male-on-female sexualized aggression. It bears noting, though, that while it may seem surprising, the rates are about the same for both sexes. The widespread overuse of alcohol among high school students and undergraduates of both sexes is an important element in this, as are changing values regarding what constitutes acceptable comportment in society at large. Television reflects and reinforces a society heady with violence. All the while, it is important to bear in mind when studying these patterns that the individuals concerned remain psychosocially immature for much longer than young adults were even one generation earlier. Age 25 is the new 18. This leads me to my topic.

 

The question regularly comes up whether there is a greater degree of innate aggressiveness in males than in females. This is important from a psychological point of view, since it would suggest, if true, that males are predisposed from birth to greater violence. After a few preliminaries on this question, we can consider the issue of sexual assaults on campus from a psychological perspective.

 

It is well known from infant observation and almost a century of child study that, at birth, boys are more sensitive to stimulation than girls. They are more edgy and responsive. Concurrently, they are more vulnerable to serious physical illnesses. By the time they reach early and middle childhood, boys are more kinetic. More active, their play style exhibits centrifugal patterns: they go off and away in seemingly random directions, tending not to congregate with other boys in organized activity (including play) until later than girls do. This busy, active, exploring tendency is often enough read as recklessness, and so begins the general impression of boys as more restless, less manageable than girls—and more aggressive. About a year behind girls of the same chronological age developmentally, and on a later daily waking pattern, boys are more difficult to organize, control, and settle into sedentary activities in school. They are therefore punished more often. All this is well known.

 

Boys also engage in what is termed rough and tumble play much more often than girls. We are reminded of puppies or kittens scrapping, sometimes biting each other. We see the young animals sometimes overreaching boundaries. They bite or swipe too hard. Then they stop, realizing that their eagerness and excitement have become excessive. Are they angry at each other, even when they tag a play pal with too much force? Not at all. And what about the boys? Is rough and tumble play a form of aggressiveness? Are boys angry at each other as they challenge, chase and wrestle one another? Not at all. No more than young animals. Read as displaying violent behavior, however, boys have been caricatured as aggressive, when in fact they are vigorously playful physically.

 

The tendency to rough and tumble play is both discouraged and exploited. Here boys are put in a double bind. They get the message early on that they must not be too active in certain settings, yet that very activity is sanctioned and encouraged in other settings; for example, in violent sports such as football. And if there is one sport that is quintessentially American, it is football. Boys and adolescent males are groomed to be tough and rough, but the play now is no longer spontaneous. Lines of offense are set up on the “playing” field and boys are trained to hit. Later on, for some, participation in boxing and even mixed martial arts is encouraged. Boys become immersed in the myth of the powerful, dominant male as they are pressed to fight one another, as preparation for possibly being called upon to fight for real in the military.

 

But is there a natural line of development from hyperkineticism—the greater liveliness of boys, who enjoy running as fast and far as possible and engage in rough and tumble play—to the focused hitting of violent sports and preparation to be a combat soldier? Not at all. Most boys would rather not fight; shaming, however, is more powerful than their natural inhibition against directed fighting when boys are encouraged, even taunted, to show their dominance. In armed combat, it is well known, most soldiers have found it difficult to kill another man, except in immediate defense of their own lives.

 

Training that exploits the tendency to rough and tumble play and transforms it into violence directed at other males cannot but generalize to behavior with everyone, females included under certain circumstances, especially when shaming is once again in effect. When there is more aggressive behavior in males, it has been inculcated and rewarded. Meanness and the capacity to go after another human being cross the border between the sexes and, in some cases, are sexualized. This brings us to the issue at hand: violence against women on college campuses.

 

It should strike everyone as remarkable, given the double bind young males are put in, that there is not more violence against women. That incidents of such behavior occur more often involving college and university athletes should come as no surprise. Taught to hit on the field or court or on the ice, males are more likely to default to such behavior in situations in which controls of conscience and decorum are weakened—when intoxicated, for example. Or when they are shamed.

 

In a culture where hitting was not encouraged and rewarded with recognition, hero status, and scholarships, there would be less reason for boys to allow themselves to become accustomed to being aggressive. They are not so by nature. As we have seen, rough and tumble play is, well, play. Its motivation is excitement and pleasure in activity for its own sake, so-called functional pleasure. Developmentally, play segues into work, which produces its own functional pleasure, as Erik Erikson showed in the middle of the last century. Hitting that leads to an increase in status comes from a very different source.

 

Now how should we understand campus violence when perpetrated by males? First, we should recall that often the female partner has slapped or hit first. (That has its own social meaning and psychological significance.) Second, we should wonder: What recourse do ever less mature young males have when struck, especially in sexualized situations where performance masculinity is on the line for them and shaming is the consequence of not meeting expectations? Are males more sexually assertive than females? Certainly, they are expected to be, and there are deep evolutionary reasons for them to be so. Male sexuality has elements of capture, seizure and control of the female body that cannot be denied. Can partners share in taking the lead sexually? Certainly. But this takes experience and time for both partners and is not in the repertoire of most teenagers—male or female. And to repeat: Prolonged or protracted adolescence is now the rule rather than the exception. Age 25 is the new 18.

 

Let us understand the social setting of the relatively few incidents of sexual assault on campuses, given the psychological preparation boys and young males have experienced. We should expect many more, but happily there are very few such incidents. The broader question of whether sex (intercourse) is for the male a qualitatively different experience than it is for the female is again easy to answer. Yes, it is. But is that difference (assertiveness, control of the situation, performance expectations that cannot be met by faking it) a matter of innately greater aggressiveness? The answer is clearly no. Are some males more disposed to being shamed into acting as though they were? It would seem to be so. This is very likely the quite small group who, when also encouraged and rewarded for being seriously aggressive rather than playfully active with other males, may sometimes lose control of themselves when taunted or hit by a female sexual partner. The hitting is the issue. Men: Don’t hit women, even though you may have been rewarded for hitting other human beings. Women: Don’t hit men. Putting hands harshly on another human being hurts, even if he is male.

 


The One Best Reason To Fall In Love


There’s a theory going round that we somehow gravitate toward romantic partners who enable us to resolve our unique childhood issues, whatever particular problems we couldn’t solve with our parents.

It’s a useful theory for motivating effort to work it out with our particular partners. After all, if we chose them for their unique capacity to help us, they must be right for us, even if they’re frustrating. Our partner choice is just the universe trying to teach us the unique lessons we need.

I’ve long doubted the theory’s accuracy because I don’t think our problems are so unique. If there were 500 unique problems, then finding the partner who helped you with the one you had in childhood would be uncanny, but if there are only a handful of problems, it’s not uncanny at all.

These days I don’t think there are 500 or even a handful. There’s only one problem. We all deal with it over and over, not just in childhood and love but everywhere.

We all want to feel sustainably safe and free, and gravitate toward commitments that promise security without compromising our freedom. Still, one person’s bid for greater safety and freedom can be a threat to another person’s safety and freedom. When you say, “I want to be loved (safe) for who I am (free),” it can sound to your partner like a take-it-or-leave-it threat, like what you’re really saying is “I demand that you compromise to accommodate me.” When we don’t feel safe or free we feel threatened and fearful.

Some say that love is letting go of fear. I don’t believe we can just drop our fears, nor should we. Letting go of fear with a greedy narcissist will ruin your life. The trick isn’t letting go of fear but fearing where fear is justified and not where it isn’t.

Trouble is, it’s hard to tell where fear is justified, because fear and self-protection are often indistinguishable from greedy narcissism. Is our partners’ fierceness evidence of fear, or is it a selfish grab for what isn’t theirs? Hard to tell.

Like I said, this is a problem not just in childhood and love but everywhere. An arms buildup for self-defense is often indistinguishable from an act of aggression. Aggressors often claim they’re acting in self-defense. Even Hitler did. We end up in wars, big and small, catastrophic and petty, in which each side is sure the other started it, that we’re merely defending ourselves against the other’s aggression.

I’ve dated broadly, hundreds of people, partnerships lasting 18 years and dates lasting six minutes. Each relationship was similar in this respect: compatibility punctuated by escalating conflict in which we each played out the drama of our childhood and, more, the drama of the human predicament, competition for safe, solid ground in the face of threat.

My love life history is the story of my trial-and-error experimentation with ways to glide through the conflict more efficiently. At first I thought I’d find partnership peace only when I found a partner who, like me, could take responsibility for her fear-driven aggressiveness. In recent years I’ve recognized that I wasn’t as good at taking responsibility as I needed to be.

For a long time my quest was for a compatible partner. Eventually I realized I needed internal compatibility first. I needed to reconcile my demand for an equal partner with my gut’s desire for a partner who accommodated me by granting me an extra margin of freedom and safety.

There are right and wrong partnerships. I doubt I could ever make it with someone who never took responsibility for her fearful aggression. With her, I’d always be deemed the aggressor, which would make my fearful counter-aggression inescapable. I’m best with a partner who, like me, recognizes the human predicament and doesn’t see herself as exempt from it.

These are scary times, scary to us individually--jobs, status, and partnerships uncertain--and to us all collectively: Putin, global warming, the Middle East; terrorists, counter-terrorists, and governments paralyzed by the ambiguous conflict over who's the aggressor and who's just defending their rights appropriately.

In times like these we’d expect to see a rise in both fear and aggression, and that’s what we’ve got. It takes a whole lot of work to resist the lurch toward self-defense at each other’s expense, but what we get instead is lots of lurch, people poor at distinguishing between real and imagined threats, people proud to indiscriminately stand their ground against all threats, real or imagined.

Romantic intimacy is perhaps the hardest exercise in resisting the temptation to lurch. After all, it’s so intimate, so much skin in the game when we risk that much union with anyone.

Some of us just plain aren’t up to the romantic challenge. That’s OK. Whether in partnership or not, we all get practice working with this one universal issue. It shows up everywhere--at work, with our children, in our friendships, on the global stage.

Still, I end up thinking that everything I know about safety and freedom management and negotiation I’ve learned in love. I’m finally getting the hang of how to hang at close range without flying off the handle into aggressive self-defense.

Is Jodi Arias A Narcissist?


Is Jodi Arias a narcissist? In prior blogs I have examined and rejected many diagnoses that armchair psychiatrists have slapped on her. As I have discussed, no, she is not a sociopath (though she shares many traits with OJ Simpson), she does not have borderline personality disorder, and she is not a battered woman suffering from PTSD. The conclusion reached in those blogs was that she had no psychiatric diagnosis; rather: "She is evil and a murderer and now will be held accountable. Justice was served."

Not long ago Dr. Drew asked what my diagnosis of Arias would be if I had to give one. My answer was Narcissistic Personality Disorder (NPD), though she did not meet all the criteria. However, when it recently came to light that Arias might be defending herself in the upcoming death penalty trial, it was time for a recalibration. Who else but a pathological narcissist would lay her life on the line in order to bask in the spotlight one last time? So, it’s time for an analysis of whether Jodi meets the criteria for narcissistic personality disorder.

When Jodi was found guilty in 2013 of the first-degree murder of ex-boyfriend Travis Alexander, she was publicly despised for her arrogance and lack of remorse. She reveled in the media spotlight while describing her sexual escapades, sparred with the prosecutor, changed her story repeatedly and dragged Alexander through the mud, all in an attempt to convince the jury she was not guilty. It was agonizing and humiliating, a TV train wreck that had millions tuned in.

Even though she was in jail, Arias flourished and embraced all the attention -- negative though it was -- and took advantage of every opportunity. She sold her drawings via her own website and eBay, had a Twitter account and conducted jailhouse interviews. But now, with the death penalty trial looming, the stakes couldn’t be higher.

What is the mindset here? She has already been found guilty and this time she's literally fighting for her life. Who in their right mind would not want the resourcefulness of a savvy defense attorney to minimize the chance of receiving a lethal injection? Does she really believe that she can offer a defense even close to that of an established attorney?

Frankly, the logic here is irrelevant in Arias’s mind and the answer is quite simple: Arias craves the attention more than she fears death. What better way for her to get all the attention than by self-representation? Now she can show the world just how savvy she is without having to share the spotlight with a pesky lawyer. Does the high school dropout think she can trump the seasoned prosecutor? Or is Arias confident she will connect with at least one male juror -- which, by the way, is all she needs -- in order to escape the death penalty?

Arias is gambling with her life because she wants to be -- in fact NEEDS to be -- the star. She would love to face off with Martinez without any law experience, any college, or even a high school diploma. All she has to show for an education is a GED, which she obtained while in jail. Her two defense attorneys, Kirk Nurmi and Jennifer Willmott, have been ordered to remain as Arias' advisory counsel. But she didn't listen to them before. How much will she listen now?

With this latest piece of information in mind let's take a look at the nine diagnostic criteria for Narcissistic Personality Disorder from the DSM 5. Keep in mind that of the nine listed symptoms, five must be present for a NPD diagnosis. Also, I have not evaluated Jodi and the following is not a true diagnosis, merely speculation based on the evidence at hand.

• Has a grandiose sense of self-importance: The stakes could not be higher. Arias, facing the experienced and effective prosecutor Juan Martinez, is trying to avoid the death penalty, yet considers self-representation. Enough said.

• Is preoccupied with fantasies of unlimited success, power, brilliance, beauty, or ideal love: She once made a self-comparison to Albert Einstein, but eventually decided (and bragged) that she was much smarter than he was. Even considering self-representation in the death penalty trial proves that she truly believes she is smarter and more savvy than any qualified attorney. Add another check.

• Believes that she is "special" and unique and can only be understood by, or should associate with, other special or high-status people: Arias absolutely believes she is special but does not hang around with or seek out high-status people. Travis, like her other boyfriends, was nice but by all accounts just a regular, good guy. No to this one.

• Requires excessive admiration: Jodi's crazy has resurfaced, this time by self-representation. She truly believes she can pull it off -- and perhaps she can. It just takes one juror to side with her. She's hungry for the world to marvel at her brilliance, and she's going to feed that need no matter the cost. During her time on the witness stand, Arias was cool and defiant, almost taunting prosecutor Martinez with the graphic sexual questions and discussions -- all designed to garner the spotlight. Jodi thrives on being the center of attention. She wants to be admired (remember, she's smarter than Einstein), even if she's admired behind bars for the rest of her life.

• Has a sense of entitlement: Arias felt she deserved Travis and no one else did. She was extremely jealous when he showed anyone -- male or female -- attention. When he began seeing someone else, Arias slashed his tires -- twice. She also sent the other woman harassing emails. She wanted one hundred percent of Alexander's attention to be focused on her, and if it wasn't, she'd lash out, then lure him back with wild sex. Arias murdered Alexander and believes he deserved it. He was leaving her, after all (how dare he!), so she was justified in killing him. She has never and will never show remorse, because she can't. Check.

• Is interpersonally exploitative: Arias smothered Alexander with affection, but that wasn't enough to keep him. She hacked into his social media accounts and frequently looked in his phone to see who he was talking to. She played the victim (to bring out Alexander's protective side) by sending herself anonymous emails of a stalking nature. It was all a way to manipulate Alexander into staying in the relationship. Arias has a history of taking advantage of others. She stole her own grandmother's gun to commit the murder, then went out of her way to visit another man for an alibi. Yep, extremely exploitative.

• Lacks empathy: is unwilling to recognize or identify with the feelings and needs of others: Again, this relationship was all about Arias, not Alexander. The fact that he wanted out was irrelevant. She wanted him for her own, and therefore used whatever means necessary to win. When that failed a decision was made that no one else would have him. She coolly lied to the police as they questioned her whereabouts on the day he died without shedding a tear. To this day she fails to show even the slightest hint of remorse or any grief for the family. She truly has NO feelings for others.

• Is often envious of others, or believes that others are envious of him or her: Arias was jealous of Alexander and wanted to be the center of his attention. She was not envious of others, so No to this one.

• Shows arrogant, haughty behaviors or attitudes. If you want to see arrogant behavior from Jodi, take a look at social media activities she conducts from behind bars. She puts a hefty price on her "drawings", and there are those adoring fans who will buy them. She openly taunts Martinez, and her contempt for him is easily evident. You would think Arias would tone it down or at least pretend to have some sort of remorse for killing her boyfriend, but no. She is arrogant and defiant to a fault. Don't expect that to change any time soon. Haughty and egotistical attitude? Check and check.

In conclusion, Arias met seven of the nine criteria; only five are needed for a NPD diagnosis. She went through life thinking the world revolved around her, and that she was really something special. How dare Alexander attempt to break up with her! Hell hath no fury like a narcissist scorned.

Arias once embraced death, proclaiming "Death is the ultimate freedom, so I'd rather just have my freedom as soon as I can get it." She changed that tune, however, soon after her guilty verdict was read.

NOTE: Just before I was set to post this blog, Arias changed her mind and is now seeking representation. Some legal newscasters say Arias knew all along she wasn't going to self-represent; it was just a way to manipulate Judge Stephens into dismissing Nurmi. In either case, it does not change my assessment -- it is just one more example of her extreme manipulation at work, and the mere fact that she even considered self-representation speaks volumes. She also achieved the desired effect of basking in the spotlight yet again. The conclusion remains valid for a NPD diagnosis.

Judge Sherry Stephens, who granted the motion, warned Arias she would not be allowed to seek self-representation again. She denied Arias' request for a new lawyer, saying her attorneys would remain Kirk Nurmi and Jennifer Willmott. Arias is discovering that while she may be able to manipulate various men in her life, these tactics just don't seem to work on the judge.

Whether Arias sought to represent herself because she wanted to get Nurmi off her case or to create the media buzz that resulted, the fact remains that we are fascinated by Jodi Arias, and she in turn seems to thrive on that fascination. Recently, the eyeglasses she wore during her 2013 trial went up for auction on her website. Starting bid is $500.

Arias is fed by her supercharged ego, and she thinks she's the superior being while the rest of us are fools, an afterthought. After all, Jodi knows what’s best for her—right? To the narcissistic Arias, the infamy and notoriety of being a killer is so much better than fading off into the obscurity of life in prison.

You can read my other blogs on Jodi Arias here:

 

Could Jodi Arias and OJ Simpson Be Soulmates?

Jodi Arias -- Guilty, Murder 1: A Psychiatric Analysis

Is Jodi Arias A Battered Woman?

Does Jodi Arias Have Borderline Personality Disorder?

Is Jodi Arias A Sociopath?

Waking Up on the Wrong Side of the Desk—And Staying There


Most managers don’t give much thought to the experiences their employees are having right before they get to work. Maybe one employee sat in hellacious traffic and another quarreled with her teenage daughter. Someone else dropped a buttered bagel on his new shirt. Others spent time getting elderly parents ready for their daytime routine. Managers would do well to pay more attention to their staffers’ morning moods. My research with Steffanie Wilk, an associate professor at the Fisher College of Business at the Ohio State University, shows that start-of-day mood can last longer than one might think—and have a significant effect on job performance.

In our study, “Waking Up On The Right Or Wrong Side Of The Bed: Start-Of-Workday Mood, Work Events, Employee Affect, And Performance,”[i] we examined how start-of-workday mood serves as an “affective prime.” An affective prime—similar to the proverbial rose-colored glasses—is something in an environment or situation that orients you to see and respond to events in a certain way. Our work builds on research on affect (emotion) in organizations, a growing focus in recent years.[ii] In our study, we asked whether start-of-day mood, or “waking up on the right or wrong side of the desk,” could follow employees throughout the day and influence their work performance.

To investigate how start-of-workday mood plays out in the real world, we studied full-time customer service representatives (CSRs) in an insurance company’s call center over several weeks. The company already had detailed performance metric tracking in place, and because the CSRs were at their computers much of the time, we were also able to send them periodic pop-up surveys throughout the day to gauge how they were feeling. We studied their mood as they started the day, how they viewed work events such as customer interactions, how they felt after these interactions, and how this emotional response to customer interactions related to their performance. (We accounted for the reps’ temperament and the varying emotional content of the calls: for example, a customer directly reporting an auto accident might make for a more emotionally fraught call than speaking with an insurance agent.)

Both vicious and virtuous cycles emerged, linked to how employees felt at the beginning of the day. Reps who started out happy or calm usually stayed that way all day, and interacting with customers tended to further enhance their mood. For the most part, people who were already in a terrible mood didn’t really climb out of it, and felt even worse after interacting with positive customers.

Most importantly, we discovered strong performance effects when it came to quality of work and productivity. The employees who were in a positive mood provided higher-quality service: they were more articulate on the phone with fewer “ums,” verbal tics, and improper grammar. The employees who were in a negative mood needed to take frequent breaks from their duties to get themselves through the day. These small breaks piled up, leading to a greater than 10% loss of productivity.

How can managers use these findings to boost employee performance? Several interventions may nip in the bud what could be a damaging start-of-workday mood. Managers might send out gently humorous or morale-boosting emails in the morning, or hold a regular group huddle to provide support. Even just getting people to smile goes a long way. Psychologists have identified a phenomenon known as facial efference: the act of smiling, even if not spurred by feelings of happiness, actually causes people to feel good.[iii]

Managers can also allow people a little space first thing in the morning, for example to chat with colleagues before an early meeting. People also need time to “recover” or “reset” the night before,[iv] so managers may want to think twice before launching a late-night barrage of emails as this might set employees up for a bad start to the next day. And if an employee arrives a few minutes late, confronting him or her about it later on instead of immediately may yield a more productive conversation and a more productive workday.

Employees, for their part, may want to take a deep breath before walking in the door, creating an “intentional transition”. This might involve taking a different route to work, giving themselves a pep talk, or singing along to a favorite tune.

One interesting (and counterintuitive) finding was something we called “misery loves company.” Some CSRs who were feeling bad as they started the day actually felt less down after handling customers who were themselves in a bad mood. Perhaps this was due to perspective-taking in which they realized their own lives were not so terrible.

In any case, it’s clear that waking up on the right or wrong side of the desk does have an effect—for good or ill—on the quality and productivity of one’s workday. But, beyond strong coffee, there are several tactics managers and employees can use to help everyone make the most of the day.

References:

[i] Rothbard, N. P., & Wilk, S. L. (2011). Waking up on the right or wrong side of the bed: Start-of-workday mood, work events, employee affect, and performance. Academy of Management Journal, 54(5), 959-980. doi: 10.5465/amj.2007.0056

[ii] Barsade, S. G., Brief, A. P., & Spataro, S. E. (2003). The affective revolution in organizational behavior: The emergence of a paradigm. In J. Greenberg (Ed.), Organizational behavior: The state of the science (2nd ed., pp. 3–52). Mahwah, NJ: Erlbaum.

[iii] Zajonc, R. B. (1985). Emotion and facial efference: A theory reclaimed. Science, 228(4695), 15-21.

[iv] Binnewies, C., Sonnentag, S., & Mojza, E. J. (2009). Daily performance at work: Feeling recovered in the morning as a predictor of day-level job performance. Journal of Organizational Behavior, 30, 67–93.

Sonnentag, S. (2003). Recovery, work engagement, and proactive behavior: A new look at the interface between nonwork and work. Journal of Applied Psychology, 88, 518–528.

About Nancy Rothbard:

Prof. Nancy Rothbard is an award-winning expert in work motivation, teamwork, work-life balance, and leadership. She is the David Pottruck Professor of Management at the Wharton School of the University of Pennsylvania. She is also faculty director of “The Leadership Edge: Strategies for the New Leader,” a Wharton Executive Education program running November 17-20, 2014.

Prior to Wharton, Prof. Rothbard was on the faculty of the Kellogg Graduate School of Management, Northwestern University, and holds degrees from Brown University and the University of Michigan. She has published her research in top academic journals in her field, and her work has been discussed in general media outlets such as the Wall Street Journal, ABC News, Business Week, CNN, Forbes, National Public Radio, US News & World Report, and the Washington Post.

Toys Are Made for Dogs to Rip: the Wookey Hole Cave Massacre


Plush toys, like teddy bears and fluffy animals, seem to be among the favorite toys for dogs. Waggle one in front of your dog's face and give it a toss, and even dogs who have very little hunting instinct will usually race after the toy, grab it, and then, very often, start to dismember it. I often stop at local thrift shops and Salvation Army stores to pick up a few small plush toys, which usually sell for less than a dollar. These are then sacrificed to my dogs, who proceed to tear off their heads and limbs and scatter their filling about, much to my wife's dismay.

Because such toys are so irresistible to dogs, I sometimes use them in my dog obedience class, scattering them around the room and then asking students to heel their dog around the toys, while trying to keep their dog's focus on the task and not on the toy. One evening, during such an exercise, I watched a woman struggling with a young Labrador retriever who insisted on grabbing any toy which came within reach. In exasperation she finally called out to me and said "This is not fair. Even trained guard dogs sometimes find these kinds of toys too tempting to ignore." She then went on to tell me a story about a teddy bear, named Mabel, which had been owned by Elvis Presley and eventually fell prey to a fully trained Doberman pinscher named Barney. Her description sounded incredible to me so I decided to check it out. It turns out that the story was true. While this narrative may demonstrate how enticing such toys are for dogs, it also may be interpreted as a tragic tale of jealousy resulting in the murder of a plush toy.

The story unfolded in 2006 at the Wookey Hole Caves, a tourist attraction in Somerset, England. The caves have been used since the Paleolithic era, and various artifacts and skeletons have been found in them. In modern times one of the attractions is the Witch of Wookey, a hunched outcrop of rock that looks something like a human figure. The tale told about this rock is that it is the petrified remains of a sorceress who was turned to stone by a Glastonbury monk.

In addition to the caves there are a number of other attractions, including a collection of special and rare teddy bears. The collection attracts 300,000 people each year and is valued at $1.2 million. The insurance company felt that this set of toys was valuable enough to insist on hiring highly trained guard dogs to ensure its safety. Enter Barney, the Doberman pinscher.

The femme fatale in this story was a teddy bear named Mabel, made by the famous German toymaker Margarete Steiff in 1909. In the early 1970s Elvis Presley purchased it for his daughter Lisa Marie. Mabel normally lives at Maunsel House, a 13th-century manor owned by Sir Benjamin Slade, a collector of Elvis memorabilia. Sir Benjamin purchased Mabel for $75,000 at a Memphis auction, and she was lent to the Wookey Hole Cave exhibition.

The most valuable pieces at the exhibition are normally kept in glass cases. Since Mabel had just arrived she was on a work surface while she was being prepared to go on display. Barney's handler, Greg West, was on duty and saw Mabel lying there. Plush toys are almost as irresistible to humans as they are to dogs, and most of us reach out to touch or pet them when the opportunity arises. This was the case for West as well. A moment later Barney grabbed the bear and when the mauling was over Mabel's chest had a gaping hole and her head had nearly been torn off. But the trouble did not end there since the six-year-old Doberman pinscher was now in a plush toy frenzy.

Daniel Medley, the exhibition's manager described what happened next, "Once Mabel had been prized out of his jaws he [Barney] then went on a rampage. He was pulling arms off, heads off, and there was fluffy stuffing everywhere. Up to 100 bears were involved in the massacre. It was a dreadful scene."

Barney's handler Greg West struggled for some 10 minutes before he could get the dog back under control. At the end of that brief period approximately $35,000 worth of damage had been done to the array of toys.

So what happened to turn Barney the Doberman into a serial toy killer? Mister West tried to explain, "Barney has been a model guard dog for more than six years. I still can't believe what happened. Either there was a rogue scent of some kind on Mabel which switched on Barney's deepest instincts, or — it could've been jealousy. When it happened I was stroking Mabel and saying what a nice little bear she was."

Stanley Coren is the author of many books including: The Wisdom of Dogs; Do Dogs Dream?; Born to Bark; The Modern Dog; Why Do Dogs Have Wet Noses?; The Pawprints of History; How Dogs Think; How To Speak Dog; Why We Love the Dogs We Do; What Do Dogs Know?; The Intelligence of Dogs; Why Does My Dog Act That Way?; Understanding Dogs for Dummies; Sleep Thieves; The Left-hander Syndrome

Copyright SC Psychological Enterprises Ltd. May not be reprinted or reposted without permission

Where Did The Summer Go?


There has been a delay in my blog posts. A little ironic, I suppose, given the title of my blog, Don't Delay. The editors of Psychology Today gave the blog this title back in 2008; it was the first blog to deal with the topic of procrastination. In fact, I wouldn't say "don't delay." I delay all the time. So do you.

The thing is, not all delay is procrastination. Certainly my summer hiatus from blog writing was not procrastination. I had no intention of writing.

I have an intention today - a very clear one. I want to share some research results that are "hot off the press," as they might say in old-fashioned newspaper speak. This research delineates 6 kinds of delay, two of which we consider procrastination.

My most senior doctoral student, Mohsen Haghbin, is months from completing his dissertation. His research has involved creating a better measure of procrastination (something I'll write about later, pardon the pun), as well as a Delay Questionnaire, which he calls the "DQ."

The DQ was constructed using many carefully crafted scenarios about different kinds of delay in our lives. Based on participants' ratings of these scenarios and analysis of patterns in these ratings, Mohsen identified 6 types of delay: Irrational/Needless, Hedonistic, Inevitable, Purposeful, Arousal, and Delay due to Psychological Distress.

Quite a list, isn't it? My summer writing delay with the blog was part Purposeful Delay (I wanted/needed/deserved a break from my computer) and part Inevitable Delay (or unavoidable), as my children's summer holidays meant a significant change of focus for me as well. No procrastination.

In contrast, the Irrational/Needless and Hedonistic types of delay define the procrastinatory space in our lives. We're either putting things off when it's in our own best interest to act because we're using task avoidance to cope (putting things off to future self) or seeking immediate pleasure quite explicitly (no intentions made even on important tasks while we're busy having fun). In both cases, we're "giving in to feel good," typically at a long-term cost to our goals.

For the sake of completeness (while keeping this brief), I'll note that arousal delay is highly related to hedonistic delay, and it does remind me that although we don't work better under pressure, some people use this pressure to motivate themselves. And it's worth noting that Delay due to Psychological Distress is a special case, one that philosopher Al Mele argues exempts a person from the criticism of procrastination or weakness of will. As he notes in his writing on weakness of will, only the non-depressed agent can be considered to have a weakness of will per se. When we face psychological dysfunction such as depression, delay may best be understood as a symptom.

I think Mohsen's research is incredibly important, and I will return to it in this blog later in the year - another purposeful delay as I await the results of yet another student's research on the fallacy of the notion of active procrastination. In the end, I hope to clarify further what we mean by procrastination in relation to other forms of delay, because delay takes many forms in our lives, some quite sagacious and others quite self-defeating.

What kind of delay interests or troubles you? Understanding the difference is important.

Do You Know Your True Strengths?


Garrison Keillor said it best in his oft-quoted sign-off, "Well, that's the news from Lake Wobegon, where all the women are strong, all the men are good looking, and all the children are above average."

In the Christian tradition, it's good to be "humble," as in Matthew 11:29 where Jesus says “I am gentle and humble in heart.”

Humble people, psychology researchers argue, see themselves clearly, knowing their strengths and weaknesses and not exaggerating either. Such self-awareness isn't especially common. Overconfidence is a big problem.

Men tend to overestimate their ability as investors, compared to women, and the most confident rack up losses.

Prisoners see themselves as more honest than average. In one recent study of 85 people in a British prison (aged 18 to 34, most jailed for violence or robbery) the prisoners overall said that they were more moral, kind to others, self-controlled, compassionate, generous, dependable, trustworthy, and honest than an average prisoner--and the "average member of the community."

Can that be true? Probably not. So be careful if you think you're outstanding in one particular way. Unless you have plenty of proof, the opposite may be true.

One of the least compassionate people I'd ever known used to boast about her compassion. I knew someone who considered himself a "genius." He could do some technical things very well but in other arenas seemed to have trouble thinking. I often saw him go blank and forget details as if he were elderly, though he was only 45. He admitted that he was worried about his memory, and it seemed obvious that his sense of superiority was a way of consoling himself for his deficits.

As an editor, I get nervous if a writer tells me that his work is very good. Better writers have been humbled by the process. They're more likely to say, "I've done this well and these four other things not so well."  

This doesn't mean you should constantly criticize yourself. It implies that you should look for objective criteria. Set goals and meet them. Accept that you are average in lots of ways--simple math suggests that this will be true--and stronger in some respects and weaker in others. This may make it easier to tolerate other people's weaknesses and feel less envious of their strengths.

Opioids and Marijuana Laws


For several years now, pain researchers have been pondering a question that lay folks, including federal government regulators, might dismiss as absurd: the idea that marijuana, far from creating more problems for people who use opioids (narcotics), might, at least in some cases, help prevent opioid overdoses.

The notion took hold several years ago in California, when oncologist Donald Abrams, chief of hematology and oncology at San Francisco General Hospital and a cancer specialist at the University of California, San Francisco Osher Center for Integrative Medicine at Mount Zion, began testing marijuana in chronic pain patients in a federally funded study. Patients inhaled vaporized marijuana three times a day for five days while taking their regular opioid pain relievers. In 2011, Abrams reported that pain was significantly reduced, by about 27 percent, when inhaled marijuana was added to the opioid regimen.

This finding suggests that marijuana may have a synergistic effect, enabling pain patients to get good pain relief with lower doses of opioids. It’s too small a study to constitute proof, to be sure, but it’s a scientific hypothesis that seems well worth pursuing — except to the feds.

Fast forward to late July 2014, when a different team of researchers, using a different approach, supported the findings. Dr. Marcus Bachhuber, an internal medicine fellow at the Philadelphia VA Medical Center and a Robert Wood Johnson clinical scholar, led a team that simultaneously analyzed opioid overdose deaths from 50 states between 1999 and 2010 and tracked the implementation of medical marijuana laws in the 10 states that had them during that period.

The findings, reported online in JAMA Internal Medicine, offer another bit of tantalizing evidence about marijuana’s role in mitigating the potentially lethal effects of opioids.

The team found that states that had legalized medical marijuana had a 24.8 percent lower average annual opioid overdose death rate compared to states that hadn’t. In 2010, that translated to about 1,729 fewer deaths than expected.

The Bachhuber team acknowledged that the finding is an association, not proof of causality, but added that if the relationship between medical cannabis laws and opioid overdose mortality is substantiated in further studies, laws permitting the use of marijuana as part of a comprehensive approach to pain relief might make sense.

Dr. Abrams agrees. In early August, he told me that the new finding “is consistent with the reality” of animal studies and his own work in human pain patients showing that “cannabis potentiates opioids — you can get away with less opioids if you add cannabis.” Particularly for patients in pain at the end of life who wish to communicate with their families and have trouble doing so if on high doses of opioids, doctors can “wean them off [opioids] by using cannabis.”

Abrams suggested to the National Institute on Drug Abuse (NIDA) that the agency consider a study on marijuana as adjunctive (or auxiliary) therapy for people in pain. The agency refused his request. It is longstanding NIDA policy to fund only studies of controlled drugs for their abuse, not their therapeutic potential. Abrams called that policy “unfortunate and short-sighted.”

In an editorial accompanying the JAMA study, Dr. Marie J. Hayes, a psychologist, clinical neuroscientist and addiction specialist at the University of Maine, wrote, “The striking implication is that medical marijuana laws, when implemented, may represent a promising approach for stemming runaway rates of non-intentional opioid analgesic-related deaths. If true, this finding upsets the applecart of conventional wisdom regarding the public health implications of marijuana legalization and medicinal usefulness.” Hayes and her co-author, Dr. Mark S. Brown, caution, however, that if medical marijuana laws afford a protective effect, it is not clear why.

In an interview with ABC News, Dr. Igor Grant, chief of psychiatry at the University of California, San Diego, and director of the Center for Medical Cannabis Research, suggests a synergistic, or “opioid-sparing” effect, much as Abrams believes. “This isn’t a new idea,” he said. “Physicians have used combination drugs for a long time, such as acetaminophen with an opioid. By putting several different pain medications together, they are able to reduce the overall opioid dose, and thus decrease the risk of overdose.”

By extension, changing marijuana laws may help not just individuals, but bring public policy more in line with medical science.

(Originally posted on WBUR’s Cognoscenti)  


How the Asian Pop Culture Boom is Feeding Eating Disorders


In our book, Marcia and I wrote about the “Western toxin effect,” in which developing countries begin to experience a rise in eating disorders, often spread through exposure to TV and western emphasis on appearance and physical beauty. Now, however, that phrase seems quaint. In Asia, for instance, pop culture has been completely co-opted by western ideals of beauty, and the epidemic of eating disorders is full-blown.

I realized this when I came across a chapter in Euny Hong’s new book, The Birth of Korean Cool: How One Nation is Conquering the World Through Pop Culture.

Hong tells the story of how South Korea, with head-spinning rapidity, rose from a poor, much-invaded nation to a pop culture supernova, dominating the world through its movies, its K-pop style of girl and boy bands, its electronic products and its video games. But along with this rise to prominence has come an obsession with appearance and plastic surgery.

South Korea is now the world’s plastic surgery capital, accounting for more procedures per capita than the U.S. or Brazil. The most popular procedures are double eye-lid surgery (adding a crease in the eyelid to make it look larger, rounder and more western) and rhinoplasty, often to make the tip of the nose pointier.

The most disturbing part of this trend, though, writes Hong, is “the increase in the number of young children requesting surgery.” Plastic surgeon Dr. Sewhan Rhee says it’s common to see Seoul “middle-school children get plastic surgery during their winter school break. It’s not considered weird. It’s considered normal.”

Peer pressure and the desire to conform, those animating values of adolescent life, have resulted in “a surgical arms race,” Hong writes, “a one-upmanship among schoolchildren to look prettier.”

So it’s no big surprise to learn that South Korea has for some time now been seeing a rapid rise in eating disorders. As far back as 1997, in fact, Los Angeles Times reporter Sonni Efron wrote about the rise of eating disorders in Asia. In 2012 Georgia Hanias, in a Marie Claire article, “Anorexia: The Epidemic Japan Refuses to Face Up To,” reported that eating disorders were increasing more rapidly in Japan than anywhere else in the world.

The “toxin effect” has spread to young women of all socio-economic levels in other parts of Asia, including South Korea, Singapore and Hong Kong, even in countries where hunger is still an issue, such as India, Pakistan, and the Philippines.

In this YouTube video on eating disorders and thinness in South Korea, an American vlogger who covers the global invasion of Korean culture (or as it’s known in Korea, hallyu), notes, “K-pop has been hugely influential in the whole diet scene because people want to look like their favorite K-pop stars.” Many of these stars are known for their extreme diets. Popular looks include “chopsticks legs” or “lollipop head,” a big head fronted with a cute face and westernized eyes on top of stick-thin legs.

Korean culture is also one in which commenting on and even bullying others about their size, shape, and appearance is not taboo. Japan is no different. During my time living there, I got used to seeing friends or relatives greet one another with the comment, “Oh, you got a little fat, didn’t you?” Steph, the waif-like vlogger and host of the series “Hallyu Back,” recounts how she has been picked on by fellow teachers or students. “Any day I looked a little bloated,” she says, comments would range from “Oh, fat teacher” to “Are you having a baby?”

The comments on her post are plaintive and alarming: A viewer, likely Japanese, whose handle is Taeyu95 writes, “I really want to go to Seoul next year but my body is holding me back. I’m short and very fat. 159cm (5’2.5″) and 49kg (108lbs). I want to drop to 39kg… And K-pop is very influential to me starting dieting. I just don’t wanna be called fat in Korea.” Note how objectively normal sized, even thin, this person is.

 Another comment, from “Kpoping,” reads, “I’m 4’11 and weigh 102 pounds. I would call myself fat. So I try to do the kpop diets and ulzzang diets. I’m the biggest out of my friends but I’m also the tallest. Also my parents say that I’m big boned. And I have been influenced so bad by Koreans and kpop idols.” “Ulzzang,” by the way, describes the pale skin of certain K-pop stars; fans follow their diet tips (“Don’t eat too much meats! Meats turn u brown!”) in the hopes of turning as pale as their idols.

As you can see, the toxin has breached the hazmat suit. I wonder how long it will be, in our globalized world, before even the farthest reaches of the globe are no longer immune.

Insanity Pain


Don’t take your love away from me,

Don’t you leave my heart in misery,

If you go, then I’ll be blue,

‘Cause breaking up is hard to do…

                             -Neil Sedaka

 

If you follow my writings, one of the first things you’d notice about me is that I embrace everything there is to know about the human condition.  Frankly put, I love everything vibrant and human about me – the good and the bad – and I’m a man who wears his heart on his sleeve.  Transparency is important in my line of work (I’m a therapist who owns and operates a rehab); my clients can smell crap a mile away, so it behooves me to be honest with them whenever possible, and sometimes this means telling them about my own experiences, if only to defuse their shame.

If you are going through a bad time, chances are, I’ve been there.  And survived.

I spent much of my adult life in a drug-induced haze.  Almost every experience I’d ever had was lived while under the influence of some kind of narcotic or other.  And the few that weren’t propelled by drugs were certainly driven by alcohol.  I was a mess.  So, you can imagine what it was like for me when I got out of rehab and was thrown into the world and forced to have real life experiences without the buffer of a foreign substance in my system.  Add to this the disjointed education I’d had (because I was a drug addict whose system of values lent itself to ferreting out the heroin I needed instead of figuring out how I was going to be a productive member of society) and you can see how my being let loose on the world was a recipe for disaster.

And then she walked in.

I want to say that it was love at first sight, but the truth of the matter is, most of what connected her and me was just chemistry and great timing.  She was beautiful and she liked me as much as I liked her.  We were an amazing couple and everyone loved us together.  There was passion and excitement and laughter – I immediately made her the center of my universe.  I was in love with her and in love with life.

And then she dumped me.

Words cannot explain how devastating that was for me.  I was crushed.  I mean, here I was, a self-made man who had survived a harrowing battle with drug addiction; I was upwardly mobile; I had a lot of friends; I was popular… and yet, she’d decided I wasn’t the man she wanted to be with.

She’d decided I was no longer worthy of her time and attention (it’s a miracle I didn’t just hurl myself into traffic that very afternoon).  The pain was excruciating – it struck deep at the core of who I was as a human being and it was debilitating.

I remember being in a place where our mutual friends had gathered and tearfully telling everyone how much pain I was in and how she was the culprit.  I was halfway through my diatribe, explaining how sorry I felt for her because she was a woman who clearly had intimacy issues when she got up and abruptly left the room: I’d vilified her. It felt good – it felt like vindication – but I was still a broken man; I was still a man apart.

This pain was worse than the pain I’d suffered when my father had died -- and it was a pain that knew no boundary; it infiltrated every aspect of my life.  I used to have to pull over in my car on the side of the freeway and sob over this profound loss.  We were so spiritually connected and our love had run so deep, how could she do this to me?  And how could I get her the help she obviously needed to work on her issues so that she could find it within herself to take me back, just take me back, PLEASE?

My friends, when we would gather, would draw straws to see who would have to walk me back to my car and listen to my tale of grief and woe.  My nights were spent alone, with me struggling not to drink or use drugs over this ruined relationship and my days were spent in a cold, dead fugue of anguish as I performed autopsy after autopsy on our failed union.  Where had I gone wrong?  I wanted to jump off a cliff.

Now, it’s important that I interject here that this deep, incredibly epic relationship that devastated me and ruined me and threatened to destroy the life I had worked so hard to build was only four months old.

This relationship wasn’t years in the making.  It was not as if we’d done more than date for a while and had a little fun and shared some wonderful days and nights together.  WE HAD ONLY BEEN TOGETHER FOR FOUR MONTHS.

So, why was I such a wreck?  What, exactly, had been the cause of what I had come to call my “Insanity Pain”?  Because it really was, you know: It was crazy what I was going through over a relationship which, in the great scheme of things, was just a blip on the radar of my life.

But my feelings weren’t imaginary; something was definitely going on there.  And I needed to figure out what it was.

I needed to do the work.

Now, in putting myself back together, I realized that my perception of events was skewed, partly, by my own victimhood.  She wasn’t the one with intimacy issues; she was getting along fine in her life.  She had a very clear idea of what she wanted and she was honest enough to not string me along, especially since it was so apparent that what we had was not going to work out.

I was the one with intimacy issues.  I was the one who couldn’t let go.  I was the one whose every iota of happiness was dependent on how she felt about me.

I was the one who’d made her the center of my universe. 

That wasn’t a relationship; that, my friends, was a HOSTAGE SITUATION.  The reason I was attracted to her in the first place was that she was a wounded bird, and one of my character defects was a desperate need to rescue her; I wanted to take care of her – but she couldn’t take care of me. And I’ve discovered over time that I’m not the only one to have made this mistake.  I know this because we live in a world where phrases like “emotionally anorexic,” “avoidance addict,” and “love addict” exist.   We live in a world full of people who want relationships but often mistake sex and romance for intimacy.  Even worse, there are those men and women who have other ancillary issues (abandonment, oedipal, etc.) which get activated when we are rejected or are subjected to a perceived rejection.

The key to overcoming these pitfalls – the key to not letting a break-up destroy us – I think, is understanding why we need connection and knowing that we are capable of living without it.  I know that sounds like an oxymoron, but we need to look at the science of it first and then delve into the aesthetics.

We experience our first dose of oxytocin – the pair bonding hormone – as infants when we are fed and cared for by our parents.  This experience primes us for future interactions and helps us “sense” when deep bonds are formed.

Unfortunately, oxytocin is also produced by our bodies – in massive quantities – during sex and, sometimes even, simply when we are held.  We feel connected and we mistake this for intimacy, but the truth of the matter is, intimacy is the ability to be truly vulnerable with another person and have that other person truly know you.  And this comes with time and intense vulnerability.

I had neither with the woman who seemingly destroyed me.  This was a four month love affair that was passionate and exciting but was, in no way, the love of my life.

Insanity Pain is a killer.  It overtakes us and convinces us that we’ve lost something truly profound and everlasting when the truth of the matter is, some relationships simply run their course and end.  And sometimes, whether we like it or not, some relationships are simply wrong for us.

There’s a saying, “I don’t care how hot they are, someone somewhere has had it up to here with their garbage.”  And I think that sometimes this is true also.  I have many clients who find themselves in abusive relationships and elect to stay in them because they believe the pain of separation is going to be too intense.  Now THAT’S Insanity Pain.  It is a pain that convinces you that it is going to be so insurmountable, you are better off suffering than being free to seek fulfillment elsewhere.

I don’t subscribe to this notion.  I subscribe to the belief that we are, each of us, born with the right to be truly happy and adored.  I believe that real intimacy is more than just being willing to share the same bathroom.

Intimacy is a partnership.  It allows us to feel safe and protected; it allows us to feel valued and understood; and it allows us to trust the other person with our anger and know that they aren’t going to leave you for voicing your feelings.  Intimacy makes for healthy marriages and relationships, and it creates the kinds of bonds that you would kill or die for because you’ve worked hard for them, and risked vulnerability to create them.

I am pleased to report that when that woman dumped me, she did NOT destroy me.  I did the work and figured out what my part in all of that Insanity Pain was and I stopped letting it govern my emotional state.  I also opened myself up to other relationships down through the years until I met and fell in love with my wife.  And, I can say with certain conviction, that we have more than just fleeting moments of intense passion; she is my partner and my friend, and we have devoted our lives to making one another happy while raising our three children.  I share my pain with her, and my joy.  And I trust her completely with my feelings, no matter what they are, because I know she will not judge me for them or abandon me because of them.

And that, my friend, is intimacy.

As for the rest of you reading this, who are in the throes of a break-up or who are walking through your own unique versions of Insanity Pain, I give you this tiny bit of advice:

Your destiny has never been tied to anyone who left you.

 

Why We Can't Stop the Depression Epidemic


Comedian Robin Williams’s death in August rocketed depression into the headlines, and his suicide became a defining moment when the nation would finally reckon with depression.

But this reckoning never happened.

At first, most everyone had something to say about Williams’s death. Much of the reaction was intensely personal. The sudden loss of a beloved entertainer to a mental-health struggle spurred the famous and the not-so-famous to reveal their own experience of depression, their own brushes with suicide, often for the first time. Heartfelt, lovely, overtly confessional editorials and blog posts issued forth. Individual stories of suffering (and redemption) became part of the collective consciousness of the event.

Williams’s death also aroused discomfort and confusion. People hungered for an answer to the question: why did he kill himself? The search for an answer veered toward the very personal. Sometimes it felt like a feeding frenzy to find one man’s fatal flaw. Talking heads on Fox and CNN voiced psychiatric orthodoxy, referencing Williams’s “serious brain illness” or “genetic disease.” Other commentators seemed intent on blaming Williams’s death on vague and unnamable psychological forces referred to as “demons.” As time went on, most of the coverage filled in biographical details to explain his death: financial troubles, a career on the rocks, a recent Parkinson’s diagnosis. As these details filled in, it was striking how quickly people shifted their stance from “I can’t believe it” to “of course.”

In retrospect, the death of Robin Williams did not renew our dialogue about depression. It was not only that much of what was said was clichéd: “What a tragic loss,” “he suffered from a disease over which he was helpless,” “if only he had gotten the right treatment his life might have been saved.” It was that most of what was said was personalized, reactions to and reflections upon individual lives. Although his death set the stage for a larger, deeper, wider analysis of depression in America, this analysis never happened.

Such an analysis was, and is, badly needed. Because on that August day--as on every day in America--more than 100 people died by suicide and 2,500 attempted suicide. For all the wall-to-wall coverage of Robin Williams, there was no serious attempt to explain why the suicide rate for adults has increased 25% since 1999. No serious new proposals were launched to address the problem of depression.

And the national conversation about depression slowly sank back into its previous moribund state.

*************************************************************

Everyone knows, on some level, that depression is an important societal problem. Yet we have trouble connecting the dots. Best estimates are that about thirty-five million American adults will struggle with depression, nearly one in five people. Surely, someone you know is affected, whether it’s your teacher, your neighbor, your doctor, your friend. We tend to think of each person as struggling with unique problems. But focusing on the individual dots becomes a dangerous habit of mind when there are so many of them.

In fact, I don’t think it’s hyperbole to say that we’re the midst of a depression epidemic. Here are some reasons why such a strong term is warranted:

• Public health authorities recognize depression as a looming danger. According to oft-cited projections from the World Health Organization, the amount of disability and life lost due to depression will, by 2030, surpass that from war, accidents, cancer, stroke, and heart disease. For boys and girls worldwide, depression is already the number one cause of illness and disability, according to the WHO.

• According to our best epidemiological studies, depression is striking people at younger and younger ages. For example, the National Comorbidity Survey found that 18- to 29-year-olds were already more likely to have experienced depression than those sixty and older, even though they have been alive for less than half as long!

• Indeed, the situation on America’s college campuses is particularly troubling. In the Association for University and College Counseling Center Directors survey, 95% of counseling center directors reported seeing a growing number of students with significant psychological problems at their centers. A 2013 survey of college students found that 33% of women and 27% of men reported a period in the last year of feeling so depressed it was difficult to function.

• Antidepressant medication use has skyrocketed. According to the Centers for Disease Control and Prevention, antidepressant use has increased 400 percent since 1988. Eleven percent of Americans over the age of twelve take an antidepressant. In the United Kingdom, so many people are taking Prozac that scientists are concerned that active metabolites in human urine are running off into water and are now affecting the behavior of wildlife.

One paradox about depression is that its personal and economic toll has actually grown just as more research and treatment resources have been poured into combating it. How can it be that, despite all the efforts aimed at understanding, treating, and educating the public about this condition, rates of depression continue to rise? Why have our treatments plateaued in their effectiveness, and why does the stigma associated with this condition remain very much with us? Psychiatrist and New York Times columnist Richard A. Friedman recently wrote, “Of all the major illnesses, mental or physical, depression has been one of the toughest to subdue.”

Our highly personalized reaction to Robin Williams’s suicide, and our inability to sustain a national conversation about the depression epidemic after his death, provide important clues about why we have made so little progress in addressing this problem.

*****

Jonathan Rottenberg is the author of The Depths: The Evolutionary Origins of the Depression Epidemic. Follow Jon on Twitter.

Revising the Golden Rule for Lasting Social Harmony


The Golden Rule, in some form or another, exists in many cultures, religions, and ethical codes. While the specific form varies from culture to culture and belief system to belief system, the basic idea is: treat other people the way you want to be treated.

At first glance, treating other people the way you want to be treated seems to be a sensible maxim for congenial social living. Upon closer inspection, however, assuming that the preferences of others are the same as your own can be recognised as the problem rather than the solution to cohesive group functioning.

In some ways, we seem to have an intuitive appreciation of the problems with the current form of the Golden Rule. When I offer to serve people coffee, for example, I do not treat them the way I like to be treated. I like my coffee dished up as a double-shot espresso so, if treating people the way I like to be treated was the key to satisfying interpersonal relationships, I would, without needing to ask, provide other people with a double-shot espresso at the time that I wanted to partake of the pleasures of this pungent and bitter sweet flavour.

Typically, however, when people offer coffee to others, they first ask these other people how they wish to be treated. In other words, they ask the recipients of their coffee-providing gesture about their own personal preferences regarding the delivery of the coffee. If one of my colleagues prefers his coffee served up as a skinny decaf latte while another prefers hers to be an Affogato, I don’t make any attempt to persuade them that they are making wrong choices. Curiously, neither do I wonder if I have travelled down an incorrect coffee selection path and if, perhaps, I should adopt my companions’ ways of thinking.

Perceptual Control Theory (PCT; Powers, 2005; www.pctweb.org) explains how we function according to our own internal desires, dreams, wants, goals, attitudes, beliefs, and motivations. The fact that we all have subjective specifications for preferred states of the world is constant but what these stipulations are varies from individual to individual. PCT suggests an improved version of the Golden Rule:

Treat other people the way they want to be treated (Robertson & Powers, 1990).

We are designed to conduct ourselves according to our own, not other people’s, standards and inclinations.

If we are going to solve some of society’s most intractable problems, we need to be more considerate, respectful, and accommodating of each other’s different goals and preferences. Successful social living arises when individuals are able to achieve their own desired states of being without limiting, preventing, or otherwise interfering with other people doing the same thing.

In my book Control in the Classroom (Carey, 2012) I describe some general principles of curriculum delivery for every classroom and school based on the idea of treating people the way they want to be treated.

In Australia generally, and particularly in remote Australia where I live, there are long-standing disparities between Indigenous and non-Indigenous Australians on a range of different factors. A significant stumbling block in achieving sustainable improvements for Indigenous Australians seems to be that, generally, most of the current solutions to their problems are developed from the perspective of non-Indigenous Australians. Indigenous Australians’ ideas and opinions are often not sought and not heard by policy makers and planners.

Similarly, people experiencing severe psychological distress who are identified as having mental health problems often object to being forced to conform to the views and solutions of the clinicians who provide treatment to them.

Social groups will never experience lasting harmony when one section of the group is required to comply with the preferences of another section of the group.

Treat other people the way they want to be treated. The secret to successful partnerships, families, schools, clubs, communities, and societies is as simple and as complex as that.

References

Carey, T. A. (2012). Control in the classroom: An adventure in learning and achievement. Hayward, CA: Living Control Systems Publications.

Powers, W. T. (2005). Behavior: The control of perception (2nd ed.). New Canaan, CT: Benchmark.

Robertson, R. J., & Powers, W. T. (Eds.). (1990). Introduction to modern psychology: The control-theory view. New Canaan, CT: Benchmark.

The One Best Rule for Predicting People’s Behavior


Many of us don’t trust economists because their predictions are cold, biased or wrong. Still, the “dismal science,” as economics is sometimes called, may be the best source of psychological reasoning we’ve got, in part because it allows a cold neutrality that therapeutic psychologists can’t afford. Psychologists often sound more like inspirational speakers or priests counseling virtue. Economists talk less about what we should do than about what we really do.

Economists talk about money. Behavioral economists apply economic reasoning to all sorts of currencies--money, power, fame, beauty, efficiency and status, all sources of potential advantage.

Economists say “people don’t leave money on the table,” but that applies to all currencies. We rarely leave advantage on the table. If there’s a source of advantage dangling nearby, we tend to go for it, despite moral arguments against it.

Will we succeed in resisting the temptations of plastic surgery because it’s artificial? Will guys stop watching porn or using Viagra because it’s indecent or inauthentic? Will people forego driving their cars hither and yon because it’s bad for the environment? Will a relationship’s more powerful partner abstain from dominating because it’s oppressive? Will leaders refrain from exploiting their advantage for personal gain because it’s bad for society? Will the rich, famous and beautiful stay humble because that’s the better way to be? Will we stop eating bacon, that most delicious of foods, because it’s immoral or causes health problems?

As a general rule, no. A few might but not enough to make a groundswell of moral abstinence. Dangled advantage doesn’t go unexploited. When we see an opening, we go for it.

Compared to the vivid lusty temptation of advantage, moral doctrine is a weakling. Temptation whispers, “do you want me?” over and over, non-stop as long as it dangles. Morality responds “no!” but falters and then falls silent, replaced with rationalization. Temptation makes us pray for one good reason why it’s okay to take the advantage, and our prayers are inevitably answered.

Why do attractive people so often act like they’re entitled to more than everyone else? Don’t they have any moral sensitivity? Chances are, they have as much moral sensitivity as the next person, just more dangled advantage. They might resist a while, but the temptation is always dangling there, and little by little they slip into taking it with no negative consequences, so they slip some more. We all tend to take whatever advantages we can get away with.

Why, in divorce, do partners declare their exes manipulative, immoral narcissists? More often than not it’s simply because, despite good intentions, the ex had openings, opportunities to take advantage that can’t be resisted for long.

One night years ago, the couple was deciding what movie to see, and one partner stumbled on a rhetorical move that worked to persuade the other, a stumble no gut forgets since it worked so well, a move that will work in other negotiations of greater significance for years to come.

If you want, you can take advantage of the opportunity to declare yourself high-minded, exceptionally moral in a world of immoral beasts.  But you will do so at a cost to your powers of prediction.

If you want better guesses at what people, you included, will do, keep one rule in mind: sooner or later people succumb to available temptations when they can get away with it.

If you want better moral outcomes, don’t employ moral suasion but instead design systems that make it so people don’t get away with it.

And apply the same reasoning to your own morality. I have a rule I work to apply: I can’t change myself, but I can usually change my environment so it changes me. In other words, “I’m no better at resisting temptation than the next guy, so if I want discipline, I have to design my environment so it doesn’t dangle temptation.”

I have a porn blocker on my computer. I don’t bring lusty foods into the house. When I use too much social media I make it inaccessible. I don’t have a chair at my desk; I have a treadmill desk. And I make sure I partner with someone who doesn’t let me get away with much.

Quit Giving Your Kids a "Hall Pass" Through Life


With growing numbers of teenagers who ignore school, their parents, and their obligations, and who think they can just pass on by, it is time for parents to take a look at what is happening here. We are giving our kids a “hall pass.” In our quest to have them love us, or at least pretend to, we are ignoring the need to play tough and get them to understand that they have obligations and responsible roles to play in school and the family.

Answer honestly, as you are the only one in the room. How many of you ignore a poor grade and still give your kids the keys to the car so that they can go hang with friends instead of study? How many of you catch your kids lying to you about whether or not they really did talk to the teacher about some extra help? How many of you worry so much about them being socially liked that you will forgo a needed punishment to be sure they can attend a party or be with their friends? Let’s talk about drugs for a moment. How many of your kids smoke dope? How many of you demand that they stop and attach a consequence to that? You’d be surprised at how many parents turn a blind eye to what they believe is just the way it is right now. How many of you let your kids just leave the dinner table, or not even show up? This is the time that, traditionally, families shared their day and their issues. The real question is, how many of you really know your kids?

Why are we trying so hard to manufacture ways to win the love of our children when we do not have their respect? They lie to us, they do not meet their obligations, they ignore us, they go into their rooms for hours at a time just texting away, and we do nothing. Rather than start a disagreement, we have stopped setting rules for our kids, stopped paying attention to their grades, stopped giving them simple tasks at home, and allowed a total mess in their rooms and in their lives. Permitting texting as the only process by which they communicate, we have stopped actually talking to our children. We have fallen into a parental exhaustion trap that saps the energy we need to guide our children: we are kinder and gentler, but no longer specific in our expectations or tough in our demands. Does this mean we don’t love our children? No. It means we probably love them more, in our concern that they become better people, responsible people, decent students, and that they learn to love their own parents unconditionally because they know we are there to help, not hurt, the growing-up process.

Giving your kids a hall pass and giving up on what you expect from them are not the tools that are going to take them strongly into the future. They are playing you and it is time for that to stop. 

Milgram at 50: Is It Time for a Rethink?


We all know that humans are capable of committing unspeakable evil. Perhaps the pinnacle of this capability is genocide. After any particular group of people shows itself capable of enacting such extreme evil, we can come up with a slew of explanations that show why they could do such a thing, and by implication, why we could not. Maybe the people involved were mentally ill. Maybe they were brainwashed into believing that their victims were somehow subhuman. In the 1960s, some psychologists grappling with how the Holocaust could have happened came up with a much more disturbing explanation. Perhaps, they wondered, the human instinct of obedience to authority was so strong that anybody would commit acts they knew to be deplorable given a simple authoritarian order to do so. The idea seemed crazy, and many psychologists said so, but it was compellingly supported by a series of famous experiments conducted by Stanley Milgram. Those experiments, in which ordinary people were convinced to administer seemingly severe electric shocks to their fellow humans, created an uproar when they were first published 50 years ago. Then, and for decades afterwards, they were taken to imply that people do the bidding of those in authority regardless of the consequences of their actions.

But what if we’ve been too quick to accept Milgram’s version of events? What if it wasn’t the men in white lab coats that Milgram’s hapless subjects were obeying? New research in a special edition of the Journal of Social Issues by psychologists S. Alexander Haslam, Stephen D. Reicher and Megan E. Birney finds that it isn’t obedience that induces people to engage in questionable behavior, but rather a desire to participate in a worthwhile project (To read a free copy of the article, click here).

Milgram’s experiments can never be replicated—no ethics panel today would ever allow it—so Haslam, Reicher and Birney developed an ingenious substitute. They gave participants a choice of five negative words to describe people whose onscreen image they were presented with. At first the images were of offensive groups such as the Ku Klux Klan, but they became successively more pleasant. As participants began to push back against choosing a negative epithet for nice people, they were given updated versions of Milgram’s original prods: a series of phrases used to induce participants to continue if ever they seemed reluctant.

 “The results showed that the most successful prod wasn’t a command, but rather an appeal to engage in and advance a scientific experiment,” says co-author Haslam. “The lesson is that people are not programmed to follow authority; they make active choices and are accountable for their decisions.”

Elsewhere in the Journal, researchers used newly discovered archival findings and experimental methods to identify theoretical and empirical problems with Milgram's account. They find that, rather than conforming naturally, people can and do disobey orders, and that authorities need to work hard to secure consent. Stephen Reicher, one of the co-editors of the special edition, commented, “Milgram showed us that ordinary people can do extraordinary harm to their fellows. Now, 50 years later, this volume showcases new understandings of how this happens, of when people choose to obey toxic authority and of how they can resist.” To read the table of contents of the entire journal issue, click here.


OMG, What a Slacker Mom!


Come on, we’ve all been seared by it. The evil eye from some officious grandma in the supermarket who knows better than we do about our kids. Even if our kids are “typical” and just having a bad day.

Take the frustration, anger, and shame you feel when your “normal” four year old has a meltdown in front of the Family Size Cocoa Puffs shelf and some know-it-all tells you how to fix your little problem. Then amplify it by roughly 8 bazillion decibels and pray the dingy floors of Stop and Shop will open up and swallow you whole. Or better yet that they will devour the know-it-all, with toothsome relish.

That’s what those of us whose kids are not what is thought of as typical face on a regular basis. Trust me, dealing with a thirteen year old’s emotional outburst in the aisles of the sporting goods store you knew you shouldn’t have agreed to visit is not much fun--for you or the thirteen year old, who will feel sad and ashamed afterward, because it only confirms his knowledge that he is “weird.”

(As an aside, most children who act out in highly atypical ways probably can’t help it. Nor can their caregivers necessarily prevent it--even if said children have been receiving therapeutic treatment for years and years.)

So here, in the interest of parents of disabled children everywhere (but especially those of us whose children and teens suffer from psycho-emotional dysregulation), are six assumptions you should never make about the parent whose kid makes YOUR kid look like an angel. (Aren’t you lucky we exist, though?)

1. What a crappy parent--obviously wouldn't know a limit if it smacked her upside the head. Time for some tough love, baby!

2. Look at the way that guy's coddling his child. It’s clear who wears the pants in THAT family.

3. Typical. Mom’s on the phone while kid is wreaking havoc in public. Probably setting up a pedicure for tomorrow.

4. Oh, ho! Your teen is losing it and you’re TEXTING? Damn, what ever happened to good old “family values.” (Bring back the Cleavers, I say!)

5. This child clearly needs help. Why haven’t his parents done something about these behavioral issues?

6. This kid is disrupting all the well-behaved kids’ fun (or learning, or whatever). Doesn’t her mom understand how selfish it is to bring a child like that to the playground?

I think these thoughtless comments speak for themselves. But because I have insisted on claiming the last word since early childhood (my mother can confirm this, should you need confirmation), I am going to add a few thoughts before signing off.

First, please don’t think I am accusing everyone whose kids are “normal” of insensitive behavior. (Although, as I think I stated in an earlier post, if yours are TOTALLY normal you might want to have them checked out, in case they are cyborgs or something.)

Some of you normal parents rock. So do some of you quirky parents with typical kids. And so on and so forth, in every possible combination and permutation.

Hey, I’m okay, you’re okay. It’s all good, right?

So then: some of you get it. But some of you make assumptions like the ones above and don’t even realize you’re doing it.

We know you don’t mean any harm (well, some of you don’t). But damn, does that moral judgment of yours hurt. Because disability is not a moral flaw. And if you think it’s hard to parent toddlers or teenagers, imagine what it’s like for us!

The final 75 of my way-too-many last words are these: when you see me and my kid and he seems to be falling apart, the person I’m talking to on my cell phone might just be his therapist. The texting? Actually, I am googling “what to do when your teenager has a panic attack in the produce aisle.” Or typing out “S.O.S!!!” to my poor husband Lars--who has seen far too many of those texts hit the fan while he’s on deadline at work.

Why Atheists Need an Afterlife


Now that I have two little grandkids, I think about the future more than ever. The far distant future. It’s painful to consider that these innocents I know and love so intensely are likely to be profoundly affected by climate change, not to mention the ever-present nuclear shadow.

Try this thought experiment:

Suppose you knew that, although you yourself would live a normal life span, the earth would be completely destroyed thirty days after your death in a collision with a giant asteroid. How would this knowledge affect your attitudes during the remainder of your life?

That’s what the author of a new book suggests we do. Death and the Afterlife (Oxford) by Samuel Scheffler is based on the Berkeley Tanner Lectures and includes commentaries by four additional thinkers. Scheffler uses a variety of philosophical arguments to make his main point, which is that it’s not only our own lives and experiences that make life meaningful to us.

“Few of us,” writes Scheffler about the thought experiment quoted above, "would be likely to say: ‘so what? Since it won’t happen until thirty days after my death, and since it won’t hasten my death, it isn't of any importance to me. I won’t be around to experience it, and so it doesn't matter to me in the slightest.’"

In fact, he suggests, a lot of what keeps us busy now, activities that mean something to us, would become less important to us in such a situation, in a way that confronting our own deaths wouldn't cause. He also discusses how other scenarios would affect us, such as if infertility became universal, so no more generations would be born.

NOVELIST'S VERSIONS

Another way to conceptualize such profound questions is to read T.C. Boyle’s short story entitled "Chicxulub" (now collected in his new book, Stories II). As the protagonist of the story and his wife anxiously await word about their missing daughter, he muses, brilliantly, about Chicxulub, the huge asteroid or comet that probably knocked out the dinosaurs.

"The thing that disturbs me about Chicxulub . . .  is the deeper implication that we, and all our works and worries and attachments, are so utterly inconsequential. Death cancels our individuality, we know that, yes, but ontogeny recapitulates phylogeny, and the kind goes on, human life and culture succeed us. That, in the absence of God, is what allows us to accept the death of the individual. But when you throw Chicxulub into the mix—or the next Chicxulub, the Chicxulub that could come howling down to obliterate all and everything even as your eyes skim the lines of this page—where does that leave us?"

What I find amazing, thrilling even, is to compare how many words it takes to lay out a hard-to-refute philosophical argument (as in Scheffler’s book) versus how fast it is to slam readers in the gut with an experience they can relate to instantly: imagining the death of their child.

I tried to do something like this, obviously in a much less gifted way than Boyle managed, in Kylie’s Heel. Here’s what Kylie thinks when something terrible seems to have happened to her only child:

At least now I no longer need to worry about my son or the fate of the world. When you have a child, you yearn for the world to thrive, but now all the looming catastrophes have lost their power to terrify. Fire, ice, fallout. The outcome of the game no longer matters to me.

Does that seem selfish to you? Surely it is, but isn't it understandable in the midst of grief?

Related to that, I read a recent post in the NY Times in which Gary Gutting, a professor of philosophy, interviewed Jay Garfield, who teaches philosophy and the humanities, about the "requirements" of Buddhism. Discussing a belief in future lives, Garfield says:

"This suggests one way for a Buddhist not taken with the idea of personal rebirth across biological lives to take that doctrine as a useful metaphor: Treat the past reflectively and with gratitude and responsibility, and with an awareness that much of our present life is conditioned by our collective past; take the future seriously as something we have the responsibility to construct, just as much as if we would be there personally."

Cool.

Copyright (c) by Susan K. Perry, author of Kylie’s Heel

Follow me on Twitter @bunnyape

Why Psychology Thinks You Are Average


This blog attempts to highlight one of the central conceptual issues facing the field of psychology today, which is the relation between the “individual” and the “aggregate.” The individual refers to the specific person, you, me, your spouse, your child, etc. The aggregate refers to a group, and especially group averages and statistical comparisons made based on those group averages. The central conceptual issue at hand can be phrased in terms of a question: what do aggregate data actually say, exactly, about specific individuals?

This question is central because much of mainstream psychology has adopted the assumption that one can simply apply aggregate truths to the individual. In other words, psychology essentially assumes that each individual (you, me, whomever) is explained by the “average” at the level of the aggregate. However, as has been spelled out in a series of powerful papers by Dr. Jim Lamiell, this assumption is seriously suspect.

Let’s begin by noting how common aggregate analyses are. They are everywhere when you look for them. That is, the design of much research goes something like this: Group A is compared with Group B on some variable, which could either be a natural category (e.g. gender) or a treatment intervention (e.g. cognitive behavior therapy compared with psychodynamic therapy). The dependent variable means and standard deviations of the two groups are reported, and a test of significance shows the difference between the means to be unlikely as the product of chance, upon which the claim is made that there is a general and lawful relationship between the group difference and the outcome difference. This procedure of inference is so common that it can be considered the way mainstream psychologists derive “truths” about their subject matter.

For clarity, let’s work with a specific example. Not long ago a popular Psychology Today blog reported on research that showed that fathers who had daughters showed a decrease in their support for traditional gender roles compared to fathers who had sons (and men without children). In one such study providing support for this claim, researchers analyzed gender attitudes in over two thousand men and women over time both before and after becoming a parent, and based on the data they gathered, they concluded that “having a daughter (vs. having a son) causes men to reduce their support for traditional gender roles, but a female child has no such effect among women” (Shafer & Malhotra, 2011, p. 209).

I recall noticing this finding when it was reported on PT because it did not resonate with me personally (although I was not particularly surprised that it was true at the level of the aggregate). A bit of background offers a context as to why. My first child, Sydney, is a daughter (now fifteen). What caught my attention is that I personally would be very surprised if MY attitudes toward gender roles were more traditional 17 years ago, prior to her birth. Why? Because fifteen to seventeen years ago I was on the tail end of my deep immersion in—and devoted commitment to—a strong feminist ideology. Although I still consider myself a feminist, I have “softened a bit” since that time regarding my knee-jerk reaction against any kind of gendered ideology. As a function of this personal history, although I would currently score very low on the scale the researchers used to measure traditional gender role ideology, I would likely have scored even lower prior to the birth of my daughter.

One interpretation of the facts above would be simply to say that I “am an exception to the rule” discovered by the researchers. After all, most men have not been deeply immersed in feminist ideology just prior to the birth of their first daughter, so my unique history might have created conditions that resulted in an exception to the rule.

But this interpretation gives rise to the question of what exactly is the rule and to whom, exactly, does it apply? To answer this, let’s first take a look at the precise data the authors used to make the claim that when fathers have daughters it influences their ideas about gender. The authors report on the aggregate differences for the 1000+ men who took the measure before and after having a child. For men who had a girl, the mean pre-post difference was 0.65 of a point on the scale. So, the “rule” generated from the aggregate analysis is that for fathers having daughters, their self-reported endorsement of gendered roles decreases just over half a point on the scale the authors used.

Yet finding an aggregate change of 0.65 of a point on this scale does NOT mean that all the men or even most of the men moved 0.65 of a point. Interestingly, we can KNOW that NONE of the men actually changed 0.65 of a point. Why? Because the scale is scored in whole numbers; the smallest difference that could be measured is a change of 1.0 point. This reality makes concrete a point that Dr. James Lamiell makes over and over in his writing, which is that aggregate differences DO NOT warrant claims to knowledge about all individuals within the aggregate. Indeed, what is found for the aggregate might actually represent something that applies to no one. He writes:

“But as far as we can justifiably claim to know on the basis of the statistical 'rule,' every single individual in the designated population could be an exception to the statistical 'rule.' In other words, the problem is not that the aggregate 'rule' won't apply to every one. The problem is that the aggregate rule cannot knowably be said to apply to any one. The rule quite literally applies to no one.”
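The whole-number point can be made concrete with a toy calculation (hypothetical numbers, not the study's actual data): if 65 of 100 men each drop exactly 1 point and the other 35 don't move at all, the aggregate mean change is 0.65, a value that matches no individual's actual change.

```python
# Toy illustration (hypothetical numbers, not the study's data): on a
# whole-number scale, each man's pre-post change is an integer, yet the
# group's mean change can be a fraction that describes no actual person.
changes = [1] * 65 + [0] * 35   # 65 men drop exactly 1 point; 35 don't move

mean_change = sum(changes) / len(changes)
print(mean_change)  # 0.65 -- the aggregate "rule"

# No individual exhibits the mean change:
print(any(c == mean_change for c in changes))  # False
```

The "average man" who moved 0.65 of a point is, quite literally, no one in the sample, which is exactly Lamiell's point.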

In short, average differences between groups DO NOT necessarily reveal causal forces that apply uniformly to individuals. Indeed, the “average man,” who is supposedly revealed in aggregate differences, is essentially a mathematical fiction. Interestingly, this point has been known for a very long time. In 1867, the philosopher and mathematician Moritz Wilhelm Drobisch (1802-1896) wrote:

“It is only through a great failure of understanding [that] the mathematical fiction of an average man . . . [can] be elaborated as if all individuals . . . possess a real part of whatever obtains for this average person” (Drobisch, 1867, quoted in Porter,1986, p. 171).

Ultimately, exactly what aggregate results tell us and their implications for individuals is an extremely complicated issue, with many different angles and caveats. The most important take home point from this blog is to be reminded that one CANNOT glibly apply findings from the aggregate to the individual. It is a point that mainstream psychology has too long been in denial about.

References

Shafer, E. F., & Malhotra, N. (2011). The effect of a child's sex on support for traditional gender roles. Social Forces, 90, 209-222.

Porter, T. M. (1986). The rise of statistical thinking, 1820-1900. Princeton: Princeton University Press.

Recommended Readings by Jim Lamiell

Lamiell, J. T. (2003). Beyond individual and group differences: Human individuality, scientific psychology, and William Stern’s critical personalism. Thousand Oaks, CA: Sage Publications.

Lamiell, J. T. (2000). A periodic table of personality elements: The “Big Five” and trait “psychology” in critical perspective. Journal of Theoretical and Philosophical Psychology, 20, 1-24.

Lamiell, J. T. (1998). ‘Nomothetic’ and ‘idiographic’: Contrasting Windelband’s understanding with contemporary usage. Theory and Psychology, 8, 23-28.

Lamiell, J. T. (1981). Toward an idiothetic psychology of personality. American Psychologist, 36, 276-289.

Low Testosterone, Obesity and Alzheimer’s Disease Are Linked


As men get older, two things usually happen: testosterone levels fall and the waistline grows. Testosterone levels drop by about one percent every year after age 30, while obesity among the aged has grown from about 8% in 1980 to about 25% today. Both phenomena impair the normal function of the brain. Low testosterone has been associated with age-related cognitive impairment, which may be due to changes in how neurons interact with each other at the cellular level and whether certain neurons survive exposure to toxins or injury. Furthermore, low testosterone levels are a risk factor for developing Alzheimer’s disease. Among men with diabetes, those who also have low testosterone levels tend to show more neuropathies.

Obesity, particularly in the aged, is a major risk factor for the development of the dreaded metabolic syndrome that includes elevated blood lipids and insulin levels as well as glucose intolerance. Obesity also increases the level of inflammatory proteins and, ironically, produces endocrine changes that result in lowered testosterone levels. Obesity coupled with a high-fat diet has also been shown to accelerate a decline in thinking, learning and memory and the ability to pay attention.

A recent study published in the Journal of Neuroinflammation by scientists from the University of Southern California investigated the relationship between aging, obesity, low testosterone and the mental decline associated with dementia. Simply stated, a high-fat diet that leads to obesity will exacerbate the negative consequences of low testosterone upon brain function. The results of the study also indicated that one important feature of brain and body aging, i.e., the impaired regulation of inflammation, links the consequences of obesity and low testosterone levels to Alzheimer’s disease. Interestingly, their study did not find that low testosterone levels significantly affected body weight.

These findings suggest two possible therapeutic approaches: one, testosterone replacement therapy, is controversial; the other, a calorie-restricted diet, is not. Last year a study published in the Journal of the American Medical Association found that testosterone supplements place men at increased risk of death, heart attacks and strokes. Today, the only groups who support using these supplements are the manufacturers, who hope to profit from ignorance and confusion regardless of the risk to the men taking them. We all remember how well this anti-science approach once worked for the tobacco industry.

So what’s an older obese man to do? Do not take testosterone supplements! A much better solution is to lose the excess weight by consuming far fewer calories. Unfortunately, exercise alone is not going to be very effective because low testosterone levels are often associated with muscle weakness and atrophy. Today, an overwhelming body of scientific evidence across a wide spectrum of medical disciplines strongly argues that obesity accelerates the aging process, impairs overall cognitive function and, ultimately, is responsible for numerous processes that kill us. The good news is that the consequences of inflammation due to obesity likely develop slowly and require many years to be fully expressed. Therefore, no matter what your age, the sooner you lose the fat the sooner your brain and body can begin to recover. This risk factor is preventable!

© Gary L. Wenk, Ph.D. Author of Your Brain on Food (Oxford University Press)

TED talk: youtube.com

Share Your Best Narcissistic Social Media Stories


Earlier this week, I wrote a post about the Single Question Narcissism Inventory, which basically asks a person if they are narcissistic and trusts their answer. I wrote about how, in my own research, I'm interested in analyzing social media to determine if people are, in fact, narcissistic.

In response, lots of you shared interesting (and sometimes scary) stories of narcissistic people you know and what they have done. As I've mulled over those comments and emails this week, I thought it would be interesting to gather up some of the best stories of online narcissism and discuss them in a future post.

This post is my request to you, dear readers, for your stories. What are some of the most narcissistic things you have seen people do online? Your stories can come from Facebook, Twitter, Instagram – any social media site you like! Bonus points if it can be shared with a screen shot of what the narcissist posted*.

Don't worry - if I use something you shared, I will blur out any names and photos so the example will be anonymous. You can send me anonymized stories, too. Also, I won't be using any of this for my academic research. Everyone at my university operates under strict ethics board guidelines. I haven't sought or received approval for this kind of data collection and thus I can't use it in research even if I wanted to. So you can be confident that you won't be subjecting your friends, exes, or yourself to any research or analysis by sharing with me.

You can send your stories and screen shots over email to jgolbeck@umd.edu. Please put "Narcissism" in the subject line so I don't accidentally miss your message.

And remember - it's not just the current obsession with #selfies that indicates people are narcissistic. There are seven components of narcissism, and great examples of any of those will be much appreciated:

  1. Authority, which deals with leadership skills and someone's interest in holding power (often for the sake of having power – think Frank Underwood from House of Cards)
  2. Entitlement, which measures a person's expectation that they be given things they think they "deserve"
  3. Exhibitionism, or how much someone likes to be the center of attention
  4. Exploitativeness, describing how much a person will take advantage of others to get what he wants
  5. Self-sufficiency, or how much you rely on others vs. yourself (one of the more positive elements of narcissism)
  6. Superiority, a person's feeling of being better than those around them
  7. Vanity, which centers around physical attractiveness, especially a belief in one's own attractiveness

Next Monday, I will take a couple good examples from each of these categories and share them so we can have a discussion about how social media affects narcissism and how you can decide if and how to respond to it.

Thanks in advance - I'm looking forward to seeing all you have to share!

*If you want to take a screen shot, but don't know how, here's a link that will give you instructions: http://www.take-a-screenshot.org

Photo credit Aleera
