to wit, an Inarguably Wise but Jaded Forensic Psychiatrist Expatiating on Topics Medico-Legal, Historical, and Scientific, with a Few Non-Sequiturs for Good Measure
On April 18th, 2014, a gun said to have been used by Wyatt Earp during the famous shootout at (actually near) the O.K. Corral sold through a Scottsdale, Arizona, auction house for the princely sum of $225,000, well above its pre-auction estimate of ‘only’ $100,000 to $150,000.
Note that I said, “said to have been used.”
Wealthy aficionados always run up auction prices for the rare and unusual. Anything with a purported Earp or Old West provenance is certain to bring big money.
[sidebar: speaking of gunslingers in general, see my earlier post on the sale of the pistol of Bonnie Parker – of Bonnie and Clyde fame – at http://alienistscompendium.com/hybristophilia/]
A well-heeled collector from New Mexico, who was battling absentee over the phone, placed the winning bid for the .45 Colt Single Action Army revolver, the so-called Peacemaker model known from every western movie ever filmed. The Colt in question came from the estate of the late Glenn Boyer, an author of several books on Earp who collected Earpabilia until his death in 2013.
[image: the Peacemaker]
The gunfight near the O.K. Corral was actually a small event in a time and place known for not-infrequent barroom brawls and the public brandishing of weaponry. It really wasn’t until 1930 – the year after its subject died – when Stuart Lake published the then-definitive biography of Earp, that the gunfight began to assume mythical proportions.
[sidebar: the gunfight wasn’t the only thing that experienced an apotheosis; Earp too became a larger-than-life lawman thanks to Lake and, later, Hollywood, despite evidence that strongly suggests that he was an opportunistic con-man, pimp, and horse thief who skirted both the spirit and letter of the law more than once in his life]
In other words, a small law enforcement action in a backward town in desolate southern Arizona probably wouldn’t have drawn much notice at the time… and it’s uncertain if anyone would have actually paid attention to the weapons used in the immediate aftermath.
And predictably, its sale price notwithstanding, there exists some controversy about that auctioned Colt.
For one, the revolver appears to have had its grips and cylinder replaced, and the serial numbers rubbed off.
There was suspicion that Boyer tweaked the history in his tome to magnify the value of a gun already in his possession.
And further, two other academics, D.K. Boorman and Joseph Rosa, in their respective works, stated unequivocally that Earp carried a Smith & Wesson Model 3, and not a Colt Single Action Army, at the O.K. Corral. Even biographer Lake, who actually interviewed his subject, noted that Earp “preferred” the Smith & Wesson, though he was silent on whether that preference translated into possession on the fateful day in October 1881.
[image: S&W Model 3]
[sidebar: if true, Earp kept good company, as Jesse James, John Wesley Hardin, Pat Garrett, Teddy Roosevelt, and Billy the Kid were all said to prefer the Smith & Wesson model as well]
So why the outrageous price with so much uncertainty? Is there more to the gun than is immediately apparent? Or might such uber-wealthy buyers be more interested in (unsubstantiated) bragging rights than in the decidedly unglamorous research that should invariably accompany such relics?
No word yet on any buyer’s remorse.
~ ~ ~ ~ ~ ~
Addendum, but this time involving old wine: http://www.newyorker.com/magazine/2007/09/03/the-jefferson-bottles
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
When attending on a modestly-sized, community-based, high-volume psychiatric crisis unit, it isn’t always possible to immediately access the full spectrum of diagnostic resources that are available at the big tertiary centers. For example, if a patient is admitted to a large university inpatient ward and shows signs of forgetfulness, a battery of neuropsychological testing can be readily ordered as a first step to see if, in fact, the patient is suffering from measurable cognitive decline, and if so, to determine the best course of action. The academic centers, because of their focus on education and training, are quick to do ‘the million dollar work-up,’ and many patients who probably don’t need it are nevertheless blessed with the attention of numerous mental health sub-specialists.
But if you are in a small town, short-staffed, and have no neuropsychologist on your treatment team, you may have to rely on simple screening tools that can be administered at bedside; only if there persists evidence of cognitive impairment on such screenings would you then make the (sometimes outside) referrals to further delineate what is going on.
The Mini-Mental State Examination (MMSE) is a relatively quick 30-point questionnaire that examines cognitive facets such as short-term memory, word recall, object identification, and simple task performance. But if you’re backed up with six admissions, the ten minutes needed to perform an MMSE on each subject means an hour of extra work in your already chaotic day.
You’re blessed if you have a medical student or resident to do an MMSE for you, but if you don’t, you need the simplest basic memory/ concentration screening possible.
Just ask the patient where they are. Ask the day, date, month, year, and season. Ask the most recent holiday. And ask who is the U.S. President. No, this isn’t the most sensitive tool, but a person with delirium or dementia will usually stumble, and throw up the requisite red flag indicating the need for referral for more detailed examination.
In this current election cycle, though, I’ve added for fun one additional question of my own design: name any one person who is running for President (recall that at one point, there were more than 16 declared candidates between the two parties). For all but the truly addled, it’s nigh impossible to live in the America of 2016 and not be aware, even in passing, that primaries and caucuses are brewing.
In asking this specific question of hundreds of patients with every imaginable mental disorder over the past six months, I’ve observed a very interesting phenomenon.
Young. Old. All races. Every level of education. Both genders. Psychotic. Neurotic. Organic. On Rx or off. I hear it every day.
“Trump”
Now, there are variations. Sometimes it’s just his surname. Other times, unmistakable descriptors such as “the crazy guy with all the money, the fake tan, and the hair,” or “that dude who thinks the Mexicans are going to pay for a wall.” But there’s no doubt whom they mean.
Even the ones who I suspected had early dementia answered as the rest did.
A couple of times, I thought I had uncovered a heretofore unheard reply, only to have my hopes dashed at the very end with a compound answer:
“Rubio… and then there’s that guy Trump.”
“Christie… and that rich bastard with the Atlantic City casino.”
“Bernie… and that slick New York billionaire with the big mouth.”
“Cruz… boy I wish he’d put the Donald in his place.”
Only once – ONCE – in the past months did someone say “Hillary.” And then stop. I must have appeared expectant (“and…?”) as the patient looked at me quizzically, breaking my train of thought and resulting in the fumbling of papers.
My point in all of this? Probably nothing. And come November 2016 it’ll be back to the simple vanilla questions. But in the meantime, I can’t help but appreciate the late great P.T. Barnum’s old saw that “there’s NO such thing as bad publicity.”
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
[today’s post is sponsored by Lisa S. Kaplan RN, the best nurse practitioner with whom I’ve ever had the pleasure to work. As she is also skilled in those aspects of the time-space continuum not of this plane, what follows seems an appropriate article to which to affix her name… ]
“The Zombies Are After Brains. Don’t Worry, You’re Safe”
~seen recently on a coffee mug at the office
bon appétit!
Ask any teen, or horror movie aficionado, and they’ll tell you that the zombies of modern western pop culture – not those of Caribbean or African folklore – eat brains. That’s odd, because the cinematic masterpiece that jumpstarted the whole modern zombie craze, George Romero’s Night Of The Living Dead (1968), makes no mention of brain-eating. As a matter of fact, none of Romero’s six ‘Of The Dead’ films do.
So from where did this near-diagnostic facet of zombie behavior arise?
When asked, even Romero didn’t know. In a 2010 interview with Vanity Fair, he noted, “whenever I sign autographs, they always ask me [to write], ‘Eat Brains!’ I don’t understand…. I’ve never had a zombie eat a brain. But it’s become this landmark thing.”
He went on to say that while his zombies do feast on flesh in general, he is amused that people even care about the specifics of it all (i.e., if they actually have favorite body parts or cuts of human meat). He closed by asking rhetorically if the next question will be, “do zombies shit?”
Turning back the clock, the first mention of brain-eating – and a fleeting one at that – didn’t come until Return Of The Living Dead (1985). You’re forgiven if you thought that Romero had a hand in that film, but he didn’t. You see, like an amicable marital divorce, when Romero and his erstwhile collaborator John Russo parted ways in the 1970s on good terms, they agreed that all subsequent releases with ‘Living Dead’ in the title would be Russo’s, while those ‘Of The Dead’ belonged to Romero.
[sidebar: the two split over their differences re: zombies. Romero’s can be killed, whereas Russo felt that his should be essentially immortal]
So that 1985 release was Russo’s. Fans asked him about it vis a vis brain-eating.
He professed ignorance too about the etiology of the whole cerebrum schtick.
But his chief writer and director, Dan O’Bannon, once made a flip comment – one that would have unforeseen cultural consequences – that zombies probably eat brains to “ease their pain.” This was seconded by Bill Stout, the production designer of the 1985 film, who, when ambushed by interviewers, said that such an explanation “made sense” to him. Those with way too much time on their hands took these clues and offered that zombies are merely trying to boost their serotonin levels to produce the desired analgesia, and brains are a great source of that particular neurotransmitter.
Romero has expressed surprise/ amusement at the attention to such zombie detail, especially as he has noted repeatedly that the focus of his movies was always on us, and how we react to the zombies, not on the zombies themselves. He has frequently criticized those who “take it all too seriously.”
And although the definitive answer may never be known, it has been suggested by film and TV critics that neither O’Bannon nor Stout is directly responsible for the focused brain-eating craze. Paradoxically, Matt Groening of The Simpsons may have earned the honor of popularizing what is now universally held. And Groening ain’t talking.
You see, in his 1992 Halloween classic, Dial Z For Zombies (itself a parody of Return of the Living Dead), Groening had his cartoon zombies eat brains, perhaps as a nod to Russo, et al., or perhaps for entirely silly and comedic effect. But as Matthew Belinki of OverThinkingIt.com has since opined, “millions of kids saw [Dial Z For Zombies] before they were old enough to see a real zombie film. I suspect that for a whole generation, [the cartoon] was the first zombie story [they] ever saw. And that, my friends, is why we think that zombies eat brains, even though most of us have never seen a movie where this is actually the case.”
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
As the political season heats up, the perennial rumblings have once again begun that the media are “too intrusive” or don’t treat one candidate as fairly as another. I don’t worry about this much; there are as many publications on the left as on the right, and if one politician gets heat from a certain sector, you can rest assured that the politician’s opponents will be similarly scrutinized by partisans of the other camps. It can be loud and messy, but I’m convinced it all balances out in the end.
However, and invariably, as the political microscope becomes more focused, there will be more talk about what is ‘fair game’ for journalists. A politician’s family? Dumb things that person may or may not have done in college half-a-century prior?
With HIPAA and Protected Health Information (PHI) in mind… are politicians owed the same privacy as the rest of us? And if so, to what degree?
Would FDR’s polio be pertinent today to the landmark legislation he championed and his stewardship of the nation through WWII?
Should Thomas Eagleton’s history of depression have removed him from contention for the second-highest position in the land?
I read recently that, the week following his inauguration in 1961, John F. Kennedy appointed his personal doctor, Janet Travell, M.D., as presidential physician, marking the first time that a woman had held that important post.
This breaking of the glass ceiling, though, came with some additional baggage. Dr Travell had an impressive professional resume, including prestigious academic appointments in pharmacology, orthopedics, and cardiology. By then she already enjoyed an established reputation as a pioneer in the treatment of chronic pain conditions.
[sidebar: it is said to have been her recommendations on ergonomics that later resulted in the iconic images of JFK sitting in rocking chairs]
But when expressly asked about rumors of JFK’s health during the 1960 campaign, she stated that he did not have Addison’s Disease and that she had never treated him for same – both statements found after his death to be inaccurate.
In short, she lied.
Additionally – though this may be a reflection of the times and not as much the clinician – Dr Travell prescribed for JFK an astounding array of potentially habituating agents to treat his pain, including high doses of Luminal, Librium, Miltown, Laudanum, Meperidine, and Dolophine. Add to that his frequent, sometimes nightly, use of Nembutal for sleep. Though the Kennedy family credited Dr Travell with enabling a determined JFK to maintain his punishing schedule in the face of physical difficulties, Dr Jeffrey Kelman, who later researched and published a book on Kennedy’s health, has since stated that the president’s ailments would probably earn him Social Security disability benefits were he alive today. And as one who had seen combat, he’d also arguably be 100% service-connected through the Veterans’ Administration for such serious and chronic debilities.
All of this occurring concurrently with the Bay of Pigs and the Cuban Missile Crisis!
However, one might say, he successfully navigated those challenges. Yes, others will add, but what if his sensorium had been clouded by that potentially stupefying drug cocktail?
As physicians, we deal with the headaches of HIPAA daily, the near-constant concerns over aspects of privacy – presumed, expressed, implied – that any/ all practitioners can readily appreciate. We worry about minutiae like names on the spines of filed charts being visible from a distance.
The media has no such worries in reporting on the body politic. Or should it?
All points worth keeping in mind as we enter the nominating primaries.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
For Christians, December 25th by tradition marks the birth of Jesus of Nazareth, and therefore the start of a new year in their calendar.
[sidebar: that the Western new year actually begins a week after Christmas goes back to Julius Caesar and, by necessity of length, will be fodder for another post]
From where, then, do we get ‘BC’ (‘Before Christ’), ‘BCE’/‘CE’ (‘Before the Common Era’/‘Common Era’), and ‘AD’ (Anno Domini, or ‘In The Year Of Our Lord’)? That’s not as simple a question as it may seem; no one in what would later come to be known as, say, 10 AD called the year thus, since Jesus was by then merely an unknown pre-adolescent in Judea.
The Christian calendar got off to a rocky start, as the society from which it sprang, that of the Romans, measured the passage of time from pagan emperors and events. There were two competing Roman calendars: that of Ab Urbe Condita (‘From The Founding Of The City’), which counted from the founding of Rome (753 BC), and later that of Anno Diocletiani, created by its namesake (244-311 AD), which narcissistically measured time from his accession to the purple robe.
Diocletian fomented numerous persecutions of Christians. He particularly enjoyed Damnatio ad Bestias, what the Romans called the amusement of throwing Jesus’ followers to the wild animals. Little wonder, then, that those potentially facing the lions didn’t want to measure the passage of their lives in reference to the man who so hated them.
Fast forward several centuries. Christians, along with everyone else, had been forced by lack of reasonable alternative to use the calendar of Diocletian. For a while, some tried to employ an Anno Adami system (‘In The Year Of Adam’), but it was confusing, impossible to accurately measure, and never caught on. In 525 AD, though, a monk, Dionysius of Scythia Minor (Romania), was tasked with creating a liturgical table to determine on what dates Easter was to occur in subsequent years.
[sidebar: recall that Easter is the Sunday following the first full moon after the spring equinox, which is why it changes every year]
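For the curious, that rule can be reduced to arithmetic. Below is a minimal Python sketch of the widely published ‘anonymous Gregorian’ (Meeus/Jones/Butcher) computus – a modern formula for the Gregorian calendar, not Dionysius’s own Julian tables – offered only to illustrate how mechanical the calculation really is.

```python
from datetime import date

def gregorian_easter(year: int) -> date:
    """Anonymous Gregorian (Meeus/Jones/Butcher) computus.

    Approximates 'the Sunday following the first full moon after the
    spring equinox' for the Gregorian calendar (years 1583 and later).
    """
    a = year % 19                         # year's place in the 19-year lunar cycle
    b, c = divmod(year, 100)              # century and year-within-century
    d, e = divmod(b, 4)                   # leap-century corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30    # locates the paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

print(gregorian_easter(2016))  # 2016-03-27 (Easter Sunday that year)
```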
Dionysius decided to be rid of all association with the Christian-hating Diocletian once and for all. He is the first author of whom we know whose extant work measured time from Jesus’ birth. He listed the first year of his table as 532 Anno Domini; why he didn’t use 525 AD is unclear, but as modern scholars believe that Jesus was actually born sometime between 6 BC and 4 BC, not in 1 AD, Dionysius wasn’t actually far off.
But the Anno Domini system didn’t catch fire until after the Venerable Bede authored The Ecclesiastical History Of The English People (731 AD), and used it throughout his discourse. Bede’s writings were also notable for introducing the concept of ‘BC’ (what he called Ante Incarnationis Dominicae, or ‘Before The Time Of The Lord’s Incarnation’) and setting 1 BC to have been the year immediately prior to 1 AD, ignoring any potential Year Zero.
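The absence of a Year Zero still trips up date arithmetic today; astronomers eventually patched it by renumbering 1 BC as year 0, 2 BC as -1, and so on. A tiny illustrative snippet (my own convention-translator, obviously nothing Bede ever wrote):

```python
def astronomical_year(year: int, bc: bool = False) -> int:
    """Map BC/AD years onto the astronomers' scale: 1 BC -> 0, 2 BC -> -1, AD years unchanged."""
    return 1 - year if bc else year

# Because Bede's scheme jumps straight from 1 BC to AD 1,
# the span from 5 BC to AD 5 is nine years, not ten:
print(astronomical_year(5) - astronomical_year(5, bc=True))  # prints 9
```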
After Bede’s landmark tome, both Emperor Charlemagne (742-814 AD) and the Holy See (11th century AD) officially adopted the Anno Domini system to measure the passage of time. From that point forth, it quickly became widespread in Christendom.
[sidebar: for some odd reason, in English, ‘Before Christ’ didn’t appear in writing until the late 17th century – ‘Before The Lord’s Incarnation’ was used instead – and one doesn’t see the published abbreviation ‘BC’ until the early 19th century]
So that explains BC and AD, but what of BCE and CE? Are they strictly used by non-believers, just as Christians eschewed the use of Diocletian’s calendar? Not entirely (and to no small degree because the modern conservative political prism and the so-called War on Christmas were still years in the future!).
While BCE/CE have been popular amongst Jewish authors since at least the mid-19th century (when Rabbi Morris Raphall published his widely read Post Biblical History Of The Jews), the nomenclature’s use far predates the middle of that century.
The German astronomer Johannes Kepler adopted his own terminology, Vulgaris Aerae (‘Vulgar Era’), using it interchangeably with Anno Domini in his scientific treatises of the early 17th century. This was in part because, in Kepler’s time, the Latin root of ‘vulgar’ was closer in meaning to ‘common’ or ‘ordinary’ – that is, Kepler was merely employing a Christo-centric view of the civilized world. Later in that century, though, when ‘vulgar’ came to mean ‘uncouth’ in the non-academic English vernacular, many Western authors, staying true to Kepler’s intent but desiring to apply terminology that was not potentially pejorative, employed ‘Common Era’ in lieu of ‘Vulgar Era,’ both interchangeably with Anno Domini.
So despite what modern Christian apologists maintain – that ‘CE’ is short for ‘Christian Era’ – that is not borne out by the historical record. And over the very years it was designed to measure, reference to the Common Era has gained much traction in modern scholarly circles in an attempt to sever the documentation of time from its semantically parochial roots.
Merry Christmas!
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
“The past is never dead. It’s not even past.” ~Requiem For A Nun
I have always loved Faulkner’s oft-recounted quote, since it is true on so many levels.
With that in mind, here is an odd present-day story that started almost a century ago, and is neither dead nor past.
Adolf Hitler wrote the draft for his 720-page autobiographical manifesto, Mein Kampf (My Struggle), while imprisoned after the failed coup of 1923. It represented his vision and blueprint for a National Socialist world, and was not at first a best-seller when released in 1925 (9000 copies). Once Hitler rose to prominence, however, the Nazis mandated its distribution to soldiers, newlyweds, and schools nationwide, and it started to generate large sums in royalties. Over Hitler’s lifetime, it is estimated that the book sold 10M copies (and earned ~$430M for its author, adjusted for inflation, all of it tax-free since he was in charge and made the rules).
Fast forward to May 1945. Hitler was dead and the war was fast drawing to a close. Bavaria, as the jurisdiction of Hitler’s official residence (Munich), seized all of his property, including the rights to the book. None of Hitler’s distant surviving heirs cared to contest this confiscation. And through assertive de-Nazification efforts, the Bavarian government promptly prohibited the publication of Mein Kampf, now their book, anywhere in (then-West) Germany.
But of course, that had little binding effect on other countries, where the tome continued to be printed and sold to varying degrees, both by previously-licensed publishing houses and bootleg operations [strangely, it has enjoyed strong sales in both Turkey and India]. Those international licensees then generated royalties for the legal copyright holder – the reluctant Bavarian state.
[sidebar: Bavaria holds the copyright for most of the world, but things are a little different in the U.S. and U.K. More on that in a moment…]
So, what to do with the tainted gains? Bavaria started to quietly donate all proceeds to charity.
In the U.S., Houghton Mifflin purchased the rights to Mein Kampf in 1933. The U.S. government seized the copyright in 1942 under the Trading With The Enemy Act – even though Houghton Mifflin is an American company based in Boston – and amazingly held it until 1979, placing the $139,000 generated in sales over those years in the War Claims Fund. In 1979, with no fanfare or press release, Houghton Mifflin bought back the rights from Uncle Sam for $37,254, and then proceeded to pocket over $700,000 in sales over the next two decades. When this was publicly revealed in 2000, the chagrined publisher said that they were distributing the monies to charities that promote “diversity and cross-cultural understanding,” and a host of other things that Hitler would have hated. Still, many of those charities – the Red Cross amongst them – refused to take the cash, leaving Houghton Mifflin wondering if buying back the rights was such a good business idea after all.
In the U.K., Hurst & Blackett (Random House) had purchased the rights to a translated English version from Hitler’s publisher also in 1933, still retaining that right in the post-war years; as with the Bavarians, H&B gifted all proceeds to charity. Interestingly, the Jewish charities initially selected didn’t want the money, so H&B started gifting anonymously (and it remains uncertain if the recipients ever knew the source of the donations).
Under U.K. law, the copyright on Mein Kampf expired in 1995. And under both U.S. and German copyright law, Mein Kampf is scheduled to enter the public domain in seven weeks, on January 1st, 2016. But while that will sever any direct connection between the text and Hitler’s estate, publishers, or those who directly dealt with them, it doesn’t mean that the book will not still be printed and sold.
Meaning that, ninety-two years after first conceived, the hate-filled diatribe of a fallen dictator dead for seventy years is still churning out income… that no one wants.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
[Today’s post is sponsored by longtime reader and fan Carole Ann Thomas of Greenville NC]
It was just after midnight on New Year’s Day when the Cadillac crossed the Tennessee state line and pulled up to the all-night diner. The driver, a college kid named Charles Carr who had been pressed at the last minute into this job, asked the sole occupant of the back seat if he wanted anything to eat. That occupant, who had not been feeling well for several hours prior, is said to have declined the offer. The driver went inside, got a sack of burgers, and returned to the idling car to continue the arduous and icy all-night journey to Canton OH.
By the time the duo had passed Mount Hope and reached the outskirts of Oak Hill WV, sometime after 3:00 a.m., the driver noticed in the rearview mirror that the blanket had fallen off the seemingly resting passenger. He stopped to rearrange the cover given the bitterly cold night, only to find that his compadre’s hand was as cold as the ambient temperature. Yup, the man in the back was deader than yesterday’s fish by that point.
The car was a 1949 Cadillac Series 62 convertible. The corpse was that of 29-year-old Hank Williams, Sr., Hall of Fame and Grand Ole Opry star, and arguably one of the most influential musicians of the 20th century, with eleven Billboard country-western #1 hits then to his credit.
[image: Hank Williams Sr]
Williams was a known substance abuser, and over the previous twelve hours, is documented to have been drinking and taking the sedative chloral hydrate and possibly other prescription meds to assist him in sleep. His (not helpful) doctor in Knoxville had also given him a dose of morphine on top of all else before Williams started on the drive.
An autopsy documented the cause of death as myocardial infarction complicating preexisting congestive failure and severe substance abuse. But there were unexplained bruises on the body of indeterminate age, and for a while, suspicion fell on Carr, though no charges were ever filed. The matter was laid to rest amongst all but the hardest-core conspiracists.
And the burgers?
[image: the Burger Bar]
Carr said that Williams had declined his offer to eat when they pulled up to the Burger Bar in downtown Bristol VA just after midnight. But Carr did buy a sack-full, and whether Williams later partook of the snacks as they drove out of town is known only to Carr. In short, no one is sure if the Bristol burgers tipped an already-sick Williams over the edge into eternity.
But that possibility hasn’t hurt business any. I recently took my daughter to college in Tennessee, and on the way back, went through Bristol where we once lived. The Burger Bar is in full swing, capitalizing on its dubious connection to Hank Williams (if not exactly advertising the fact that the burgers might have killed him).
I’m a sucker for any roadside dive with history. The burgers are just as greasy and gooey as you’d expect from a grill of this age. If in town, don’t miss them… unless you’re on chloral hydrate.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
I attended the College of William and Mary for my undergraduate education. Back then, as a Phi Beta Kappa history major with a biology minor, I studied a lot. There were no girls’ schools nearby. I didn’t do drugs or drink to excess. It was difficult to get into too much trouble in that small picturesque tourist-laden town. So opportunities to road-trip and have some ‘real fun’ were not only few and far between, but much desired.
It was December of 1981, my second year in college half-over. Grades were on track. Things were starting to feel more relaxed. I breathed a sigh of relief. Perhaps it was not too late to start having some real fun.
Then the news hit. The Rolling Stones were playing the Hampton Coliseum on the final stop of their ‘Tattoo You’ tour on the 18th and 19th of that month, just after exams were finished. Unfortunately, to get to Hampton required a car… something that I didn’t have in college. One of the fellows across the hall in my dormitory did have a car, though, and the next thing I knew, he had landed six tickets to see the Stones, and I was offered one of them.
I was excited! The Rolling Stones, always the bad boys to the (initially clean-cut) Beatles and Elvis, had by then outlasted the Fab Four and the late King on the live concert circuit by twelve and four years respectively. The Stones were the premier act of the British Invasion still going strong. This was an opportunity not to be missed!
But fate had other ideas. Most college students wind up getting the sniffles around exam period, the effects of long hours, poor diet, and intermittent sleep. Some get it worse than others. I came down with a bad upper respiratory infection in the days prior to the concert. I even had to go to student health, something we all tried to avoid because of the long wait times. Needless to say, I felt awful when the day of the concert rolled around, and with great regret had to let someone else take my coveted place.
Med school and residency. Marriage. Children. Dogs. Relocations. Re-marriages. Jobs. Jobs. Jobs. They all intervened. That missed gig in Tidewater remained my only convenient chance to see the Stones… until this month.
I was at work three months ago when a fellow psychiatrist texted me that she had gotten tickets to see the Stones at Carter-Finley (the 58,000-seat NCSU football stadium here in Raleigh) in early July, and did I want a pair of the tickets that she had snagged? I had a flashback to 1981, and told my thoughtful friend that my wife and I would be thrilled to go with her and her group!
There was an inauspicious start to this plan, though. I excitedly texted the missus and told her I had tickets to see the Rolling Stones! She texted back that she was happy… and would have to Google ‘Rolling Stones’ to see if she knew any of their music.
Google the Stones?!?!
[sidebar: in her defense, she only immigrated to this country in 1994, but still, the Stones are known the world over. The Iron Curtain wasn’t THAT impermeable, was it?!?]
I planned on taking the day off work so that I wouldn’t have to worry about missing the concert in the early evening commute/ traffic jam. I read all that I could about parking and routes by which to approach the stadium. I had planned on scanning eBay for some Stones’ paraphernalia to wear on the big day. I showed the missus a t-shirt I wanted to buy, one with the large red ‘lips and tongue’ logo that the Stones have been using since the early 1970s. She asked me why I wanted to buy a KISS t-shirt.
[sidebar: this wasn’t looking too good; even my ‘golden oldies’-knowledgeable teenage stepdaughter rolled her eyes when her mother made such comments, and she wasn’t even BORN when I missed the Stones in Tidewater!]
Last night was the event. We left on time, got to the stadium parking without difficulty, and found a decent spot (though it was far from the exit, which, I knew, would make egress a nightmare when the concert was over). Everyone was tailgating. Grills. Cooking meats. Ice chests with libations. Though the youngsters were there in force, there was an equally large contingent of folks who, like myself, sported more than a few grey hairs. I kept thinking of that line from Don Henley’s ‘Boys Of Summer’ about having seen a Dead Head sticker on a Cadillac.
I found my friend’s car. She and several others were standing around drinking and eating snacks at the rear hatchback. Talk turned to work. And kids. And our various physical ailments. Really, you had surgery? How much did you lose on that diet? So-and-so retired/ died? I don’t remember you wearing those glasses? Then it was time to head to the stadium entrance.
The hill was long. I was sweating by the time I reached the gate. I was starting to feel sore. I asked my friend about the quality of the seats. She looked sheepish, and said that, though she had been made a special offer through her credit card company to buy these tickets in advance, one of our party had decided to come at the last minute, had bought his ticket only that afternoon, and had apparently scored a much better seat than did the rest of us (he bid us farewell as he veered off for the seats nearer the stage, while we hiked up into the section requiring supplemental oxygen).
[image: the logo, through a zoom lens]
Far removed from the days when big-name acts played small club venues, the organizers of today’s mega-concerts have developed a trick to fool those in the nosebleed seats. By putting up giant Jumbotron video screens around a site, one gets a clear picture of who is on stage, even though this is essentially like watching TV at home, only minus the comfort and nearby refrigerator. That was the case last night. I could clearly see the faces of the ants on the stage. But that was really the least of my concerns. The stadium bleachers were the most infernally uncomfortable seats I have ever experienced. Plus, much like flying coach on domestic airlines, the people were crammed in so tightly that it was a challenge to keep my knees out of the backs of those in front of me, or even stand to stretch (since it would be difficult to wedge myself back in the seat afterward).
It was hot and sticky. The crowd was loud. The wannabe warm-up band blared in the background. My butt hurt. The concessions were highway robbery, and the band’s merchandise was outrageously expensive too. My cellphone had no reception. The restroom lines were unspeakable. But hey, I was going to see the Rolling Stones, right?
At 9:30 p.m., the lights dimmed, and then in a technicolor explosion, the Jumbotrons flashed the red ‘lips and tongue’ logo, and onto the stage strolled Ronnie Wood, Charlie Watts, Keith Richards, and Mick Jagger. Or at least that’s who the TV showed me was on the stage.
They launched into their first number. Someone had turned the volume WAY up since the warm-up act. The seats reverberated. My pacemaker vibrated. I strained to figure out the song. Was it ‘Tumbling Dice’? Or perhaps ‘Brown Sugar’? Maybe ‘Midnight Rambler’? After close to a minute, I figured out that it was ‘Jumpin’ Jack Flash,’ but only because I recognized features of the beat, not because I could actually understand any of the lyrics.
By 11:00 p.m., the band was still going strong, but I was not. I couldn’t possibly be heard over the din to explain to my group that we were leaving, assuming that I could have even stretched over to yell in their ears. I told my wife we were going to do ‘the English departure’ (a term from the former Soviet Union for slipping out without saying goodbye to the hosts). I ‘went to the restroom,’ and she followed five minutes later. We made our way to the car. At least, I thought, we’ll avoid the total jam that will occur when all 58,000 fans head for the parking lot when the concert is finished.
Not exactly. Stiff and palsied, we still took a while to exit – a lot of those previously referenced grey-haired fans were making for the doors as well. And sadly, the sound quality was far superior in the parking lot, probably more than half a mile away from the stage. We should have saved the ticket price, paid for parking, and listened to the concert from outside!
Home by midnight, I fell fast asleep like the dead. I have no idea what time the concert ended, or what time those hold-out stalwarts actually made it home. But as I drifted off, I could not have cared less how many encores were played.
Thirty-four years after Tidewater, Mick Jagger is right. You can’t always get what you want.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
I’ve always held that it is unfair to judge historical figures, acts, and events through the lens of 21st century morality. The taking of underaged concubines of either gender was de rigueur for centuries amongst potentates of the Orient, Middle East, and African continent. Bear-baiting was a wholly accepted sporting event during the Stuart Restoration. Some of our Founding Fathers owned slaves, and there’s that mess with Jefferson and Sally Hemings with which to contend. ‘Idiot,’ ‘Imbecile,’ and ‘Moron’ were professionally employed psychiatric terms. It has only been within a few decades that the Ottoman murders of Armenians have been referenced as genocide. Remember poll taxes? Until recently, disenfranchising the poor was considered appropriate (wait, it still is). Children’s toys of the turn of the century were often overtly racist, depicting minorities as contemptible buffoons. And blackface vaudeville was common in early 20th century America.
Openly advocating such improprieties today would be unconstitutional and immoral at best, and a matter for the International Criminal Court at worst.
But societal mores do change, and there are few if any absolute constants.
Marketing and the almighty dollar? Well, that’s a whole ‘nuther subject.
You may think you know the story below, but trust me, you don’t.
It begins in the autumn of 1902. Mississippi Governor Andrew Longino was up for reelection, facing a primary challenge from a rabid white supremacist, one James Vardaman. Longino, though a Democrat, had invited the sitting President, Republican Theodore Roosevelt, to come south for some recreational bear hunting, knowing the chief executive’s proclivity for outdoor activities. TR, sensing an opportunity to help a moderate politician-in-need, even if a Democrat, and get away from the capital at the same time, enthusiastically accepted.
[sidebar: Vardaman saw this as a wholly political ploy, calling out Roosevelt with racial epithets and adding that he was nothing but a miscegenationist hell-bent on destroying the last vestiges of Confederate culture. And you think Obama has it rough?]
Included in the Roosevelt-Longino party were two local celebrities, Robert Bobo and Holt Collier. Bobo was a renowned breeder and trapper who brought with him fifty of his prized bear hounds. Collier was a former slave and scout for none other than Nathan Forrest’s cavalry, and said to possess the best nose for bear in the delta. Roosevelt, to the consternation of the white Southerners present, interacted with Collier as an equal and comrade-in-arms, being particularly impressed with Collier’s accuracy with his Winchester M1894 using either hand.
[sidebar: Collier’s service for General Forrest has often been used by Lost Cause apologists to illustrate that there were, in fact, black Confederates. Whether Collier’s time with Forrest was volitional in nature or not, we may never fully know, but see the above opening statement re: judging history through the lens of 21st century expectations, and proceed]
Anyway, the vast undeveloped tract on which the Roosevelt-Longino party decided to hunt – in Onward, MS, 30 miles north of Vicksburg along what is now state highway 61 – was then owned by railroad magnate W.W. Magnum, the man also known to have once imported monkeys to Mississippi in a failed effort to train them to pick cotton. But that’s a tale for another post.
This hunting preserve was thick with tangled underbrush, stunted pines, and canebrake. Progress on foot was slow. It was Collier with Bobo’s hounds who first picked up the scent of bear early on Saturday, November 15th. They tracked but found nothing at first. The party, having no other leads, and tired from traversing the demanding terrain that morning, returned to camp for a late lunch. But Collier persevered, and at 3:30 p.m., cornered an old 235 lb female black bear. Sounding his bugle to alert the president’s party back at camp, Collier and the dogs surrounded the bear near a watering hole. The bear may have been old, but she still had some fight in her, killing one of the dogs with a swipe from her claws and maiming several others. Collier smashed the bear’s skull with the butt of his Winchester and, while the animal was dazed, lassoed her and tied her to a tree trunk to await his colleagues.
Roosevelt, upon arriving at the watering hole, was disgusted. He found there a mortally wounded dog, several others seriously injured, blood everywhere, and a half-dead mangy bear tied to a tree trunk and gasping for air. Those present told TR that the honor of shooting the bear was his.
But Roosevelt was not cut from the same cloth as his contemporary outdoorsman, William ‘Buffalo Bill’ Cody (the man famous in part for indiscriminate mass killings of bison on the western plains). TR refused to shoot the bear under these circumstances.
[image: the Berryman cartoon]
The papers smelled a popular story. The Washington Post ran the now-famous Clifford Berryman cartoon and the ‘official’ line in its edition of Monday, November 17th: conservationist president upholds personal honor and refuses to shoot captive bear in an unsportsmanlike setting. What a wonderful guy that TR is!
[sidebar: each time the Berryman cartoon was reprinted in the days that followed, the bear was drawn smaller and smaller, until it was nothing more than a frightened cub]
A New York candy shop owner, Morris Michtom, saw the cartoon and article. He asked his wife Rose to make two plush toy bears, stuffed with excelsior and sporting black shoe-button eyes, and put them in the window of his shop with the sign, “Teddy’s Bears” (I’m not sure why it was plural, since there was only one bear.) This simple act, and TR’s later permission to use his name on the product, soon evolved into a new business venture for the Michtoms, one that eventually became the Ideal Novelty and Toy Company. Its fortunes soared on the strength of its Teddy’s Bear line.
Copied around the globe, today’s teddy bear has remained a staple of childhood memories now for over a century, all from the warm and fuzzy story of the kind president who refused to shoot a frightened captive bear.
But as the Late Great Paul Harvey would have said, now it’s time for the rest of the story, the part you almost certainly don’t know.
First, a post-script. As TR prepared to leave the White House after the 1908 election, there was fear in toy-land that the popularity of the teddy bear would quickly wane. They needed a new gimmick. The incoming President, W.H. Taft, wasn’t nearly as charismatic as TR. Taft was hugely fat, and his eating habits were the stuff of tabloids. At one banquet in Georgia during the campaign, Taft was served barbequed possum with potatoes, and apparently ate all of it and asked for seconds. The president-to-be, while wiping his lips, was quoted as saying, “I’m for possum first, last, and all the time between.” Toy companies decided to market what they called the Billy Possum, the incoming administration’s answer to Teddy’s Bear. It was to be a political symbol for adults, but one that could easily be made into a cuddly toy with which children across the country would play and contentedly fall asleep for years to come.
[image: Billy Possum]
Teddy’s Bear survived and thrived. Billy Possum failed miserably, though if you’re lucky enough to find a surviving example of the latter, buy it, as they can fetch well into the tens of thousands of dollars at auction.
But secondly, and of more importance, what became of Teddy’s actual bear?
Having put away his rifle that November day, the president instead handed his 14” Bowie knife to an aide and told him to put the bear out of its misery. The aide obliged, slitting the struggling bear’s throat as it vainly tried to escape. Butchered on site, the bloody carcass was brought back to camp, and the feasting began.
So much for the warm and fuzzy story trumpeted by the media.
Perhaps readers in the early 20th century would have seen nothing wrong with the conservationist president’s mercy knifing. Whether his directive was humane is open to question. Whether it should be judged by 21st century mores in the era of PETA is debatable. But the marketing people, even back then, knew that leaving this last tidbit OUT of the story of Teddy’s bear was probably good for business. They were correct.
Oh, and Andrew Longino lost reelection.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]
U.S. Supreme Court Justice Joseph Lamar died in January 1916. His otherwise forgettable tenure on the High Court remains notable chiefly for the succession battle his death unleashed. A week after Lamar’s funeral, President Woodrow Wilson asked the Secretary of the Treasury, his friend William McAdoo, whom he might suggest to take the late Justice’s seat. McAdoo offered the brilliant Louis Brandeis, a profound liberal thinker, attorney, and social activist of the day. Wilson asked McAdoo if he thought Brandeis could be confirmed. “Yes,” McAdoo said, “but it will be a stiff fight.”
Nobody could then have known how stiff that fight would prove.
[image: Brandeis]
For all of his admirers, Brandeis had an equal number of enemies. Wall Street regarded him as an untrustworthy radical. Conservatives saw him as a troublemaker, an attorney who was known to rely on sociological and psychological data. The strict constructionists considered him dangerous because of his activism. Even former President William Taft privately called Brandeis’ nomination “one of the deepest wounds that I have ever [suffered] as an American and a lover of the Constitution.”
What was unspoken? Brandeis was a Jew.
To President Wilson, who had appointed the first Jewish professor at Princeton and the first Jewish Justice to the New Jersey Supreme Court, Brandeis’ faith was immaterial. What was important for Wilson was to correct what he considered his biggest mistake as President – having nominated the rabidly conservative and anti-Semitic James McReynolds to the High Court two years earlier.
Throughout the 19th century, Supreme Court nominations were usually minimally controversial processes that resulted in voting the same day that the nominee was presented. This changed with Brandeis. The Eastern Establishment lined up petitions and testimonies to bemoan and denounce the Brandeis candidacy. The Senate Committee on the Judiciary announced an investigation of the many charges leveled against Brandeis; McAdoo shrewdly urged Brandeis to ask the committee to hold their hearings in public, as he figured most of the objections would fade in the light of day. In February 1916, more than forty witnesses – largely Boston Brahmins and those on the losing sides of cases that Brandeis had prosecuted – paraded before the Senate cloaking their prejudices in rhetoric about dishonorable and unprofessional conduct.
Supporters countered. Felix Frankfurter and Walter Lippmann defended Brandeis in the press, and former Harvard President Charles Eliot sent the committee a ringing endorsement, as did nine of Harvard Law School’s eleven law professors. The late Chief Justice Melville Fuller had once called Brandeis “the ablest man who ever appeared before the Supreme Court of the United States.” Speaking for himself, Wilson wrote, “I cannot speak too highly of his impartial, impersonal, orderly, and constructive mind, his rare analytical powers, his deep human sympathy, his profound acquaintance with the historical roots of our institutions and insight into their spirit, or of the many evidences he has given of being imbued to the very heart with our American ideals of economic conditions and of the way they bear upon the masses of the people.”
The four-month confirmation process was as brutal as any the country had ever seen. It also opened the doors for future examinations of judicial nominees, who were soon required to defend themselves in person before the Senate.
The arguments came down to a partisan vote in committee. The full Senate confirmed Brandeis in June 1916 by a vote of 47 to 22.
Brandeis went on to a distinguished career.
But whenever Brandeis spoke in judicial conference, Justice McReynolds was known simply to rise and leave the room. He went so far as to avoid official Court pictures because he did not want to be photographed with a Jew. And when Brandeis retired in 1939, leaving an enviable legacy of decisions behind him, he received the customary laudatory letter of thanks, signed by all of his colleagues.
All except one.
[Have an idea for a post topic? Want to be considered for a guest-author slot? Or better, perhaps you’d like to become a day-sponsor of this blog, and reach thousands of subscribers and Facebook fans? If so, please contact the Alienist at vadocdoc@outlook.com]