Bread and Circus

An online journal of culture


by Editors




Image from Teensy Weensy Book
Kristine Williams, 2009

“This is a book project I’m working on. Almost every piece has a cutout, or what I’m calling cliffhangers in this series.” –K. Williams

For more, visit the artist’s blog here.

© 2009 Kristine Williams. All Rights Reserved. Used with Permission.



The DIGNITY Series: A Scientific Explanation

by Editors

This is the seventh of a 10-part series examining the loss and possible recapture of dignity in our public lives and political discourse. (Read the series from the beginning here.)


By Stanley Baran

7. A Scientific Explanation

In 1954 two psychologists, Albert Hastorf and Hadley Cantril, produced what many consider the classic cognitive dissonance experiment, “They Saw a Game,” in which students from Dartmouth and Princeton recounted two widely different versions of a particularly brutal football game between the two schools. For Dartmouth students, Princeton’s players were the malefactors. For Princeton students, Dartmouth’s gridders were the evil-doers. “In brief, the data here indicate that there is no such ‘thing’ as a ‘game’ existing ‘out there’ in its own right which people merely ‘observe,’” wrote the researchers. “The game ‘exists’ for a person and is experienced by him only insofar as certain happenings have significances in terms of his purpose.” Many who study dissonance theory, however, identify a different piece of research as more instructive because it deals with our attitudes toward a more vulnerable class of people than those matriculating at exclusive Ivy League universities.

In November 1945, Harvard psychologists Gordon Allport and Leo Postman delivered a talk to the New York Academy of Sciences on the role of rumor in the just-ended World War. But theirs was not a study of historical happenstance. “At the present time,” said Dr. Allport, “there is reason to suppose that we may be headed for another critical period of rumor-mongering, since we anticipate sharp clashes between minority groups of Americans and majority groups during the coming years of social readjustment.” Through a series of clever experiments they demonstrated that rumors found their basis and longevity in people’s preexisting attitudes and beliefs. Allport and Postman’s most famous test involved a drawing of a confrontation on a passenger train. In it, two men, a well-dressed African-American and an overall-clad white man, are standing in the aisle. The white man holds a straight razor in one hand and points at the Black man with his other. The Black man’s hands are at his side. A description of the scene is passed from one who has seen the image to another who has not, and in something akin to the kids’ game of telephone, that person is asked to describe as “accurately as possible what you have just heard” to the next person, and he or she to the next, and so on, until the description has passed through six or seven retellings. The psychologists ran the experiment more than 40 times, using people from all walks of life. In a finding they themselves called “the most spectacular of all our assimilative distortions,” by the time the description had moved from actual viewing to final recounting, “in more than half of our experiments, a razor moves (in the retelling) from a white man’s hand to a Negro’s hand.”

Because “Black men are ‘supposed’ to carry razors, white men not,” as Allport and Postman explained, people reconfigured the “reality” of the drawing to reduce their psychological discomfort (their dissonance). They did this through selectively perceiving and remembering what they had heard, leading the researchers to conclude, “Each subject finds the outer stimulus-world far too hard to grasp and retain in its objective character. For his own personal uses, it must be recast to fit not only his span of comprehension and his span of retention, but, likewise, his own personal needs and interests. What was outer becomes inner; what was objective becomes subjective.” The I is preserved.

Contemporary psychologists have added to thinking on dissonance reduction by expanding people’s “personal needs and interests” to include their motivation to hold socially “correct” attitudes. Among the first to do so were social psychologists Richard Petty and John Cacioppo. In the 1980s they offered their Elaboration Likelihood Model of information processing, which, while accepting the idea that people may indeed want to be correct in the attitudes they hold, argues that some of us are more willing to find the correctness in those attitudes than are others; or, to use the first word in the theory’s name, people bring varying degrees of elaboration to the attitudes they hold. Those who take the central route of information processing are willing and able to engage in message elaboration, examining different facets of an idea, challenging evidence, questioning personally held assumptions. Those taking the peripheral route avoid elaborating on the information before them, relying instead on other cues in the information environment, cues which may or may not have anything to do with the issue at hand. All of us take the peripheral route at times, especially if the matter before us has little personal import. But we all have a more or less individually characteristic route that we are willing to take when processing information, especially information that potentially challenges our existing attitudes. Some of us are generally willing to scrutinize information; some of us habitually travel the peripheral route. Some of us are willing to dignify the words and ideas of others (even if after elaborating them we reject them); some of us simply retreat to the peripheral route.

Psychologists who believe that people want to a) hold the right opinion, b) reduce, or even better, eliminate dissonance, and c) not have to think about things too much, call this the heuristic model of information processing, the use of simple decision-making rules allowing people to deal with the world without much cognitive effort. A central route processor, for example, might be a long-time Republican who examined Barack Obama’s rhetoric, tested it against the candidate’s public service record, examined both in light of his or her own life-experience while adjusting for personal biases (years of commitment to the Republican Party, for example), and based on this scrutiny either did or did not vote for Obama. A peripheral route processor would rely on a heuristic—I’m a Democrat, he’s a Democrat, I’m voting for him; I’m an American, he has a Muslim name, I’m not voting for him.

Read Part 8: Leaving the Reality-Based Community (click here)


Stanley Baran is Professor of Communication at Bryant University. A Fulbright Scholar, he is the author of Introduction to Mass Communication: Media Literacy and Culture and Mass Communication Theories: Foundations, Ferment, and Future. He writes frequently on the media, popular culture, and our understanding of ourselves and our world. He will happily provide citations for this series’ quotations and statistics. Simply e-mail him at

Text copyright 2009 Stanley Baran


by Editors



Environmental Protection Agency photograph in the collection of the National Archives and Records Administration.


Vampire Movies (and Television) Worth Seeing

by Editors



By G. Arnold and Erin Dionne

With Halloween around the corner, it’s time to sit back and enjoy some of the viewing choices that the season has to offer. Among the many movie and television themes that are associated with this time of year is a perennial favorite: vampires.

Since two of our editors are fans of this genre, we decided to put together some viewing suggestions with a vampire theme. Here, in chronological order, are a few picks for the next time you want to spend some quality time with the undead.


Nosferatu (1922)

Nosferatu is a thinly disguised adaptation of Bram Stoker’s 1897 novel Dracula. Unfortunately, the producers didn’t have the rights to the book. Not long after the movie opened, Stoker’s estate sued. The courts ordered the destruction of existing prints, but Nosferatu had already taken flight. Many copies had already been distributed around the world, making it impossible to round up and torch every print. So Nosferatu lived on.

Quickly paced and entertaining throughout, Nosferatu is bolstered by an original look and the innovative use of then-new special effects. The vampire is Count Orlok, a strange and thoroughly undead creature. Looking inhuman — with pointy ears, rodent-like eyes, and hands that resemble claws — he is a far cry from the more elegant Draculas that appeared in later films. There’s little doubt about who the monster is in this film.

Nosferatu is an important piece of film history, but more than that, it’s still fun. Even if you seldom watch a silent-era movie, make an exception for Nosferatu. It’s must-see viewing for fans of vampire movies.


Dracula (1931)

Unlike Nosferatu, director Tod Browning’s Dracula is more directly (and legally) connected to Stoker’s novel. But the movies are very different in numerous ways. By 1931, the era of sound films had begun. And Dracula makes the most of this, capitalizing on the eerie and menacing voice of lead actor Bela Lugosi. Indeed, much of the film’s staying power can be attributed to Lugosi, whose iconic and strangely mesmerizing performance was the epitome of the Dracula character for generations. A more subtle monster than Nosferatu, Lugosi’s vampire has a decidedly exotic and aristocratic air — he’s like a foreign ambassador who just happens to be undead.

Browning’s version of Dracula exerted an enormous influence on most of the Dracula films that followed. Its impact has been so widespread, in fact, that viewers may be familiar with its take on the Dracula story even if they have not seen the original. So it can be hard for audiences today to see Dracula with fresh eyes. But it’s worth a second look. Dracula is an impressive film on its own merits.


Horror of Dracula (1958)

A generation after Browning’s version of Dracula, the British outfit Hammer Films issued director Terence Fisher’s new version of Stoker’s vampire story. Horror of Dracula (the movie’s title in the U.S.) is a stylish and engaging film, even if it is a rather low-budget affair. The movie’s energetic take on the classic story reinvigorated interest in the Dracula character, especially among enthusiasts of the horror genre. They appreciated the actors having fun with the roles. They also liked that there was more blood.

The film pits Count Dracula (portrayed by Christopher Lee, who went on to play the character several times) against the persistent Professor Van Helsing (Peter Cushing). The pairing of Lee and Cushing was a masterstroke and so popular that it was repeated several times. (A bit of trivia: Each actor appears in separate movies in the Star Wars series.)

There is little that is subtle about a typical Hammer film, and Horror of Dracula is no exception. But the brash directing and enthusiastic, twinkle-in-the-eye acting add an undercurrent of fun to what would otherwise appear to be a rather grim story.


Dark Shadows (TV series, 1966-1971; remake 1991)

In the late 1960s the vampire tradition got an unexpected jumpstart in the unlikely venue of an afternoon soap opera. The most popular storyline in ABC television’s Dark Shadows focused on Barnabas Collins (played by Jonathan Frid), whose sorrow and angst about being a vampire was an almost endearing character trait.

Not that the undead don’t also have issues in their love lives. Barnabas was a forlorn vampire and frequently at the center of love triangles. The whole series, which also delved into the world of werewolves, witches, ghosts, and all things supernatural, combined traditional soap-opera melodrama with a camp sensibility. The low-budget production values and limitations, brought on by the quick turnaround demanded of a show that needed five new episodes every week, add to its charm, if you’re in that frame of mind.

A popular and stylish remake of the series was produced in 1991. Released on DVD under the title Dark Shadows – The Revival, it stars Ben Cross as the vampire.

Dark Shadows has long maintained a cult following. Several sources indicate a re-working of the franchise in movie form may appear sometime soon.

Many episodes from the original series have been released in various collections on DVD. A movie version of the Barnabas Collins story — featuring the original cast of the ABC series — was released to theaters in 1971 as House of Dark Shadows. (At the time of this writing it appears to be unavailable on DVD.)


Nosferatu the Vampyre (1979)

Although Tod Browning’s vision shaped how directors approached the Dracula story for decades, in the late 1970s director Werner Herzog went back to an earlier source. His Nosferatu the Vampyre pays homage to director F.W. Murnau’s 1922 classic, updating the story and adding stunning visual design.

Nosferatu the Vampyre is much more than simply a remake, however. As a Village Voice reviewer said, it’s “a reconnection with German culture.” It’s also a reflective, moody film. In fact, critic Roger Ebert said the movie is “so slow it’s meditative at times.”

Hidden beneath layers of heavy make-up, Klaus Kinski offers a solid performance. Thankfully, he avoids the usual acting clichés for a vampire role. The performances of the rest of the cast — including Isabelle Adjani, Bruno Ganz, Roland Topor, and Walter Ladengast — are also commendable, but the movie is more about mood, atmosphere, and symbolism than character-focused narrative.

Herzog’s movie is driven by a strong personal vision. Taken on its own terms, it’s captivating viewing.

(Although the film is available with English subtitles, the original German soundtrack offers a richer experience.)


Salem’s Lot (1979 TV Miniseries)

Tobe Hooper directed this version of Stephen King’s chilling story about a Maine town that becomes overrun with nighttime bloodsuckers and the prodigal-son writer who returns to kill them. Nominated for three prime-time Emmys, it’s worth watching for the creepy-kitschy factor. Salem’s Lot stars David Soul, James Mason, and others.

[Salem’s Lot was remade for television in 2004, with Rob Lowe, Donald Sutherland, and Andre Braugher. This version ups the gore and has more convincing effects.]


The Lost Boys (1987)

In the 1980s, director Joel Schumacher’s The Lost Boys brought contemporary sex appeal to things that go bump in the night. Even the tagline was hot: “Sleep all day. Party all night. Never grow old. Never die. It’s fun to be a vampire.”

The story involves the little town of Santa Carla, which has one major problem: all the damn vampires. Watch new kids Corey Haim and Jason Patric as they get roped into the local cult of the undead. It’s a fun and entertaining ride, despite the trade paper Variety’s wet-blanket assessment that it’s “a horrifically dreadful vampire teensploitation entry … that daringly advances the theory that all those missing children pictured on garbage bags and milk cartons are actually the victims of bloodsucking bikers.” (Variety‘s review is here.)

Starring Jason Patric, Kiefer Sutherland, Coreys Haim and Feldman. See the trailer for The Lost Boys here.


Bram Stoker’s Dracula (1992)

With legendary director Francis Ford Coppola at the helm, Bram Stoker’s Dracula also boasts a stellar cast, including Gary Oldman, Winona Ryder, Anthony Hopkins, and Keanu Reeves. Lush and stylized, this take on Bram Stoker’s tale focuses on the love story between Mina Harker and the Count. Keanu Reeves, as Jonathan Harker, is as wooden as the stakes used to kill the vampires, but Anthony Hopkins and the movie’s amazing visual palette more than make up for it.

Indeed, this is a hard movie to pin down. A review in the Washington Post complained: “You can’t tell if this is a flawed masterpiece or an intricately designed bag of wind.” Still, there are more than enough elements in the movie to make it an essential part of anyone’s introduction to the genre.


Buffy the Vampire Slayer (television series 1997-2003)

After writing the script for director Fran Rubel Kuzui’s 1992 movie of the same name, Joss Whedon took his characters and their story to television and did what is seldom accomplished: He improved on the original. Indeed, television’s Buffy the Vampire Slayer — a more somber take on the same basic story — did more than follow the never-ending struggle of young vampire slayers against an army of the undead. It also spoke tellingly about the lives of American teenagers at the turn of the 21st century.

Whedon reportedly said the show was “high school as a horror movie.” But it’s engaging viewing no matter what you call it. Amassing a legion of fans, the series benefited from not only smart writing, but also a strong cast — especially Sarah Michelle Gellar in the leading role.


I Am Legend (2007)

Director Francis Lawrence’s 2007 reimagining is about as far from the original Richard Matheson book as you can get. In his adaptation of I Am Legend, a plague causes the world to succumb to a vampiric, zombie-like illness, and Will Smith is apparently the lone New Yorker immune—good thing he has his dog for company!

The relationship between Smith and Sam, his canine companion, is as touching as the vampires are evil. Have a box of tissues handy for this one!

[For a different take on Matheson’s story, check out The Last Man on Earth, the 1964 movie starring Vincent Price.]


Twilight (2008)

Ahh, first love…vampiric love. Much has been said about the sappy, silly aspects of this teen drama, but the film has some classic vampire moments: liberal gore, good special effects, and Lost Boys-esque sex appeal. No doubt this is part of the reason that Twilight, director Catherine Hardwicke’s film, is the most recent phenomenon in the vampire movie tradition. Think of Twilight as an appetizer to Coppola’s main course. With Kristen Stewart and Robert Pattinson.





Don’t see your favorite vampire movie? No problem. Send along your favorite by posting a comment.


G. Arnold & Erin Dionne are writers and editors of Bread and Circus Magazine.

Images (above): DVDs available from


The DIGNITY Series: Protecting Ourselves from Dignity’s Demands

by Editors

This is the sixth of a 10-part series examining the loss and possible recapture of dignity in our public lives and political discourse. (Read the series from the beginning here.)


By Stanley Baran

6. Protecting Ourselves from Dignity’s Demands

The Lynch deception should have been what Salon writer and former constitutional lawyer Glenn Greenwald calls a pitchfork moment, if not sending us into the streets in protest, at least generating the level of public outrage that accompanies an over-the-hill athlete’s steroid use or the discovery that a pop music group lip-synched its lone hit tune. What has happened to dignity is analogous to what former Senator Daniel Patrick Moynihan called “defining deviancy down.” Just as we have allowed the erosion of our commonly accepted civic standards for what constitutes criminal behavior, we have permitted the erosion of standards for what constitutes dignified behavior.

On the floor of the U.S. Senate, Vice President Dick Cheney tells Vermont’s Patrick Leahy, “Go f*** yourself,” in response to a question. It’s not undignified behavior, only the rough and tumble of politics. President Bush’s chief political aide, Karl Rove, orchestrates the outing of an undercover CIA agent, destroying her career and the spy network she spent 10 years building. It’s for our own good in the never-ending battle against Islamofascism. Mr. Rove now enjoys hefty income from Fox News and Newsweek. Secretary of Defense Donald Rumsfeld tells us that we went to war in Iraq in the spring of 2003 “with the army we had, not the army we might want or wish to have had at a later time.” We later learn that had we delayed long enough to properly equip, armor, and train that army and deploy it in numbers large enough for the mission, hundreds, if not thousands, of our brave men and women might not have died. Why the rush to invade? The President’s Chief of Staff Andrew Card explains that “from a marketing point of view, you don’t introduce new products in August.” Mr. Rumsfeld now enjoys his appointment as a fellow at the Hoover Institution at Stanford University and Mr. Card the glow of an honorary doctorate from the University of Massachusetts.

Displaying pictures of himself looking under the furniture in the Oval Office for the pesky weapons of mass destruction, the casus belli for that “new product,” George W. Bush joked to scores of appreciative reporters at the 2004 Radio and Television Correspondents’ Association black-tie dinner, “Nope, no weapons over there; maybe under here.” At that time, more than 500 U.S. men and women and countless Iraqi civilians had been killed. The journalists, most of whom had abetted the push to get that product to market, laughed. Those disgusting images of American soldiers humiliating prisoners at Baghdad’s Abu Ghraib prison that enraged the world? Radio talker Rush Limbaugh says no big deal, no worse than a fraternity prank, “no different than what happens at the Skull and Bones initiation,” later elaborating, “If you, really, if you look at these pictures. . .it looks just like anything you’d see Madonna, or Britney Spears do on stage.” Maybe, he wondered, he could “get an NEA grant for something like this.” Mr. Limbaugh, “the most listened to voice in American radio,” is paid $38 million a year to continue educating his 14 million daily listeners.

In 2005 Congress voted to make enhanced interrogation official U.S. policy, oblivious to the fact that “enhanced interrogation” is the verbatim translation of the Nazis’ euphemism for torture (Verschärfte Vernehmung). Around that same time a handful of economists began sounding largely ignored warnings of a looming economic crisis. Surveying these events, former Vice-President Al Gore asked in a speech, “Are we still routinely torturing helpless prisoners, and if so, does it feel right that we as American citizens are not outraged by the practice? And does it feel right to have no ongoing discussion of whether or not this abhorrent, medieval behavior is being carried out in the name of the American people? If the gap between rich and poor is widening steadily and economic stress is mounting for low-income families, why do we seem increasingly apathetic and lethargic in our role as citizens?”

Why, indeed? Yes, we loudly spoke our indignation on November 4 (if 53% to 46% can be considered loud), but why did we accept these and countless other indignations—assaults upon dignity—with so little complaint for so long? How could we look at events that so obviously betrayed what should have been our collective sense of national dignity with so little protest? As Thomas More reminds us in A Man for All Seasons, “Qui tacet consentire”: silence gives consent. To not protest is to acquiesce; to refuse to notice is undignified.

As long ago as 1928, George Bernard Shaw offered this explanation: “The moment we want to believe something, we suddenly see all the arguments for it, and become blind to the arguments against it.” Since then, psychologists have produced more formal explanations of how people maintain cognitive consistency, that is, how we ensure that our actions toward an issue are consistent with our attitudes toward it.

Most prevalent in our public discourse (although rarely if ever offered in terms of our willingness to suffer repeated indignations) is cognitive dissonance theory. The New York Times’ Michiko Kakutani relied on dissonance theory to explain the popularity of faux newsman Jon Stewart. “The Daily Show resonates not only because its keen sense of the absurd is perfectly attuned to an era in which cognitive dissonance has become a national epidemic. Indeed, Mr. Stewart’s frequent exclamation ‘Are you insane?!’ seems a fitting refrain for a post-M*A*S*H, post-Catch-22 reality, where the surreal and outrageous have become commonplace—an era kicked off by the wacko 2000 election standoff in Florida, rocked by the terrorist attacks of Sept. 11, and haunted by the fallout of a costly war waged on the premise of weapons of mass destruction that did not exist.” How do Americans maintain belief in the dignity of their nation when confronted by events suggesting they rethink that assessment? They reduce their psychological discomfort (dissonance) by “reconfiguring” the facts of those events. To preserve the self (the I in an I-It relationship) people see what they believe rather than believe what they see.

Read Part 7: A Scientific Explanation (click here)


Stanley Baran is Professor of Communication at Bryant University. A Fulbright Scholar, he is the author of Introduction to Mass Communication: Media Literacy and Culture and Mass Communication Theories: Foundations, Ferment, and Future. He writes frequently on the media, popular culture, and our understanding of ourselves and our world. He will happily provide citations for this series’ quotations and statistics. Simply e-mail him at

Text copyright 2009 Stanley Baran


Amidst the Echoes

by Editors



Amidst the Echoes:
The Story of Lidice, Czech Republic

Text and photographs by Jessica Miles

While studying abroad this past June in Prague, Czech Republic, I traveled outside the city to gain a better understanding of the effects of Nazism on the country. I arrived in Lidice on a hazy Saturday afternoon, looking out over the stunning green countryside, but I would soon learn of the village’s horrific past.

Photo 3 - (c) J Miles 2009

On June 10, 1942, one hundred seventy-three village men were shot in Lidice. The women and children were either gassed at Chelmno in Poland or sent to the Ravensbrück concentration camp. The Nazis then leveled the village, burning the church, the cemetery, and the homes of the 503 residents. In the aftermath of the Lidice invasion, only 143 women and 17 children survived.

This story lurked in my mind as I traveled outside of Prague on a weekend trip that highlighted the impact of the Nazi regime on the Czech people. I began in Lidice, a small mining village northwest of Prague that became the focus of Nazi retribution. Days after a vague link was found between the assassination of Deputy Reich Protector Reinhard Heydrich and a Lidice family, the Nazis invaded the village.

Photo 2 - (c) J Miles 2009

The assassination attempt was made on May 27, 1942, by two Czech soldiers, who hurled a grenade at Heydrich’s vehicle, causing an explosion and severely injuring Heydrich. He died eight days later. It was never proven whether the Horák family of Lidice was connected to the assassination of Reinhard Heydrich. However, the rage caused by the incident was released on the village on June 10, 1942, when the men were executed and the women and children were removed from Lidice. In all, 340 of the 503 villagers were killed, including 82 children who suffocated in an adapted truck filled with exhaust gas.

Today, visiting the grounds of Lidice is a somber experience. What was once a tiny village of five hundred is now a series of grim hills and valleys surrounded by beautiful foliage, accompanied only by echoes. Although no villagers remain in Lidice, the rebuilding of the village less than a mile from its original location, begun in 1948, represents the immense hope of the survivors and the significance of preserving the land where Lidice once stood.

Memorial photo (c) J Miles 2009

Peering through the swaying trees within the hills is the Children’s War Victims Monument. Created in 1969 by sculptor Marie Uchytilová, these three-dimensional bronze figures commemorate the 82 children of Lidice who lost their lives at the hands of the Nazis. The poignant figures stand isolated and helpless in nothing but wrinkled shirts, trousers and dresses as they overlook the valley of what was once their home.

The nearby Lidice Museum, dedicated to the lives of the Lidice families and the village itself, is a startling revelation of that fateful day. The octagonal structure hugs the crescent-shaped museum and stands amidst a stone courtyard. The exhibition portrays a prison-like atmosphere, dreadfully quiet and eerily cold. The concrete dividers, the open space, the dim lighting, and the historical visions projected on the walls of the exhibition create a bleak, yet intimate connection with the residents of Lidice and the torture they endured.

A timeline throughout the exhibition details the horrific acts that took place, as well as the aftermath of the Nazi raid. On each wall is a glimpse into what would become the future of the village, including actual footage of German tanks demolishing Lidice. Photos of the families before the invasion depict ordinary life, while visions of torment and death drape the walls as the story progresses. One wall even gives the children a voice; letters to their mothers, fathers, grandmothers and grandfathers are shown, conveying confusion, desperation, and even hope.

Around the final corner of the exhibition adult survivors tell their stories and the impact of that day on their lives in a moving film that brings the reality of the Lidice raid to the present. Their stories represent what is left of the Lidice inhabitants, as they struggle to find peace with their past. One survivor describes being ripped away from her mother by a German soldier and sent away on a bus, only to be stripped of everything but her undergarments. Another recalls the horror of being separated from her family as they were beaten before her eyes. And after escaping from what should have been his last days, a man remembers searching for his mother in the aftermath of the attack. She was never found.

The stories told in the words of the survivors paired with the vivid imagery of the invasion bring light to the importance of remembering Lidice beyond the attack. It is a story about the relentless hope and prolonged strength of the survivors and their quest to restore the village of Lidice and recreate the tranquil way of life that once existed. The Lidice Museum exhibition is an extraordinary portrayal of life before and after the Nazi invasion, as it is here that the villagers are given an individual identity, a name, and a legacy.


Jessica Miles is a Bread and Circus Magazine contributing writer.

(c) 2009 Jessica Miles

NEW VOICES is a Bread and Circus Magazine feature in which emerging writers share their views on aspects of contemporary culture.


The DIGNITY Series: War Stories

by Editors

This is the fifth of a 10-part series examining the loss and possible recapture of dignity in our public lives and political discourse. (Read the series from the beginning here.)


By Stanley Baran

5. War Stories: Knowing Dignity When We See It and When We Don’t

Physicians regularly debate the value of dignity as a guide for their work, often with little success. American medical ethicist Ruth Macklin, in an essay entitled “Dignity is a Useless Concept,” argued “in the absence of criteria that can enable us to know just when dignity is violated, the concept remains hopelessly vague.” Dignity, she concluded, “is nothing more than a capacity for rational thought and action.” Richard Horton, editor-in-chief of the British medical journal The Lancet, reached for a more instructive definition by combining the ideas of two philosophers, the eighteenth century’s Immanuel Kant and the contemporary Kant scholar Thomas Hill. Horton produced an argument for the I-Thou definition of dignity, writing, “Kant identified dignity as the absolute inner worth of a person, ‘by which he exacts respect for himself from all other rational beings in the world.’ Dignity and self-respect were instruments for asserting the equality of each person. . .Human dignity is an unconditional and incalculable value, admitting no trade-offs. Hill argues that the choices we make should be decided upon according to the view that no one is a mere means, that human dignity is priceless, and that our decisions can and must be made on the basis of mutual respect, seeing every human being as a source of value.” Nonetheless, Horton eventually had to admit defeat, conceding, “Human dignity is a linguistic currency that will buy a basketful of extraordinary meanings. It is not surprising, perhaps, that some critics describe dignity as a meaningless slogan.”

Still, one solution to defining dignity might reside in Supreme Court Justice Potter Stewart’s strategy for evaluating sexual media content: “I may not be able to come up with a definition of pornography, but I certainly know it when I see it.” So it may be with dignity—difficult to define, but we know it when we see it. We preserve the stories of Joan of Arc, Sir Thomas More, Anne Frank, and Oskar Schindler because they define for us lives lived in dignity. But can dignity be demonstrated only in extraordinary times and circumstances (the Hundred Years’ War, Henry VIII’s break from the Catholic Church, World War II and the Holocaust)? Thurber believed so. “That which is only sporadically realized can scarcely be called a characteristic,” he wrote. “It is impossible to think of it as innate; it could never be defined as normal. Nothing is more depressing than the realization that nobility, courage, mercy, and almost all the other virtues which go to make up the ideal of Human Dignity are, at their clearest and realest, the outgrowth of Man’s inhumanity to Man, the fruit of his unending intraspecific struggle. The pattern is easily traceable, from Christ to Cavell.” The story of Jesus Christ, divine proponent of the I-Thou life (“Do unto others as you would have them do unto you,” “Thou shalt love thy neighbor as thyself,” and “That which you do unto the least of mine you do unto me,” all from the Book of Matthew), is a familiar one; but who is Cavell?

A British nurse serving in Belgium during World War I, Edith Cavell is commemorated by a statue just off London’s Trafalgar Square. She helped more than 200 wounded Allied soldiers escape to neutral Holland from the Brussels hospital where their German captors had taken them. For this act she was imprisoned in solitary confinement for nine weeks and executed by firing squad. And since, as Aristotle wrote, “dignity consists not in possessing honors, but in the consciousness that we deserve them,” it’s proper that Edith Cavell both possesses (her statue) and deserves her honors.

Today we can recognize dignity in another, more contemporary armed conflict, the war in Iraq. One of its true heroes is a young woman who defines dignity not because she possesses honors (in fact, feeling others more deserving, she rejected them), but because she deserves them. Private Jessica Lynch’s story may or may not be familiar, although had she behaved as her superiors wished, she would now be a national heroine, famous and rich. Instead, Ms. Lynch chose dignity.

In the early days of the 2003 invasion of Iraq, Pvt. Lynch’s convoy was attacked in the town of Nasiriyah. Eleven of her comrades were killed. The Pentagon’s official story had the 19-year-old supply clerk wounded, emptying her weapon at the enemy. Knocked unconscious, she was captured, tortured, and sexually assaulted. Only a daring late-night raid by commandos representing all branches of our military freed her from her captors. Although there was no video or audio record of the attack in which she had been captured, it was America’s good fortune that the rescue was chronicled on green-tinged night-vision video. Almost immediately upon its official telling, non-U.S. media outlets challenged the military’s account. Nevertheless, the American press ran with the story of the “little girl Rambo from the hills of West Virginia who went down fighting.”

But Lynch herself was soon telling all who would listen—network news shows and Congressional committees—that this was all a lie. She never fired a shot; her gun had jammed. She had been treated kindly by the civilian Iraqi doctors and nurses who cared for her. Her armed rescuers faced no opposition and, in fact, turned back emissaries who had offered to bring Lynch to them. “I am still confused as to why they chose to lie and tried to make me a legend when the real heroics of my fellow soldiers that day were, in fact, legendary,” she said. In another interview she added, “That wasn’t me. I’m not about to take credit for something I didn’t do.” And what did Ms. Lynch do with her time in the public eye? After disavowing a network made-for-TV movie that hewed closely to the Pentagon’s disinformation, she convinced the television show Extreme Makeover: Home Edition to build a new house for a fallen comrade’s orphaned children.

Lynch’s reward for these acts of dignity? Hate mail instead of plaudits. Obscurity instead of fame. A seat at West Virginia University at Parkersburg instead of fortune. “I want people to remember me as being a soldier who went over there and did my job. Nothing special. I’m just a country girl at heart,” she said three years after her capture. How many of us remember her?

Read Part 6: Protecting Ourselves from Dignity’s Demands


Stanley Baran is Professor of Communication at Bryant University. A Fulbright Scholar, he is the author of Introduction to Mass Communication: Media Literacy and Culture and Mass Communication Theories: Foundations, Ferment, and Future. He writes frequently on the media, popular culture, and our understanding of ourselves and our world. He will happily provide citations for this series’ quotations and statistics. Simply e-mail him at

Image: Public domain photograph of Edith Cavell (Wikipedia)

Text copyright 2009 Stanley Baran


Are You Going to Eat That?

by Editors


Are You Going to Eat That?
Thoughts on “Freeganism” Today

By Judith Shimer

Freeganism, a mash-up of “free” and “veganism,” is a word I’ve really only encountered in slightly condescending newspaper articles. The rubbish-happy culture which adopted me in Scotland (and which thrives in parts of the States as well) doesn’t have a single name, and not everyone is vegan or vegetarian or a social activist. They share only a love of getting stuff for free — from the trash.

I was an instant convert. Suddenly I was proselytizing to everyone who didn’t already do it, including Argie, a games designer living in my hostel in London. He was predictably disgusted at first. “You mean, you look for food in people’s trash?” I could see the usual montage flashing through his head: meth addicts, a bitter night in December, half-eaten McDonald’s hamburgers.

“No no,” I assured him. “It isn’t people’s bins you want, it’s the supermarket dumpsters.”

There are many items in supermarkets that are considered unsellable after only a day — especially bread, even though it takes weeks to go stale in your fridge and can be stored for months in the freezer. Bread is abundant in supermarket skips (that’s regional for “dumpster,” making dumpster-diving “skipping,” which sounds more appealing anyway). And not only sliced white bread, but wholegrain bread, rolls, pastries, muffins, cupcakes, birthday cakes, scones, cookies, doughnuts, and these little pre-fried pancakes with maple syrup mixed in the batter, all neatly packaged and usually bundled safely together in fresh garbage bags.

Other skipping regulars include yogurt, juice, fruits and vegetables, prepackaged meals, and frozen meat. When you think about it, the obsessive perfection of the products on supermarket shelves—un-dented, crisp corners, airtight shrink wrap, and distant sell-by dates—requires massive amounts of waste. And all of the products unlikely to sell, less often because they’re rancid than because one edge of the wrapping is crushed, are doomed to the landfill.

“Damn,” said Argie. “I can’t wait ’til Waitrose closes.”

There’s no need to lecture on how many starving people could get fat on the things we throw out. But as if the waste weren’t depressing enough, consider the grocery stores that keep their dumpsters locked. You could make an argument for deterring vermin, but what about the British chain that puts blue food coloring in its skips? That sure isn’t to dissuade raccoons. (It doesn’t dissuade some humans, either; our kitchen had electric blue smears all over.) The fact is, supermarkets don’t want their reputations spoiled by folks who look poor or who electively excuse themselves from certain social etiquette.

As with everything we consume, edibility and safety aren’t the only factors in choosing what we eat. Convenience also counts—I get that squeamish types might not want to climb into a dumpster. On the other hand, when I joined a group of students at Montserrat College of Art organizing a “Food Not Bombs” giveaway of meals cooked with rescued food, I was baffled by the number of people who opted to buy lunch at the restaurant down the street rather than eat something hot, delicious, convenient, and free—sanctioned by the health department, and requiring absolutely no flies, no strange runny substances on shoes, and no one yelling at you to get out of the bins.

I didn’t understand. Why pay money when you don’t have to? The answer: We don’t buy food because we have to. We buy food because it’s our privilege. And if you surrender that privilege for a free meal, you may suffer from homelessness, weirdness or socialism.

My flatmate Scoutt found a dumpster key and was overjoyed at finally getting into the Costcutter bin down the street. This Costcutter seems to throw out a lot of only one thing at a time—all sliced ham once, all Smirnoff Ice and vodka Irn Bru another—but that first night, Scoutt, Joey, and I found six liters of orange juice.

Our glee was only a little dampened by the pub-crawler who halted at the side-street entrance and stared. “You’re in the bins,” he muttered, unable to believe his eyes.

“Yes, want some OJ?” said Scoutt, making to toss him a carton.

He just continued to stare. “You’re in the bins!” he said again, louder. “Freaks!” And then he walked off.

The three of us looked at each other and shrugged, pitying the fool who will waste hundreds of quid on orange juice in his lifetime.

For more information on Food Not Bombs, the alarmingly benevolent international anarchist free food organization, go to


Judith Shimer, contributing writer, fronts Ohio indie rock band The Alphabet.

NEW VOICES is a Bread and Circus Magazine feature in which emerging writers share their views on aspects of contemporary culture.



The DIGNITY Series: Dismissed Warnings

by Editors

This is the fourth of a 10-part series examining the loss and possible recapture of dignity in our public lives and political discourse. (Read the series from the beginning here.)


By Stanley Baran

4. Dismissed Warnings

More than a quarter century ago Jimmy Carter, who since leaving the Presidency in 1981 has lived his life in service to others, confronted Americans’ collective loss of dignity, our “growing doubt about the meaning of our own lives” and “loss of a unity of purpose for our nation.” He reminded his fellow citizens that, “In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God, too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we’ve discovered that owning things and consuming things does not satisfy our longing for meaning. We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.” Rather than accept his call for a return to our better angels, we recoiled from the “scolder-in-chief,” rejecting his characterization of a nation in “malaise” (a word he never used), and we booted him from office in favor of Ronald Reagan’s morning in America, complete with its tripling of the federal budget deficit and the Gipper’s conviction that the growing hordes of homeless people, 600,000 on any one night and 1.2 million over the course of a year, “make it their own choice for staying out there.”

President Carter’s contemporary, Polish-born American rabbi and civil rights activist Abraham Heschel, also worried about the disappearance of dignity under the avalanche of stuff. He believed that the Hebrew Prophets, in whose voices “the word of God reverberated,” taught that “self-respect is the fruit of discipline; the sense of dignity grows with the ability to say no to oneself.” Of the prophets he said, “The Prophet is an iconoclast, challenging the apparently holy, revered, and awesome. Beliefs cherished as certainties, institutions endowed with supreme sanctity, he exposes as scandalous pretensions.” People today who argue the merits of self-denial are the ones deemed scandalous, occupying the fringes of culture and discourse.

We see this in the until recently unquestioned truth of how our economy is supposed to work, in our cherished certainty in that holy, revered, and awesome institution endowed with supreme sanctity, “the market.” Nobel laureate economist Milton Friedman wrote, “So the question is, do corporate executives, provided they stay within the law, have responsibilities in their business activities other than to make as much money for their stockholders as possible? And my answer to that is, no, they do not.” This child of Hungarian immigrants, graduate of public high school and state-supported college, the man The Economist called “the most influential economist of the second half of the 20th century…possibly of all of it,” recommended the eradication of Medicare, welfare, the postal system, Social Security, and public education. He said that “there is no poverty in America.”

We see it in George W. Bush’s national call to action after the attacks of September 11, 2001: restore trust in the economy, go shopping, “Get down to Disney World in Florida. Take your families and enjoy life, the way we want it to be enjoyed.”

And we see it in the 1987 movie Wall Street, in which insider-trading, rapacious capitalist Gordon Gekko (played by Michael Douglas, whose performance earned him an Academy Award for Best Actor) insisted that “greed is good.” Writer/director Oliver Stone had intended Gekko to repulse audiences. Instead, Americans loved him. In October 2008, at the height of this country’s financial meltdown, the Providence Journal, the flagship newspaper of the state that had just that week surpassed Michigan for the nation’s highest unemployment rate, editorialized using Gekko’s mantra, “Greed is Still Good.”


Read Part 5: War Stories: Knowing Dignity When We See It and When We Don’t


Stanley Baran is Professor of Communication at Bryant University. A Fulbright Scholar, he is the author of Introduction to Mass Communication: Media Literacy and Culture and Mass Communication Theories: Foundations, Ferment, and Future. He writes frequently on the media, popular culture, and our understanding of ourselves and our world. He will happily provide citations for this series’ quotations and statistics. Simply e-mail him at

Text copyright 2009 Stanley Baran


Poetry: The Confessional

by Editors


The Confessional

By January Gill O’Neil

For more than a year, I’ve posted a series on my blog called Confession Tuesday. I wanted to dig deep and really discuss the small things, poetic and nonpoetic, happening in my life. Somehow it caught on, and I’ve kept it going as a regular feature on my Poet Mom blog.

Poets have a keen sense of mining deep into their everyday lives for material for their poems. When I consider the personal as subject matter for our work, I think of the opening lines of Stephen Dunn’s poem “The Routine Things Around the House”:

When Mother died
I thought: now I’ll have a death poem.
That was unforgivable
yet I’ve since forgiven myself
as sons are able to do
who’ve been loved by their mothers.

Every poem is a confession.

Through the years, however, confessional poetry has received a bad rap. While the Beats were reinventing language in the late ’50s and early ’60s, poets such as Robert Lowell, Sylvia Plath, Anne Sexton, and W.D. Snodgrass removed all poetic artifice to reach a more personal, intimate level of verse. By its nature, poetry is personal. Yet the word “confessional” has always been code for “women’s poetry,” as if poetry by, for, and about women were any less valid, dynamic, or revelatory. Simply not true. When it’s done right and done well, confessional poetry is personal and universal, speaking to the broader spectrum of the human condition.

Under the header of Confession Tuesday, I’ve admitted how my desire to write often overrides my daily duties, such as work, family, and household chores. I’ve discovered that these posts are a tool for working out poems before I get to the page. My confessional has also given me license to praise or rant about topics important to me in the poetry community. For instance, I’ve posted about the myth of work-life balance for a writer (read: there’s no such thing; you just write, then deal with the rest). I’ve discussed, ad nauseam, how much I want to be U.S. Poet Laureate someday (it could happen!). And recently, I came clean about how I feel poets should market their poetry, which is taboo in most (academic) poetry circles.

Through the process of “confessing,” I have been able to work out issues before I get to the page, leaving me available to navigate the open waters of thought.

As one who writes in the confessional vein, I understand that to keep my work fresh and interesting, I must strive for clear, crisp language that expands upon my point of view. But there’s also another aspect I can’t neglect. Admittedly, since we’re talking about confessions, it’s just fun to let loose! A confession is an open invitation to say what’s really on your mind in a safe space.

So, consider Bread and Circus a safe space. This is your chance to let loose. What are your poetry confessions? What are your poetry likes and dislikes? Tell us something that you wouldn’t normally say in polite poetry circles. I bet you’ll find that what might seem outlandish or trite to you is more universal than you think.


January Gill O’Neil’s first book of poems, Underlife, will be published in November 2009 by CavanKerry Press. Visit her at the Poet Mom blog.

