The Flash, Apparently Official

Nov. 18, 2014

by Carl Hausman

On November 22, 1963, Walter Cronkite faced what was surely one of the most difficult and high-pressure decisions in the history of journalism.

Shots had been fired at U.S. president John F. Kennedy, and Cronkite had taken the lead on the story, reading a bulletin over a CBS News graphic at 1:40 P.M. EST (12:40 P.M. local Central time in Dallas), beating NBC by about a minute. ABC took to the airwaves with the story about a minute after that.

The bulletin that Cronkite read said that shots had been fired at the president’s motorcade and that the “first reports say that President Kennedy has been seriously wounded by this shooting.”

From that point on, CBS essentially “owned” the story. In addition to beating rivals to the air, the network had a logistical advantage: it had several experienced reporters and a good deal of equipment deployed near the motorcade and at the Dallas Trade Mart, Kennedy’s destination at the conclusion of the motorcade.

The morning papers had been printed already, and the process of printing and distributing an extra edition would take hours. Radio had large audiences in 1963, but television had supplanted it as the cool hearth where the family gathered, and people were drawn immediately to their TVs — in homes, offices, and even in crowds around stores that sold televisions.

Television still was not taken entirely seriously as a news medium, though. People tended not to believe much until it was set in type. But on this day, that was not an option.

The eyes of the nation were therefore on Cronkite. His ethical and pragmatic decisions over the next couple of minutes would be crucial for his career, the credibility of television news, and for the nation.

Was Kennedy dead? Cronkite surely wanted to be first with the story, but he also wanted to be right, so he refrained from making a pronouncement. He didn’t say that the president was dead until 2:27 P.M., and then did so in a cautious and tentative way — relaying that Dan Rather had confirmed that the president was dead, but that there had been no “official confirmation.”

Eleven minutes later, official confirmation came in a “flash” (in wire-service parlance, the most urgent level of news report, outranking “bulletin”) from the Associated Press. Cronkite intoned the most riveting and impactful report in broadcast history:

From Dallas, Texas, a flash, apparently official, President Kennedy died at 1 P.M. Central Standard Time, two o’clock Eastern Standard Time, some 38 minutes ago.

Why did Cronkite hold off? Other news organizations had already reported that the president was dead, albeit with varying degrees of certainty, citing second- or third-hand accounts of witnesses, including priests called in to administer last rites or bystanders who had overheard remarks made in the corridors of Parkland Hospital.

Part of the answer is obvious: Erroneously reporting that the president was dead would be a horrifying mistake. Still, CBS was in the news business, and the very nature of the business (as well as the linguistic derivation of the word) specifies that readers, listeners, and viewers be provided with the newest information. Viewers wanted to know and needed to know if the president had been killed.

The question became: How long could Cronkite sit on the news, and what sort of official confirmation would tip the scales to the point where he would confirm that the death was, at least, “apparently official”?

The tipping point was the flash from the Associated Press — then, as now, the news organization legendary for its accuracy, the news cooperative from which most of the business gets much of its news. Cronkite also weighed the fact that trusted, experienced colleagues on the ground had expressed their belief that the president had, indeed, died.

But there is a not-so-obvious reason why Cronkite held off. He was an experienced newsman (age 47 at the time of the assassination) and knew that initial chatter, traffic on the police scanner, and even the initial observations of reporters and officials are often confused and inaccurate.

I, like seemingly every other journalism educator and author of a certain age, wrote a book that included discussion of some incidents related to the assassination and the nature of truth-telling. I included a reflection by the New York Times’s Tom Wicker, who noted that some initial “facts” were wrong.

There was an official announcement soon after the president’s body had been taken to the hospital, made by a White House assistant press secretary, that the president was alive. Apparently some vestige of electricity in the president’s body had fooled monitoring equipment into detecting what looked like a heartbeat.

Bombarded with conflicting information and with none of the means by which a reporter of that era would normally check things out, Wicker followed his instincts and confirmed, later in the day, that the president was indeed dead.

Reporters who covered the assassination or who were peripherally involved with relaying reports from the scene have told me that Dallas that day was crackling with rumors and speculation, including reports that Lyndon Johnson had been killed and claims that the shooting had been an attack by the Soviet Union.

But wild speculation generally did not receive widespread circulation. And while there were many errors of fact and interpretation on November 22, 1963, and in the days that followed, the coverage of President Kennedy’s death is regarded as the signature event that helped television evolve from a medium that was not taken very seriously into the primary source of news for the U.S. public.

So, how would a similar scenario play out today?

Don’t worry; this is not going to be a sermon canonizing the good old days and vilifying the new media. Any serious student of the news business knows that there have been errors and transgressions throughout the history of the business and even “Uncle Walter” was no saint. In his early years as a reporter in Houston, for example, he had broken into a home to obtain a photo of a crime victim. (It turned out that he had burgled the wrong house, incidentally.)

Instead, I would simply like to point out how the ethical issues surrounding breaking news have been magnified by technology and by the increasing “reporting” by people who are not news professionals (and who have little to lose if they relay incorrect information). Moreover, the modern lack of a publication schedule or a newscast that airs only at 6:30 P.M. deprives even the most conscientious journalists of the time to confer, weigh, and evaluate information.

Over the past decade, there have been instances in which the rush to be first has led news organizations to get it wrong. In 2006, for example, misinterpretations of a string of cell-phone messages and announcements, and even a misunderstanding of what the pealing of distant church bells meant, led many in the mainstream media to report mistakenly that a dozen men had survived a mine disaster in Sago, West Virginia, when they had not.

In 2012, two major cable networks erroneously reported that the U.S. Supreme Court had struck down the individual mandate in the Affordable Care Act.

Social media force-fed erroneous reports about suspects in the 2013 Boston Marathon bombing through the internet and TV. One young man, falsely reported to be a suspect, actually turned himself in to police to clear his name and, ostensibly, to save himself from misguided vigilante justice.

Dr. Rushworth Kidder, founder of the Institute for Global Ethics (IGE), once told me that he was largely motivated to study and reform ethics after witnessing the way that technology, in his words, “leveraged” the impact of ethical malfeasance. He often referred to the 1986 Chernobyl disaster, in which technicians had defied protocol, ethics, and common sense by disabling safety shutoffs so they could continue with an experiment.

What would have been a relatively minor issue in a power-generating plant from a previous generation thus became a disaster.

Do social media and real-time news have the potential to create what we might call an Information Chernobyl? In the modern connected world, how do we set ethical standards for reporting information that has not been verified officially, especially in times of disaster or insurrection?

It’s tempting to say that “nothing should be reported until it has absolutely been proven true” but that is, of course, an impossibility. A news medium that sits on a story until every detail has been verified by official sources is not in the news business at all. It would perhaps more accurately be described as being in the business of writing history books, or even in the business of stenography, because official sources tend to release news in a way that benefits official sources.

What’s your view? If you had to write a code of ethics for news organizations, what would be in your section about truth-telling and verification in fast-moving, breaking events?


Carl Hausman is Professor of Journalism at Rowan University, the author of several books about media, and a commentator about the role of media and ethics in civic life.
