Edited by David Leonhardt

The Upshot

The Latest

Locations of Presidential TV Speeches Can Give Signals

President Kennedy speaking from the Oval Office in October 1962 on missiles in Cuba. Credit Associated Press

When the current Oval Office was built in 1934 under Franklin Roosevelt’s supervision, in an Art Deco style fashionable at the time, F.D.R. had no expectation that it would become a television theater for the president. (His single TV appearance was a speech opening the New York World’s Fair in Flushing Meadows, Queens, in 1939.) But that’s what it turned out to be.

Harry Truman gave the first TV speech from the White House in October 1947, on the European food crisis. Later he gave television speeches from his Oval Office desk (for instance, after firing General Douglas MacArthur in 1951, and in his farewell address before leaving office in 1953). Truman began a tradition in which presidents have been inclined to deliver some of their most important addresses into the TV camera from there — most memorably, John Kennedy on Oct. 22, 1962, revealing that there were Soviet missiles in Cuba and describing his response, and Richard Nixon on Aug. 8, 1974, resigning the presidency.

When the two Presidents Bush announced the start of Middle East wars, the Persian Gulf war in January 1991 and the war in Iraq in March 2003, they did it from that same spot. In the 1960s and early 1970s, it wasn’t hard for a president to get the three major television networks (which commanded the vast majority of the viewing audience) to broadcast a prime-time Oval Office speech. As he noted in his resignation speech, Nixon gave 37 Oval Office TV addresses during his five and a half years in office.

But since then, the TV audience has fractured into hundreds or thousands of channels, and the traditional networks have been far less willing to give a president carte blanche to interrupt their prime-time programming. In recent times, presidents have tended to reserve the Oval Office format for their most important TV statements, but not always.

Tonight, for example, when President Obama reveals his plan for dealing with ISIS, he will be speaking not from the Oval Office but from the state floor of the White House residence, which was rarely used for direct-to-camera presidential TV appearances before Lyndon Johnson moved his news conferences in 1964 from the State Department auditorium to the East Room of the White House.

The site chosen by Obama staffers for the president’s ISIS speech may signal their intention to make sure that this address does not generate unwanted echoes of certain historical moments. This was why, for instance, in October 2001, when President George W. Bush announced United States airstrikes that would inaugurate the American role in the war in Afghanistan, he did it from the Treaty Room in the residence, where, he informed his TV audience, “American presidents have worked for peace.”

One senior Bush administration official frankly explained that day to The New York Times, “We didn’t want the Oval, which would invite comparisons to his father at the start of the gulf war or Roosevelt at the start of World War II.”


Tim Cook introducing Apple Pay on Tuesday in Cupertino, Calif. The system would replace credit cards. Credit Jim Wilson/The New York Times

Apple Pay Tries to Solve a Problem That Really Isn’t a Problem

I recently bought a cup of coffee, but I did not have any cash handy. I used a credit card, and the result was a veritable dystopia that will surely haunt my sleep forever.

First, I had to reach into my back pocket and remove my leather wallet. Then I had to pick out a plastic card, taking care not to pull out my driver’s license or Metro fare card. Somehow I managed to succeed on the first try. Then I swiped my credit card on a device positioned near the cash register. (Should the magnetic strip face right or left? That was my horrific choice.) Then I returned the plastic card to my wallet and went on with my day, scarred yet unbroken. I understand my credit card company will be including the $2.25 I owe them for that coffee on some sort of invoice later in the month, the receipt of which will surely be yet another brutal reminder of the burdens of that day.

I kid, of course. Charging a cup of coffee or pretty much anything else is not a big deal. At most stores it is a remarkably seamless process, particularly now that most retailers have gotten out of the habit of requiring signatures for smaller purchases. But that’s not how Tim Cook sees it.

Mr. Cook, the Apple chief executive, introduced a new mobile payments service Tuesday as part of the company’s big product rollout. The idea is that instead of experiencing the misery of fishing around for a credit card, you put your phone up to a transponder and touch the screen, and your transaction is complete.

It’s a dangerous business to bet against Apple’s ability to make a product that you didn’t think you needed part of your daily life. But Apple Pay looks as if it may be one of those offerings that don’t live up to the company’s hype. It would seem that in Mr. Cook’s mind, a retail transaction today actually resembles the series of horrors described above. The core challenge Apple faces is that buying things with a credit card isn’t nearly as onerous a process as the company makes it out to be.

Mr. Cook showed a video at the product rollout of a woman burrowing in her purse for a credit card, navigating past a box of Tic Tacs — Tic Tacs! — and struggling to open her wallet in order to find her card, then being asked to show her driver’s license before completing the transaction. It had a lot in common, actually, with those infomercials in which actors manage to horribly bungle the most basic tasks until some new product solves a nonproblem.

Apple Pay does appear to be more secure than plastic credit cards. As Mr. Cook pointed out in the presentation, a credit card reveals all the information a thief needs to go on a shopping spree, whereas Apple Pay requires the purchaser’s fingerprint to run a charge. The catch for Apple Pay: the costs of fraud are borne not by consumers but by credit card issuers, and sometimes by retailers themselves. Just ask Target, and now Home Depot, both of which have faced huge data breaches and are paying the price.

So you can see how banks and retailers will be enthusiastic about switching to a more secure way of paying. Indeed, Apple has already lined up giant banks — including Bank of America, Chase and Wells Fargo — and giant retailers, including McDonald’s, Walgreens and Macy’s, to use the service.

In that sense, Apple Pay certainly has the potential to revolutionize how people buy goods. But security chips widely used in Europe are gradually becoming available in American credit cards, and the recent breaches are only making that transition more urgent for card issuers.

But the bigger question for Apple Pay is whether consumers find it handy enough to convert from credit and debit cards.

Despite the video of a fumbling, bumbling purchaser, the act of buying something with a credit card is pretty efficient. The same woman who fumbles and bumbles in retrieving credit cards from her wallet begins the “after” segment with her iPhone already in her hand. But surely she would have had to retrieve her phone from the same Tic-Tac-laden purse!

And whatever benefits buying items with a phone offers, consumers will also have to deal with some key disadvantages. Because using Apple Pay requires your fingerprint ID, you can’t do the equivalent of handing your credit card to a friend or family member and having them make a purchase for you. (No sending the intern to pick up coffee with instructions to put it on your card, for example.)

And if your phone battery dies or you spill something and your phone goes kaput, you could easily find yourself broke and with no way to get home unless you keep plastic cards in your wallet as a backup, which is exactly what Apple says it is trying to make unnecessary in the first place.

There’s a broader lesson for anyone trying to overhaul these purchasing mechanics, and it applies to Apple, to digital-payment companies like Square, and even to enthusiasts of the digital currency Bitcoin, who envision the cryptocurrency becoming central to electronic transactions of all sorts.

Capitalism has long made it easier to buy and sell things, with a centuries-long evolution from barter to metal coins to paper money to the credit card. And each of those steps represented a major advance in convenience over its predecessor. If Apple Pay or competing mobile payments products are to succeed, they’ll need to convince us they offer the same.


Sarah Palin speaking in 2012. Her assertion that President Obama’s health care plan included a “death panel” helped derail a proposal for Medicare to reimburse doctors for voluntary end-of-life consultations with patients. Credit Stephen Crowley/The New York Times

Can We Have a Fact-Based Conversation About End-of-Life Planning?

Dealing with health care needs at the end of life is a difficult but unavoidable issue in an aging society like ours, with rising health care costs. After a failed attempt to deal with the issue as part of the Affordable Care Act, it may be returning to the policy agenda. Can we avoid another catastrophic bout of misinformation?

The debate over end-of-life planning has largely been dormant since 2009, when the former Alaska governor Sarah Palin’s false claim that President Obama’s health care plan included a “death panel” spelled the end of a proposal for Medicare to reimburse doctors for voluntary end-of-life consultations with patients. The Obama administration briefly issued and then withdrew a regulation that would have added end-of-life consultation coverage to Medicare in early 2011, but is likely to revisit the issue after receiving a recommendation from an influential American Medical Association panel.

Unfortunately, the lesson from the “death panel” controversy is that this issue is vulnerable to demagoguery if it becomes linked to people’s partisanship or feelings about controversial political figures and issues. For example, after Ms. Palin’s comments became widely known, and other prominent Republicans began to echo her claims, the myth came to be deeply held among the public. People’s negative predispositions toward Mr. Obama or his plan overwhelmed their critical faculties.

In an analysis of 2009 polling data, for instance, I found that Republicans who thought they were well informed about Mr. Obama’s plan were more likely to be misinformed than those who said they didn’t know very much about it — the same pattern observed in 1993 for the myth that President Clinton’s health care plan would take away your choice of doctor.

The same biased reasoning processes make it difficult to undo these mistaken beliefs once they are entrenched. In a 2011 experiment, Jason Reifler, Peter Ubel and I found that corrective information about the “death panel” claim successfully reduced belief in the myth among people with neutral or warm feelings toward Ms. Palin who were less knowledgeable about politics.

However, we found that misperceptions did not decrease significantly among people with mixed or positive feelings toward Ms. Palin who were more knowledgeable about politics — the individuals who are best equipped cognitively to resist unwelcome information. In fact, exposure to corrective information attributed to “nonpartisan health care experts” actually increased belief in the myth among respondents who felt very warmly toward her — a finding that is consistent with the “backfire effect” Mr. Reifler and I found in prior research.

The “death panel” belief has persisted in the years since Ms. Palin’s comments. Though the wording of the question is imperfect, polling data from the Kaiser Family Foundation shows that approximately one-third of Americans still believe in the myth — a proportion that has remained relatively stable since 2010. (Similarly, a 2012 academic survey using different wording found that one in two Americans endorsed the myth and only about one in six knew with high certainty that it was false.) This persistence may be the reason that the Obama administration has avoided the issue until now.

Will things turn out differently this time? Support for covering voluntary end-of-life planning is actually remarkably strong across the political spectrum. In addition to the American Medical Association panel’s recommendation, both private insurers and states such as Colorado and Oregon are now offering coverage for these consultations. Even critics of President Obama’s health care plan such as National Review’s Wesley J. Smith and Senator Johnny Isakson, Republican of Georgia, are in favor of advance planning.

Given the strength of this bipartisan consensus, adding coverage for end-of-life planning might seem unlikely to attract significant opposition or revive previous misconceptions. But a risk-averse administration may still elect to dodge the issue given Mr. Obama’s weak approval ratings and precarious political standing. A lesson of 2009, after all, is that it only takes one ambitious critic to spark a conflagration.



More Posts

Plans that limit patients’ choices tend to be cheaper, but there has been concern they will restrict care. A study suggests this concern is overblown.

The polling in the state is suspect, but the data suggests that the G.O.P. has an advantage in a race that could swing the Senate.

About the Upshot

The Upshot presents news, analysis and data visualization about politics and policy. It will focus on the 2014 midterm elections, the state of the economy, upward mobility, health care and education, and occasionally visit sports and culture. The staff of journalists and outside contributors is led by David Leonhardt, a former Washington bureau chief and Pulitzer Prize winner for his columns about economics.
