TIME ebola

Why Ebola Isn’t Really a Threat to the U.S.

Ebola is unlikely to spread within the United States

Give us this—when Americans overreact, we do it all the way. Over the past week, in response to fears of Ebola, parents in Mississippi pulled their children out of a middle school after finding out that its principal had traveled to Zambia—a nation that is in Africa, but one that hasn’t recorded a single Ebola case. A college sent rejection notices to some applicants from Nigeria because the school wouldn’t accept “international students from countries with confirmed Ebola cases”—even though Nigeria has had fewer than 20 confirmed cases and its outbreak is effectively over.

The American public is following its leaders, who’ve come down with a bad case of Ebola hysteria. That’s how you get even-tempered politicians like New York Governor Andrew Cuomo musing that the U.S. should “seriously consider” a travel ban on West African countries hit by Ebola, while some of his less restrained colleagues raise the incredibly far-fetched possibility of a terrorist group intentionally sending Ebola-infected refugees into the U.S. It’s little surprise that a Washington Post/ABC News poll found that two-thirds of Americans are concerned about an Ebola outbreak in the U.S.

They shouldn’t be—and two events that happened on Monday show why. WHO officials declared Nigeria officially “Ebola-free.” And in Dallas, the first wave of people being monitored because they had direct contact with Thomas Eric Duncan, the first Ebola patient diagnosed in the U.S., was declared free of the disease.

Nigeria matters because the nation is Africa’s most populous, with 160 million people. Its main city, Lagos, is a sprawling, densely populated metropolis of more than 20 million. Nigeria’s public health system is far from the best in the world. Epidemiologists have nightmares about Ebola spreading unchecked in a city like Lagos, where there’s enough human tinder to burn indefinitely.

Yet after a few cases connected to Patrick Sawyer, the Liberian-American traveler who carried the virus into Lagos in July, Nigeria managed to stop Ebola’s spread thanks to solid preparation before the first case, a quick move to declare an emergency, and good management of public anxiety. A country with a per-capita GDP of $2,700—roughly one-nineteenth the U.S. figure—proved it could handle Ebola. As Dr. Faisal Shuaib of Nigeria’s Ebola Emergency Operation Center told TIME: “There is no alternative to preparedness.”

But Nigeria’s success was also a reminder of this basic fact: If caught in time, Ebola is not that difficult to control, largely because it remains very difficult to transmit outside a hospital. For all the panic in the U.S. over Ebola, there has yet to be a case transmitted in the community. The fact that two health workers who cared for Duncan contracted the disease demonstrates that something was wrong with the treatment protocol put out by the Centers for Disease Control and Prevention (CDC)—something CDC Director Dr. Tom Frieden has essentially admitted—and may indicate that the way an Ebola patient is cared for in a developed-world hospital actually puts doctors and nurses at greater risk.

“You do things that are much more aggressive with patients: intubation, hemodialysis,” National Institute of Allergy and Infectious Diseases head Dr. Anthony Fauci said on CBS’s Face the Nation on Sunday. “The exposure level is a bit different, particularly because you’re keeping patients alive longer.” But now that U.S. health officials understand that additional threat, there should be less risk of further infection from the two nurses who contracted Ebola from Duncan—both of whom are being treated in specialized hospitals.

Even the risk of another Duncan doesn’t seem high. For all the demand to ban commercial travel to and from Ebola-hit West Africa, the region is barely connected to the U.S. in any case. Only about 150 people from that area of Africa come to the U.S. every day—fewer than would fill a single Boeing 757—and many airlines have already stopped flying. But there have been relatively few spillover cases even in African countries that are much closer and more connected to Guinea, Sierra Leone and Liberia. Besides Nigeria, only Senegal has had cases connected to the West African outbreak—and that nation was declared Ebola-free today as well. (There have been cases in the Democratic Republic of Congo, but that’s considered a separate outbreak.) The worst Ebola outbreak ever is raging in three very poor nations—but it seems unable to establish itself anywhere else.

None of this is to deny the scale of the challenge facing Guinea, Sierra Leone and Liberia, where Ebola has fully taken hold and the disease is still outpacing our efforts to stop it. But West Africa is where our fear and our efforts should be focused—not at home, where Ebola is one thing most of us really don’t have to worry about.

TIME vaccines

Very Good and Very Bad News in the Vaccine Wars

Just say yes—but too many Americans say no to vaccines Steve Debenport—Getty Images

Like any trench war, the fight to protect America's kids against disease is proceeding only inch by inch. A new report shows why there's reason for hope—and reason for worry

It’s just as well that no one knows the names of the 17,253 sets of parents in California who have opted not to have their children vaccinated, citing “philosophic” reasons for declining the shots. The same is true of the anonymous 3,097 in Colorado who have made the same choice—giving their far smaller state the dubious distinction of being dead last among the 50 states and the District of Columbia in the simple business of protecting their children against disease.

On the other hand, kudos to you, Mississippi, for finishing number one—with an overall kindergartener vaccination rate in the past school year of 99.7%—and to you, Louisiana, Texas and Utah, for finishing not far behind. Your children, by this measure at least, are the safest and healthiest in the country.

These and other findings were part of the alternately reassuring and deeply disturbing survey from the CDC’s Morbidity and Mortality Weekly Report (MMWR), looking at vaccination coverage for more than 4.25 million kindergarteners and the opt-out rates for more than 3.9 million in the 2013-2014 school year.

The report’s top line number seems encouraging. The national compliance rate for the three major vaccines covered in the survey ranged from 93.3% (for chicken pox) to 94.7% (measles, mumps, rubella, or MMR) to 95% (diphtheria, tetanus, pertussis).

But even those numbers don’t mean America has aced the test. Vaccination rates generally need to reach anywhere from 90% to 95%, depending on how contagious the disease is, to maintain herd immunity—the protection afforded by vaccinated people to those few who can’t be vaccinated, by giving the virus too few ways to body-surf its way across a population until it finds someone who’s vulnerable. So while a 90% vaccination rate might look like an A, it in fact may be no better than a middling C.
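That threshold isn’t arbitrary; it falls out of a standard epidemiological rule of thumb the report doesn’t spell out: if each case in a fully susceptible population infects R0 others on average, transmission fizzles once more than 1 − 1/R0 of the population is immune. A minimal sketch, using textbook R0 estimates rather than anything from the MMWR survey:

```python
# Herd-immunity threshold: the immune fraction above which each case
# infects fewer than one new person on average, so outbreaks die out.
def herd_immunity_threshold(r0: float) -> float:
    return 1.0 - 1.0 / r0

# The basic reproduction numbers below are textbook estimates
# (our assumptions, not figures from the CDC survey).
for disease, r0 in [("measles", 15.0), ("pertussis", 14.0), ("chicken pox", 7.0)]:
    print(f"{disease}: R0 ~ {r0:g} -> need {herd_immunity_threshold(r0):.0%} immune")
# measles: R0 ~ 15 -> need 93% immune (hence the ~95% vaccination target)
```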

And in some parts of the country, the numbers are much, much worse. As I reported in TIME’s Oct. 6 issue, vaccination refusal tends to be a phenomenon of the wealthier, better educated, politically bluer parts of the country—the northeast, the Pacific coast and pockets around major universities. Those are communities in which folks know just enough to convince themselves that they know it all—which means they know better than the doctors, scientists and other members of the medical community at large, who have overwhelmingly shown that vaccines are safe and effective.

That’s part of the reason New York City’s elite private schools have vaccination rates far lower than the city’s public schools, and why, according to a shocking story in the Hollywood Reporter, some schools in the wealthier neighborhoods of Los Angeles have lower vaccination rates than South Sudan.

Digging deeper into the MMWR report, there are other, broader causes for worry. There are the 26 states plus the District of Columbia that don’t meet the Department of Health and Human Services’ guidelines of 95% coverage for the MMR vaccine. There are the 37 states that don’t even meet the CDC’s standards for properly gathering data on vaccination rates in the first place. And there are the 11 states with opt-out rates of 4% or higher.

The anti-vaccine crowd frames the refusers as part of a brave vanguard of parents who won’t be hectored into getting their children protections that they, the parents, have decided are useless or dangerous. But it’s worth remembering what the world looked like in the era before vaccines. And you don’t have to work too hard to do that, because you know what it looked like? It looked like West Africa today, where people are being infected with the Ebola virus at a rate of 1,000 new cases per week—on target to be 10,000 by December—where entire families and indeed entire villages are dying agonizing deaths, and where whole populations would line up by the millions for the protection a vaccine would offer.

Vaccine refusal is increasingly the indulgence of the privileged. And it is, as the Ebola crisis shows, the indulgence of the foolish, too.

TIME Video Games

Fixing What’s Wrong With Gamergate Starts With You

Whatever you think about games, game journalism or recent critiques of the way video games treat women, you have an obligation to be respectful in debates, and it's a shame we still have to say that.

This is how far we have to go: the Entertainment Software Association, a U.S. video game trade association and sometime D.C. lobbyist group, is now having to remind us that threatening to do violent harm to someone is the opposite of okay.

“Threats of violence and harassment are wrong,” an ESA spokesperson told the Washington Post Wednesday. “They have to stop. There is no place in the video game community—or our society—for personal attacks and threats.”

Read those words again, slowly, because they are a measure of the distance that remains between right here and now, and the point at which our species practices general civility in all its forms of communication, where human beings can depend on each other not to be cruel, condescending, vicious and in some instances even homicidally hostile over cultural disagreements. It should be as shocking as some of these threats that in 2014, someone has to utter the words “harassment is wrong.”

And yet at least three women who work in the games industry have had to temporarily leave their homes after being threatened with horrific acts of violence, simply because they said something someone else found disagreeable. Critic Anita Sarkeesian, known for her video series deconstructing female tropes in video games, just canceled her appearance at Utah State University after someone threatened “the deadliest shooting in American history” if she was allowed to speak. (The university deemed the presentation safe to proceed after consulting with local law enforcement, but can you blame anyone so threatened?)

The locus of all this animus in recent months is a so-called movement known as “Gamergate,” another neologistic slogan born of the infamous 1970s political scandal whose “-gate” suffix has circumnavigated space-time, still lazily glommed onto any vague controversy and rallied behind. Like the Tea Party, Gamergate may have been forged with something like an original central purpose: in its case, ostensibly reforming perceived corruption in “games journalism.” But as some of its supporters began violently threatening women who wrote about the topic, it quickly snowballed into something far messier and more treacherous: a perplexing mass of conflicting idea-vectors, vitriol-filled social media assaults and cascades of general thuggery across online forums.

In a recent Salon article celebrating Richard Dawkins’ slight backpedaling on religion, the site references an interview with the evolutionary biologist, in which Dawkins says “There is a kind of pseudo-tribalism which uses religion as a label.” He’s talking about The Islamic State of Iraq and Greater Syria (ISIS), reacting to a question about what could motivate a group to acts of utter barbarism like the beheadings for which ISIS is now infamous.

“Pseudo-tribalism” summarizes nicely. Swap “religion” for “Gamergate,” specifically for those using the term to denigrate and terrorize women, and you have the analogue. That well-meaning proponents of Gamergate have utterly failed to wrangle the slogan back from these bomb-throwers means it’s time to abandon it, to find a better way to prosecute concerns about journalistic corruption, and to wade civilly into the intellectual debate about female tropes in games.

Whatever you think of Sarkeesian’s thoughts on games and those tropes—and it should go without saying that there is room for civil debate about any critic’s thoughts on anything—there’s no room in such a debate for harassment, libel, slander, rape threats, death threats, posting intimate photos of someone without consent, outing their geographic location to intimidate them and so forth. Harassment is not debate. Harassment ends debates. It’s antithetical to dialogue, and, assuming you’re not so aberrant or sociopathic that you can’t tell the difference, isn’t meaningful dialogue what you’re after?

This is how you change the debate, and it has to happen before dialogue starts, before you even get to the level of worrying about semantic contentiousness over whether the label “gamer” is forever stultified. In logic debates, there’s a thing known as the ad hominem fallacy. Ad hominem is Latin for “to the person.” It means to attack someone personally–and irrelevantly–instead of debating the actual idea or claim or argument. The litmus test is this: after you’ve typed out your comment or message board post or social media screed, does it commit this fallacy? If so, that’s what the delete button’s for.

If you don’t care about respecting someone else’s right to disagree with you, if all you want is to cause harm for some twisted sense of catharsis, what can I say but that you’re doing something that’s the opposite of noble, the opposite of productive, the opposite of moving the ball down the field in whatever direction you think is important–and when you escalate harassment to the level of violence, it’s the very definition of psychopathic.

What I find most depressing about any of this isn’t the state of journalism (it’s hardly just “games journalism,” folks) or what men think about women and women about men. It’s that as human beings in 2014, we still think it’s okay to pick up a keyboard or tablet or phone, venture to someone else’s online space, pull out our weaponized words, and open fire.

TIME Health Care

The Price of Staying Alive For the Next 3 Hours

Stayin' alive—and cheap at the price ZU_09—Getty Images

A new study suggests a little spending now can buy you a lot of time later

How much do you reckon you’d pay not to be dead three hours from now? That probably depends. If you’re 25 and healthy, a whole lot. If you’re 95 and sickly, maybe not so much. But for people in one part of the world—the former East Germany—the cost has been figured out, and it’s surprisingly cheap: three hours of life will set you back (or your government, really) just one euro, or a little below a buck-thirty at current exchange rates.

That’s the conclusion of a new study out of Germany’s Max Planck Institute, and it says a lot about the power of a little bit of money now to save a lot of suffering later—with implications for all manner of public health challenges, including the current Ebola crisis.

The new findings are a result of one of the greatest real-time longitudinal studies ever conducted, one that began the moment the Berlin Wall fell, on Nov. 9, 1989. Before that year, there were two Germanys not just politically but epidemiologically. Life expectancy in the western half of the country was 76 years; in the poorer, sicker east, it was 73.5. But after unification began, social spending in the east began rising, from the equivalent of €2,100 per person per year to €5,100 by the year 2000. In that same period, the difference in lifespan across the old divide went in the opposite direction, shrinking from 2.5 years to just one year as the east Germans gained more time. Crunch those numbers and you get the three extra hours of life per person per euro per year.
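A back-of-the-envelope version of that crunching, under the crude assumption (ours, not the study’s) that the full €3,000 rise in annual spending gets credit for the full 1.5-year narrowing of the gap:

```python
# Rough sanity check on the hours-per-euro headline figure.
hours_per_year = 365.25 * 24                 # ~8,766 hours in a year
lifespan_gain_hours = 1.5 * hours_per_year   # gap shrank from 2.5 to 1.0 years
spending_rise = 5100 - 2100                  # euros per person per year
print(lifespan_gain_hours / spending_rise)   # ~4.4 hours per euro of annual spending
# Same ballpark as the study's ~3 hours; the published figure accounts for
# when each euro was spent, which this crude division ignores.
```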

“Without the pension payments of citizens in east and west converging to equivalent levels,” said Max Planck demographer Tobias Vogt in a statement, “the gap in life expectancy could not have been closed.” Increased public spending, Vogt adds, is often framed as an unfortunate knock-on effect of longer life. “But in contrast,” he says, “our analysis shows that public spending can also be seen as an investment in longer life.”

The idea that generous, tactical spending now can be both a money-saver and lifesaver is one that health policy experts tirelessly make—and that people in charge of approving the budgets too often ignore. Bill Gates often makes the point that $1 billion spent to eradicate polio over the next few years will save $50 billion over the next 20 years, not just because there will no longer be any cases of the disease to treat, but because the global vaccination programs which are necessary just to contain the virus can be stopped altogether when that virus is no more.

As TIME reported in September, British inventor Marc Koska made a splash at the TEDMED conference in Washington, D.C., when he unveiled his K1 syringe—an auto-destruct needle that locks after it’s used just once and breaks if too much force is used to pull the plunger back out. That prevents needle re-use—which not only keeps blood-borne pathogens from being spread, it does so at a saving. According to the World Health Organization (WHO), $1 spent on K1 syringes saves $14.57 in health care costs down the line—or roughly $280 for a $20 order of the shots.

All across the health care spectrum, such leveraging is possible. Critics of the Affordable Care Act have slammed the law for the cost of the preventative services it provides, and while it’s way too early to determine exactly how successful the law will be, the encouraging stabilization in the growth of health costs suggests that something, at least, is working.

Global health officials are making a similar, though more urgent, preventative argument concerning the Ebola epidemic in West Africa. Americans are rightly jumpy over the few cases that have landed on our shores, but the 1,000 new infections per week that are occurring in the hot-spot nations of Liberia, Guinea and Sierra Leone make our concerns look small. Frighteningly, according to the WHO’s newest projections, that figure will explode to 10,000 cases per week by December if the resources are not deployed to contain the epidemic fast.

“We either stop Ebola now,” the U.N.’s Anthony Banbury, head of the organization’s Ebola emergency response mission, said in a stark presentation to the U.N. Security Council on Oct. 14, “or we face an entirely unprecedented situation for which we do not have a plan.”

Suiting up and wading into the Ebola infection zone is a decidedly bigger and scarier deal than spending an extra euro on public health or an extra dollar for a new syringe. But the larger idea of intervention today preventing far larger suffering tomorrow remains one of medicine’s enduring truths. We lose sight of it at our peril.

TIME

The Chicken Littles Were Wrong. But Americans Still Think the Sky Is Falling.

The list of false prophecies of doom by Obama's critics is long.

The U.S. economy has added jobs for 55 consecutive months, bringing unemployment below 6 percent. The budget deficit has fallen from $1.2 trillion when President Obama took office to less than $500 billion today, from an unsustainable 10 percent of GDP to a relatively stable 3 percent. More than 10 million Americans have gained health insurance through Obamacare, while medical costs are growing at their lowest rate in decades. Gasoline prices are gradually dropping. Medicare’s finances are dramatically improving.

The sky, in other words, is not falling. On the contrary, things keep getting better. Which means a lot of people have a lot of explaining to do.

To recognize that America is doing better is not to suggest that America is doing great. Wages are too low. Washington is dysfunctional. There’s too much depressing news about Ebola, gridlock and our perpetual conflicts abroad. But the doomsayers of the Obama era ought to admit their predictions were wrong. There has been no hyperinflation, no double-dip recession, no Greece-style debt crisis, no $5-a-gallon gas, no rolling blackouts, no “insurance death spiral.” Despite “job-killing tax hikes” and “job-killing regulations” and “job-killing uncertainty” created by the “job-killing health care law,” private employers are consistently creating more than 200,000 jobs a month. Our gradual recovery from the 2008 financial crisis continues apace.

Some of the wrong predictions of the last six years merely reflected the paranoia of the Tea Party right—or the cynical exploitation of that paranoia. In 2008, Newt Gingrich got some attention by warning that President Obama would muzzle Rush Limbaugh and Sean Hannity; it worked so well that in 2012, he predicted that Obama would declare war on the Catholic Church the day after his reelection. The National Rifle Association’s fever-screams that Obama would cancel the Second Amendment and seize America’s guns have not come to pass, either, although they helped boost gun sales. Sarah Palin’s “death panels” also have yet to materialize.

It’s doubtful that those opportunists ever believed their own Chicken Little rhetoric; when their doomsday warnings were proven wrong, they simply issued new doomsday warnings. But other prophecies of doom reflected a sincere view of the economy and other public policy issues that simply happened to be incorrect.

The government response to the financial crisis probably inspired the most wrongheaded commentary. Critics complained that the Wall Street bailouts begun by President Bush and continued by Obama would cost taxpayers trillions of dollars. “If we spent a million dollars every day since the birth of Christ, we wouldn’t get to $1 trillion,” fumed Darrell Issa, the top Republican on the House government oversight committee. Ultimately, the bank bailouts cost taxpayers less than nothing; the government has cleared more than $100 billion in profits on its investments. Obama’s bailout of General Motors and Chrysler also inspired some overheated commentary; Mitt Romney wrote that if it happened, “you can kiss the American automotive industry goodbye.” But it did happen, the American automotive industry is now thriving, and the rescue saved an estimated 1.5 million jobs.

It’s fun looking back at misguided crisis predictions. Liberal critics like Paul Krugman warned that the banking system would collapse unless it was temporarily nationalized; Krugman scoffed that Treasury Secretary Tim Geithner’s “stress test” would never end the crisis. “He was right,” Krugman later admitted, “I was wrong.” Conservatives like Dick Morris warned that the president’s $800 billion fiscal stimulus package and other activist policies would create an “Obama Bear Market”; in fact, the Dow has soared roughly 150 percent since bottoming out in March 2009. Conservatives like Paul Ryan have also consistently warned that the Federal Reserve’s aggressive monetary stimulus would weaken the dollar—their preferred phrase is “debase the currency”—and create crippling inflation. They have been consistently wrong, as inflation has remained stubbornly low.

After the Great Recession ended in the summer of 2009—sooner than anyone (especially historians of financial crises) predicted—Republicans quickly turned their attention to the budget deficit, which had ballooned to $1.4 trillion. They complained that America was becoming Greece, that we were spending our way into a sovereign debt crisis, that brutal increases in interest rates were on the way. But America did not become Greece. There has been no debt crisis. Interest rates have remained historically low. In fact, despite the howling on the right, non-military spending (excluding mandatory expenses like Medicare) has dropped to its lowest level since the Eisenhower administration. Oh, and speaking of Medicare, its financial position has gotten so much better—thanks to a general slowdown in health care costs—that its trust fund, which was expected to go bust in 2017 when Obama took office, is now expected to remain solvent through 2030.

That slowdown in medical costs is another example of a phenomenon that critics confidently predicted would never happen in the era of Obamacare. Also, the administration would never meet its goal of 7 million signups by April 2014. (The actual figure topped 8 million.) Yes, but they would never pay their premiums. (The vast majority did.) OK, but those premiums would surely soar. (They haven’t.) Still, the entire program will be doomed to a “death spiral” unless healthy young people sign up in large numbers. (They have.)

Nevertheless, most Americans seem to think that Obamacare is a failure, that the economy stinks, that the deficit is getting worse. There are many explanations for those beliefs, but one is surely that initial predictions of doom are uncritically reported at the time and conveniently forgotten once they’re disproven. There is no penalty in American politics for being wrong. Republicans paid no price for their confident predictions that President Clinton’s tax hikes would destroy the economy, that the Bush tax cuts would pay for themselves, that the Obama tax hikes would create a double-dip recession. Even after the BP spill, petroleum interests proclaimed that tighter regulations on offshore drilling would ravage the oil industry and punish Americans at the pump; domestic production is at an all-time high while gas prices are steadily dropping, but they haven’t changed their tune at all. Similarly, even after the financial meltdown, Wall Street moneymen said financial reforms would shred our free enterprise system; they’re still whining despite their record profits.

Obama is often guilty of rhetorical overkill, too. He’s always warning that Armageddon is just around the corner—when Republicans blocked his American Jobs Act and other infrastructure bills, when they insisted on the deep spending cuts in the “sequester,” and when they threatened to force the Treasury to default on its obligations. (Actually, that last one almost did create Armageddon.) But because he’s president, the media correctly holds his feet to the fire, pointing out that he didn’t keep his promises to fix Washington or let you keep your insurance if you like it. There’s less accountability for his critics on the left and the right.

There’s no need for sympathy; Obama volunteered for the job. He gets a cool plane and a nice house regardless of public perceptions about the state of the country. But if you want to know why voters think the false prophets were right, maybe it’s because nobody ever corrected them.

TIME

Amal Alamuddin Is The Latest Exhibit in the Museum of Disempowered Women

Human rights lawyer Amal Clooney speaks to media in Athens on Oct. 13, 2014 Yorgos Karahalis—Reuters

The new Mrs. Clooney is advising the Greek government on its campaign to regain looted sculptures. But the overlap between her interests and her husband's feels uncomfortable

Amal Clooney, lawyer, is reported to be at the epicenter of “the west’s longest-running cultural row.” The Guardian, which coined the phrase, meant the two-century-long tussle between Athens and London over the rightful home of marble sculptures removed from the Parthenon between 1801 and 1805 by the English aristocrat Lord Elgin and later sold by him to the British Museum, where they still reside. Clooney, née Alamuddin, arrived in Greece on Oct. 13 at the invitation of the Greek Culture Minister to assist with the campaign for the marbles’ return.

But the frenzy of flashbulbs and fashion commentary that greeted Clooney’s visit shows that she has become entangled in a cultural row of greater longevity and importance than the disposition of some antique artworks, however significant those may be. Throughout her adult life, this 36-year-old attorney specializing in international jurisprudence, extradition and criminal law has stood on her own merits: accomplished, independent, respected. Now her identity risks being spirited away as the sculptures she seeks to repatriate once were. Even in the 21st century and among first-world elites, marriage retains the power to transform women into appendages, while celebrity culture reliably reduces females to ciphers. Since Alamuddin’s engagement and Venice wedding to actor George Clooney, she has never been more closely observed by a wider audience — or in greater danger of disappearing.

You might say this is Amal Clooney’s business. It is she who chose to say “I do” not only to “Hollywood’s most eligible bachelor” but also to celebrity-encrusted nuptials that created “intimate, exclusive” images for the happy couple, friends, family and the many millions of readers of publications such as People and Hello! to enjoy, showcasing the bride’s ability not only to anatomize the unfair trial of al-Jazeera journalists in Egypt under the military-backed government but also to wear nice dresses and skyscraper heels. It is Clooney who chose to retire the maiden name of Alamuddin under which she had scored many career successes and a client roster including Julian Assange and Yulia Tymoshenko. It was not, however, Clooney who chose to memorialize her first professional foray as the new Mrs. Clooney with banal reportage like this (“Move over, Kate Middleton! There’s a new hair queen in town!”). Clooney has always seemed to wear her startling beauty as lightly as her startling accomplishments, and there is nothing to suggest that she has changed.

The problem — and the reason the media repurposing of Clooney from queen of jurisprudence to hair-queen matters — is that there is still a dearth of women who rise to prominence through their own merits, reflecting the harsh reality of a world resolutely skewed against female achievement.

Many interlocking mechanisms keep women down, but in watching the transmogrification — and trivialization — of Clooney we are witnessing one of the most pernicious of these. I laughed back in June, when Britain’s Daily Mail turned its report about a global summit on combatting sexual violence into a slavering commentary on Clooney’s appearance. I laughed at reporting of the Alamuddin-Clooney marriage so tremulously overexcited by the groom (two-time “sexiest man alive”!) that it characterized the bride’s crowning attainment as “snaring” him. I laughed louder at the spoof headlines this spectacle inspired: “Internationally Acclaimed Attorney Marries an Actor,” etc.

I also laughed at that actor’s ham-fisted attempt earlier this year to boost the long-running initiative to reclaim the Parthenon marbles for Greece. “Even in England, the polling is in favor of returning the Pantheon [sic] marbles, the marbles from the Pantheon,” George Clooney said during a promotional tour for his movie about the restitution of art looted by the Nazis, The Monuments Men.

There’s nothing wrong and a lot right with stars using their celebrity power to publicize worthy causes (though it’s generally better to do the research first). However, the overlapping interest of Mr. and Mrs. Clooney in this case feels uncomfortable. The Greek government originally approached the then Amal Alamuddin in 2 B.C. — that’s 2011, two years Before Clooney entered her life. Greece sought her services and those of her storied colleagues at London’s Doughty Street Chambers for one reason only: their collective legal expertise. Now Mrs. Clooney’s involvement in the case has been ascribed a new and more tenuous value. “We will of course be discussing all our legal options but what we really want is to keep the issue alive,” a “well-placed policy maker” told the Guardian. “There would be no better way of doing that than getting Hollywood involved and, hopefully, [George] Clooney too.”

A brilliant lawyer and strong female role model is being misappropriated, to be put on show as the latest exhibit in the Museum of Disempowered Women. Never mind restoring the marbles to Greece: give us back Amal!

TIME Diet/Nutrition

3 Mindset Shifts That Help Weight Loss

Broken scale
Tim Robberts—Getty Images

In a recent Facebook thread about weight loss that I was following, one commenter wrote that if she could write a diet book, she’d call it “Eat Less” and then leave all the pages blank. Drop the mic, call it a day, solve our obesity mess with a two-word prescription.

Most of us who have read anything about diets, obesity, and weight loss would nod in agreement. We have too much food, too much sugar, too many processed foods, and too many choices. And the reality is that we could likely engineer a one-size-fits-most diet that would push everybody back to healthy weights. Example: Eggs and berries for breakfast, grilled chicken salad with nuts for lunch, and fish with vegetables and avocado for dinner might get us there if we followed that plan every day (adjusting for variables like vegetarian options and allergies). Most of us who have read anything about diets, obesity and weight loss would also agree that it’s nowhere near that easy.

The diet dilemma has everything to do with food. And nothing to do with food.

It really has more to do with adjusting our mindset so that healthy choices feel right—and don’t feel like deprivation, hard work or punishment.

I’ve spent most of my career writing about health, and I’ve spent most of my life in a bleep-off relationship with the scale. I’ve had quite a few lows (almost ballooning to 300 pounds while writing diet books, getting a D in sixth-grade gym class), and I’ve also had some successes. (For what it’s worth, our individual definitions of weight-loss success need to include not just pounds, but also things like bodily satisfaction, life satisfaction, numbers like blood pressure and achievement of other goals not associated with pounds.)

We all have the ability to change our mindsets—not with a tire-squealing hard left, but by simply drifting into a new lane of thinking. These three switches will help you start:

Reverse the leadership model. The protocol for people who want to lose weight typically comes in two forms. You have the people who seclude themselves, privately trying to swim upstream against all of the forces that will make them gain weight. And you have the follow-the-leader model, in which the would-be dieter listens to the plan/advice/program of the trainer, the doctor, the nutritionist, the author, the infomercial-machine-seller: the person who, by degree or some other definition, knows more about the subject than anybody else. There’s nothing inherently wrong with either model, because either of them can work.

The glitch, however, comes when the follower grows tired of following. And when one grows tired of following, one consumes three pieces of Oreo pie. It’s not that the experts don’t know what they’re doing, because most of the many I’ve worked with and interviewed in my career do. It’s just that we dieters, though most don’t even know it, need a more balanced mix of following and leading. We need to harness some of the power and control back from the people who are telling us what to do. We need to lead, even if we don’t look like we should.

Leadership can come in many forms, whether it’s being the person to arrange the neighborhood walking group, or the person who prepares the family meal and makes kale chips instead of buying chocolate chips, or the person who organizes a work team to run a 5K together. For the last couple of years, I’ve organized weekly workouts with friends and neighbors. I’m the worst athlete in the bunch, so at first glance, the question would be, Why is blubber boy in charge? Exactly zero percent of my friends have ever given me any indication that’s what they felt. Instead, the dynamics of the group workout are that we all push and pull each other, no matter our athletic abilities. I know I’m not as good as the others, but I also know that these workouts don’t happen unless I kickstart them.

Dieters can redefine the roles we’re supposed to take, and that’s what drives changes in the way we think and act. This is where sustained energy comes from—what we deliver to others, we get in return.

Steer the fear. In the weight-loss world, fear is almost as bad a word as pudding. We fear the scale. We fear the doctor. We fear shopping for clothes. We fear the camera. We fear being embarrassed. The more we fear, the more we retreat—and the harder it is to climb out of whatever destructive habits we have.

As someone who once was told I had child-bearing hips, I know that the fear is real, and I know it’s not easy to squash. But instead of letting fear steer us, we need to steer the fear.

Plenty of scholarly and popular writings have addressed the issue of goal-setting, though there is some debate about whether we should set dream-big goals or more attainable goals. My take: Every year, you should set at least one physical and mental challenge that scares you just enough to help you make good choices—because those choices are a means to reaching that goal. What is “just enough”? It’s that spot right in between “of course I can do this” and “no way in the world can I do this.” For me, it was taking on the challenge of trying to complete an Ironman in 2013 (2.4-mile swim, 112-mile bike, 26.2-mile run in a 17-hour time limit). I’ve found that the canyon in the middle of those two extremes is where the growth lies. Maybe it’s not fear in the traditional sense, but that bubbling angst of uncertainty feels different from and healthier than the kind of fear that dieters tend to have. (Tell us about your new challenge with the hashtag #TIMEtosteerthefear.)

Crank the voltage. As someone who has finished last in a race (maybe two, but who’s counting?), I do subscribe to the turtle-inspired mantra of slow and steady. When it comes to weight loss, that mindset will win the race. The choices we make over time, not one day or one hour, dictate the way that our bodies will look, feel and act.

I do think it’s a mistake to think that slow-and-steady is always the answer. Especially when it comes to exercise, we need high intensity: those short periods of working as hard as we can. Why? Because that kind of work—the kind where you’re so immersed in the activity because it’s fun and intense—is what feels good, what feels enjoyable, what feels in the moment and what gives us the post-activity high that helps us make healthy decisions, especially when it comes to food choices.

My friend and sports psychologist Doug Newburg, PhD, has taught me a lot about the concept of feel, because he has studied how it works in hundreds of elite performers. It’s different from feelings or emotions. Exercise, like eating, shouldn’t feel like a chore. For it to truly work over the long term, it has to feel more like recess than like detention. Going all in—whether it’s running, dancing, playing tennis or playing tag with your kids—excites you enough to take you out of your own head, and that’s what makes you want to do it again and again. The byproduct of playing hard is that, without thinking, you find what you were after in the first place.

Ted Spiker (@ProfSpiker), the interim chair of the department of journalism at the University of Florida, is the author of DOWN SIZE: 12 Truths for Turning Pants-Splitting Frustration into Pants-Fitting Success.

Read next: I Taught Fitness and Failed a Fat Test

TIME space

Think You Could Live on Mars? Think Again

Mars
Getty Images

A new analysis of Mars One's plans to colonize the Red Planet finds that the explorers would begin dying within 68 days of touching down

Hear that? That’s the sound of 200,000 reservations being reconsidered. Two hundred thousand is the announced number of intrepid folks who signed up last year for the chance to be among the first Earthlings to colonize Mars, with flights beginning as early as 2024. The catch: the trips will be one way, as in no return ticket, as in farewell friends, family, charbroiled steaks and vodka martinis, to say nothing of such everyday luxuries as modern hospitals and, you know, breathable air.

But the settlers in Jamestown weren’t exactly volunteering for a weekend in Aspen either, and in both cases, the compensations—being the first people on a distant shore—seemed attractive enough. Now, however, the Mars plan seems to have run into a teensy snag. According to a new analysis by a team of grad students at MIT, the new arrivals would begin dying within just 68 days of touching down.

The organizer of the burn-your-boats expedition is a group called Mars One, headed by Bas Lansdorp, a Dutch entrepreneur and mechanical engineer. As Lansdorp sees things, habitat modules and other hardware would be sent to the Red Planet in advance of any astronauts, who would arrive in four-person crews at two-year intervals—when Mars and Earth make their closest approach, which holds the outbound journey to a brief (relatively speaking) eight months. The crew-selection process would be part of (yes) a sponsored reality show, which would ensure a steady flow of cash—and since the settlers would grow their own food onsite, there would be little to carry along with them. All that would keep the overall cost of the project to a shoestring (relative again) $6 billion.

So what could go wrong? That’s what the four MIT students set out to find out, and the short answer is: a lot.

The biggest problem, the students discovered, concerns that business of breathable air. One of the things that’s always made Earth such a niftily habitable place to live is that what animals exhale, plants inhale, and vice versa. Since the Martian astronauts and their crops would be living and respiring in the same enclosed habitats, a perfect closed loop should result, in which the people provide all the carbon dioxide the crops need and the crops return the favor with oxygen.

Only it doesn’t, the MIT students found. The problem begins with the lettuce and the wheat, both of which are considered essential crops. As lettuce matures, peaking about 30 days after planting, it pushes the O2 level past what’s known as a .3 molar fraction (meaning oxygen makes up 30% of the molecules in the air), which doesn’t sound terribly dangerous, except it’s also the point at which the threat of fire rises to unacceptable levels. That risk begins to tail off as the crop is harvested and eaten, but it explodes upward again, far past the .3 level, at 68 days, when the far gassier wheat matures.

A simple answer would be to vent a little of the excess O2, which actually could work, except the venting apparatus is not able to distinguish one gas from another. That means that nitrogen—which would, as on Earth, make up the majority of the astronauts’ atmosphere—would be lost too. That, in turn, would lower the internal pressure to unsurvivable levels—and that’s what gets your 68-day doomsday clock ticking.
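A toy model makes the mechanism concrete (illustrative numbers of our own, not Mars One’s specs or the MIT team’s simulation): cap the amount of O2 by venting mixed cabin gas, and nitrogen bleeds out in proportion every time the valve opens, so total pressure ratchets downward.

```python
# Toy habitat atmosphere: crops add O2; a valve that can only vent *mixed*
# gas keeps O2 at a cap, dragging nitrogen out with every vent.
# All numbers are illustrative assumptions, not Mars One or MIT figures.
o2, n2 = 21.0, 79.0        # moles of each gas (roughly Earth-like mix)
O2_CAP = 21.0              # hold O2 at or below its starting inventory
MIN_TOTAL = 60.0           # assumed minimum survivable total pressure (moles)

for day in range(1, 366):
    o2 += 0.25             # net O2 added daily by maturing crops (assumed)
    if o2 > O2_CAP:
        keep = O2_CAP / o2 # fraction of cabin gas retained after venting
        o2 *= keep         # O2 is back at the cap...
        n2 *= keep         # ...but N2 leaves in the same proportion
    if o2 + n2 < MIN_TOTAL:
        print(f"Total pressure unsurvivable on day {day}")
        break
# With these made-up rates the habitat fails around day 60, the same
# order as the MIT team's 68 days, though their model is far richer.
```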

There is some question too about whether the hardware that Mars One is counting on would even be ready for prime time. The mission planners make much of the fact that a lot of what they’re planning to use on Mars has already been proven aboard the International Space Station (ISS), which is true enough. But that hardware is built to operate in microgravity—effectively zero g—while Mars’s gravity is nearly 40% of Earth’s. So a mechanical component that would weigh 10 lbs. on Earth can be designed with little concern about certain kinds of wear since it would weigh 0 lbs. in orbit. But on Mars it would be 4 lbs., and that can make all the difference.

“The introduction of a partial gravity environment,” the grad students write, “will inevitably lead to different [environmental] technologies.”

For that and other reasons, technical breakdowns are a certainty. The need for replacement parts is factored into Mars One’s plans, but probably not in the way that they should be. According to the MIT team, over the course of 130 months, spare parts alone would gobble up 62% of the payload space on resupply missions, making it harder to get such essentials as seeds, clothes and medicine—to say nothing of other crew members—launched on schedule.

Then too, there is the question of habitat crowding. It’s easy to keep people alive if you feed them, say, a single calorie-dense food product every day. But energy bars forever means quickly losing your marbles, which is why Mars One plans for a variety of crops—just not a big enough variety. “Given that the crop selection will significantly influence the wellbeing of the crew for the entirety of their lives after reaching Mars,” the authors write, “we opt for crop variety over minimizing growth area.”

Then there is the question of cost—there’s not a space program in history whose initial price tag wasn’t badly lowballed—to say nothing of maintaining that biennial launch schedule, to say nothing of the cabin fever that could soon enough set the settlers at one another’s throats. Jamestown may not have been a picnic, but when things got to be too much you could always go for a walk by the creek.

No creeks here, nor much of anything else either. Human beings may indeed colonize Mars one day, and it’s a very worthy goal. But as with any other kind of travel, the best part of going is often coming home.

Read next: 20 Breathtaking Images Of The Earth As Seen From Space

TIME Diet/Nutrition

What McDonald’s New ‘Transparency’ Campaign Is Hiding

McDonald's sign
Getty Images

"Most of the cattle we get our beef from are treated with added hormones"

McDonald’s announced today that it’s making a greater effort at transparency and engagement with its new campaign, “Our Food, Your Questions.” McDonald’s has a serious image problem and a sagging bottom line, which might explain its sudden willingness to fling the barn door open as a way to shed its reputation for serving mass-produced, unhealthy food. Showing the public how the sausage is made may win favor with some consumers, but a better strategy for the fast food giant would be to make truly meaningful commitments to sustainability.

McDonald’s realizes people have big questions about the quality and origins of their food. So the company that serves 28 million people daily in the U.S. is now promising straightforward answers. McDonald’s is releasing behind-the-scenes web vignettes and infographics, which will apparently illustrate the production process behind products like Chicken McNuggets and the McRib, and how they go from “farm to restaurant.” It also says it will listen to real customers’ questions online and answer honestly in real time.

McDonald’s has also enlisted professional skeptic and former “MythBusters” co-host Grant Imahara, who is featured in a series of videos addressing consumers’ persistent doubts and questions. “We know some people–both McDonald’s fans and skeptics–continue to have questions about our food from the standpoint of the ingredients or how food is prepared at the restaurant. This is our move to ensure we engage people in a two-way dialogue about our food and answer the questions and address their comments,” Kevin Newell, EVP-chief brand and strategy officer for McDonald’s USA, told BurgerBusiness.com.

Until now, what happened behind the curtain at McDonald’s has been invisible to most of us. But because the company’s supply chain is so long, and it sources raw ingredients from such a wide array of locations and facilities, it would be impossible for any one tour, vignette, or infographic to show more than a sliver of what goes on at the farm, factory, and processing levels.

And while it’s angling for the farm-to-table crowd, McDonald’s is the world’s largest buyer of beef and pork, selling hamburgers for as low as one dollar; its current practices will probably still be considered factory-farm-to-table.

“McDonald’s is making important progress away from gestation crates in its pork supply chain, though nearly all of its eggs in the U.S. still come from birds locked inside battery cages so small they can’t spread their wings,” Paul Shapiro, Vice President of Farm Animal Protection at the Humane Society of the United States, told me. “This is in contrast to McDonald’s policies in Europe and U.K., where its eggs are all cage-free.”

Online, McDonald’s answers some questions about its products. So far, I haven’t seen any questions (or answers) about antibiotic use or whether its eggs are cage-free, even in its section on “sourcing and sustainability.” Here’s what they do answer. On beef hormones: “Most of the cattle we get our beef from are treated with added hormones, a common practice in the U.S. that ranchers use to promote growth.” On feeding animals GMO feed: “Generally speaking, farmers feed their livestock a balanced diet that includes grains, like corn and soybeans. Over 90% of the U.S. corn and soybean crops are GMO, so cattle, chickens and pigs in our supply chain do eat some GMO crops.”

And while it says it no longer uses so-called “pink slime” in its burgers, it does use an anti-foaming agent, dimethylpolysiloxane, in the oil it uses to cook Chicken McNuggets. It also uses azodicarbonamide, AKA “the yoga mat ingredient,” in its buns and sandwiches, saying it has many uses: “Think of salt: the salt you use in your food at home is a variation of the salt you may use to de-ice your sidewalk.” As for why its U.S. menu contains items that are banned in Europe? “Every country has different food safety and regulatory standards and, because of this, ingredients will vary in our restaurants around the world. But no matter where you’re dining with us—in the U.S. or abroad—you can be assured of the quality and safety of our food.”

Most people simply don’t think of McDonald’s as a healthy place to eat, despite its efforts to offer more menu choices. Its insidious marketing of fast foods to kids hasn’t won it any points either. With U.S. sales down, recent food safety scandals in China, and labor issues here, its rivals are eating McDonald’s for lunch and breakfast, too.

The truth is, McDonald’s is facing a marketplace where people increasingly want good food served fast, as opposed to fast food. Millennials are now driving the food bus and they’re heading straight to Chipotle and other establishments that are offering healthier options, including foods without genetically engineered or artificial ingredients and meat from animals raised without antibiotics.

An estimated 80 percent of all antibiotics sold in the U.S. are being fed to animals on factory farms for purposes other than treating disease. McDonald’s producers use antibiotics to “treat, prevent, and control disease” in their food-producing animals.

Using antibiotics to prevent disease and promote faster growth (the company has phased out the latter since 2003, though some say using them to prevent disease has the same effect)—rather than merely to treat infections—allows producers to raise many animals together in dirty, crowded spaces. And it has contributed to antibiotic-resistant bacteria, which the World Health Organization and the Centers for Disease Control now regard as an international epidemic.

From food safety scandals to the serious public health impacts of eating fast food, consumers increasingly want truth, trust, and transparency in their food. But transparency demands responsibility and is toothless on its own. Today’s eaters want to see where their food comes from so they can make informed choices and also advocate for change.

If McDonald’s really wants to connect with consumers, it should take a hard look at the practices behind the ingredients it uses and begin to change them incrementally. It could take a real stand for sustainability—including changing to suppliers and producers who raise meat without antibiotics. As the biggest fast food company in the nation, McDonald’s choices are no small potatoes. A change like that could mean a much happier meal.

See more at: How McDonald’s Could Serve Up a Happier Meal

TIME vaccines

How Words Can Kill in the Vaccine Fight

Farrow: Right ideas, wrong words NBC/Getty Images

To own the argument you've got to own the language. At the moment, the dangerous anti-vaxxers are winning that war

Chances are you wouldn’t sit down to a plate of sautéed thymus glands, to say nothing of a poached Patagonian toothfish; and the odds are you’d be reluctant to tuck into a monkey peach too. But sweetbreads, Chilean sea bass and kiwifruit? They’re a different matter—except they’re not. All of those scrumptious foods once went by those less scrumptious names—but few people went near them until there was something pleasant to call them. Words have that kind of power.

That’s true in advertising, in politics and in business too. And it’s true when it comes to vaccines as well—but in this case those words can have a lethal power. The bad news is that in the vaccine word game, the good guys (they would be the ones who know that vaccines are safe, effective and save from two to three million lives per year) are being caught flat-footed by the bad guys (those would be the ones whose beliefs are precisely opposite—and therefore precisely wrong).

The battle plays out on Twitter, with the handy—and uninformed—handle #CDCWhistleBlower repeatedly invoked in virtually every fevered anti-vax tweet like a solemn incantation. The term refers to Dr. William Thompson of the Centers for Disease Control and Prevention, who supposedly blew the lid off of the great vaccine conspiracy by confessing to irregularities in a 2004 study that deliberately excluded data suggesting a higher rate of autism in African-American boys who had been vaccinated. Scary stuff alright, except that the data was left out for purely statistical and methodological reasons, and the reanalysis that made the charge was so poorly conducted that it has since been retracted. But the hashtag stain remains all the same—with the usually noble whistleblower label being put to low purpose.

Something similar is true with the widely cited Vaccine Injury Court, another frightening term, except that no such thing exists—at least not by that name. It’s true there is an Office of Special Masters which, under a smart 1986 law, hears the claims of parents who believe their children have been injured by vaccines. The panel was created to provide no-fault compensation in all such cases, since drugs that are as vital and are administered as widely as vaccines could never be manufactured or sold affordably if the companies themselves had to pour millions and even billions of dollars into defending themselves against claims.

It’s true too that the court has paid out about $2.8 billion to parents and families since 1989, but those awards are overwhelmingly for relatively minor side effects that are fully disclosed by the ostensibly secretive CDC for any parents caring to look on the agency’s website. And to put that $2.8 billion in perspective: The money went to 3,727 claimants over a roughly generation-long period during which 78 million American children were safely vaccinated, preventing an estimated 322 million illnesses and 732,000 deaths. If you’re crunching the numbers (and it’s not hard to do), that works out to a .0048% risk of developing what is overwhelmingly likely to be a transient problem—in exchange for a lifetime of immunity from multiple lethal diseases.
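That risk figure survives a quick check against the numbers in the paragraph above:

```python
# Compensated claims as a share of children vaccinated, per the figures above.
claims = 3_727
children_vaccinated = 78_000_000
print(f"{claims / children_vaccinated:.4%}")   # -> 0.0048%, the piece's figure
```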

But brace for more anyway because October is, yes, Vaccine Injury Awareness Month. Because really, what does a dangerous campaign of misinformation need more than 31 catchily named days devoted to itself?

Still, there’s no denying that catchiness works, and on this one the doctors and other smart folks are going to have to get off the dime. MSNBC’s Ronan Farrow—who either is or isn’t to your liking depending in part on whether MSNBC itself is—has emerged as a smart, persuasive, often brilliantly cutting advocate for the vaccine cause. And on his Oct. 10 show he deftly filleted the arguments of a vocal anti-vax mother whose child is undeniably suffering from a number of illnesses, but who wrong-headedly blames them on vaccines. In this show as in others he invites his audience to learn the truth about vaccines and to connect with him and one another via the handle #VaccineDebate.

And right there he tripped up. For the billionth time (as Farrow knows) there is no debate. Just as there is no climate change debate. Just as there is no moon-landings-were-faked debate. And just as there was nothing to the tobacco companies’ disingenuous invention of a “cigarette controversy” (a fallback position they assumed when even they knew that cigarettes were killers and couldn’t straight-facedly say otherwise, so the best they could do was sow doubt and hope people stayed hooked), there is nothing to a vaccine “debate” either.

Little more than 30 seconds spent listening to Farrow talk about vaccines makes it unmistakably clear where he stands—but the very fact that we now live in a hashtag culture means that it’s by no means certain he’s going to get that 30 seconds. So step up your game, smart people. You want to get the vaccine message out, do it in a way that works in the 21st century. And if that means a hashtag, why not #VaccinesWork or #VaccinesAreSafe or #VaccinesSaveLives. Of course, there’s also the more thorough and satisfying #AntivaxxersDon’tKnowWhatThey’reTalkingAboutSoPleaseStopListeningToThem, but that gets you exactly halfway to your 140-character limit. So keep it brief folks—and make it stick.
