In 2001, food behavior scientist Brian Wansink reported that he’d mailed a 12-page survey to 1,002 people, trying to find out whether knowing about soy’s health benefits made them more likely to eat it.
In 2003, he published findings from an eight-page survey sent to 2,000 people about the personality traits of household cooks.
And in 2004, he reported sending out 1,600 surveys about what made customers loyal to soy.
In each instance, Wansink reported receiving the same number of responses: 770.
This spring, in the wake of a scandal over Wansink’s more recent work, an independent researcher confronted some of the journals that published these reports, asking how Wansink could have gotten precisely the same number of responses to what seem to be very different surveys.
Now one of the papers will be corrected, 14 years after publication, BuzzFeed News has learned.
It’s only the latest blow for Wansink, a high-profile professor at Cornell University who over the last year has faced scientific misconduct allegations for at least 50 of his studies. Journals have so far retracted three papers — one of them twice, as BuzzFeed News reported last month — and corrected at least seven. (That total doesn’t include yet another problematic paper about vegetable-naming that stands to be corrected or withdrawn.)
Nicholas Brown, a graduate student at the University of Groningen in the Netherlands, first noticed in March that several of Wansink’s papers from the 2000s mentioned 770 survey respondents.
“This seems like a quite remarkable coincidence,” Brown wrote in a late-May email to the editor of Food Quality and Preference.
Last week, the editor, Armand Cardello, told him that Wansink’s 2003 study about household cooks will be corrected as a result of his inquiries. The correction should appear in the journal’s next issue, Cardello confirmed to BuzzFeed News by email.
After receiving Brown’s email, Cardello said, the journal re-analyzed the study’s original data, which indeed contained 770 entries. The correction will address “information regarding the origins of the survey from which the data are derived, details of its administration, some additional details on methodology, and some minor edits to data entries within several tables of the paper,” Cardello said.
“I cannot speak to reasons why 770 respondents also appear in other of Dr. Wansink’s papers,” Cardello said. But he added, “This re-analysis confirmed that the results and conclusions of the original study were correct and valid.” (Cardello, a retired senior scientist with the US Army, has collaborated with Wansink on two other papers. He said their relationship did not influence the correction.)
Wansink did not respond to requests for comment from BuzzFeed News. But in correspondence with Cardello over the past several months, he offered seemingly conflicting explanations. Wansink initially said that all of this data came from one large survey mailed in the summer of 1999, whose responses were separated into different publications. Later, though, he told Cardello that the survey was one in a series covering several topics, sent twice a year from around 1998 to 2003.
“I do not know the details of the study as it was way before my time,” Sampath Parthasarathy, editor of the Journal of Medicinal Food, told BuzzFeed News by email, adding that he would forward questions to Wansink. The other journals that published the survey data — Appetite, the Journal of Sensory Studies, and the publication formerly known as the Journal of the American Dietetic Association — did not respond to requests for comment.
Neither did the University of Illinois at Urbana-Champaign, where Wansink worked when most of the papers were published.
Cornell, Wansink’s current employer, responded with the same statement it had released when news of his last retraction broke.
“We are taking the questions raised about Professor Wansink’s work quite seriously,” said Joel Malina, Cornell’s vice president for university relations, in a statement. “The University is undertaking timely and appropriate action, in compliance with our internal policies and any external regulations that may apply.”
The surveys are described in the studies as having different lengths, recipients, and payments.
For example, the 2001 study, published in the Journal of Medicinal Food, sought to find out whether knowing about soy’s nutritional value made people more likely to eat it. A 12-page survey was sent to 1,002 adults across the US, and $6 was paid per returned questionnaire. The 770 respondents were 59% female and averaged 44 years old.
In contrast, the 2003 study in Food Quality and Preference says that 2,000 eight-page questionnaires were mailed across the US with an “honor payment” of $3 for replying. The survey asked people who cooked meals at home to fill out questions about their cooking habits, the foods they often ate, and their personality traits. Of the 770 respondents, 61% were female, it says.
And in 2004, a study in the Journal of Sensory Studies reported on a survey about what made customers loyal to an ingredient such as soy. It was sent to 1,600 North Americans with a $5 check, plus “a chance to receive a number of gifts through a lottery.” This time, the 770 respondents were 57% female and 42 years old on average.
Two other Wansink studies also hinge on surveys with 770 responses. One was published in 2004 in the Journal of the American Dietetic Association (now the Journal of the Academy of Nutrition and Dietetics), and another in Appetite in 2006.
In both cases, a survey reportedly went out to 2,000 people — “Americans” in one, “North Americans” in the other — and offered $6 for completing it. In both cases, the 770 respondents were 37 years old, 61% female, and about 70% Anglo-American.
These two studies reported on slightly different questions, however. One was about how often people ate fruits and vegetables, sweet snacks, and salty snacks, whereas the other looked at the cooking habits and food preferences of “fruit lovers” versus “vegetable lovers.” Although the 2006 study mentions the 2004 study at the end, neither makes clear whether it is discussing the same survey or different ones.
In yet other articles, published in 2005, 2007, and 2014, Wansink cites a survey with 770 respondents as evidence of the benefits of packaged single-serving snacks. “Results from a survey of 770 North Americans indicated that 57% of respondents would be willing to pay up to 15% more for these portion-controlled items,” he wrote in one.
In September, Cardello, the editor of Food Quality and Preference, wrote back to Brown saying that he had talked to Wansink about the surveys in question.
The professor’s explanation then was that the data came from a survey mailed in the summer of 1999. Cardello pasted Wansink’s explanation into a Sept. 6 email, which Brown shared with BuzzFeed News:
“This was an annual survey, and the 1999 survey covered different topics related to cooking behavior, food preferences (Wansink, Bascoul, and Chen 2006), soy consumption (Wansink and Chan 2001; Wansink, Sonka, and Park 2004), vegetable intake (Wansink and Lee 2004), leisure activities, and perceptions of new products.” These four papers all reported 770 survey respondents.
But that didn’t make sense, Brown shot back at Cardello.
According to his own papers, Wansink had surveyed different numbers and kinds of people, and offered them different amounts of cash. So, Brown asked, how could the data have come from a single survey?
Cardello responded saying that, according to Wansink, the survey was one in a series covering several topics, sent twice a year from around 1998 to 2003.
“According to Dr. Wansink, the default mailing was usually to 1000 people who were paid $6.00 per person, but the mailing for the questionnaire used in the present article was to 2000 people who were paid $3.00,” Cardello told Brown. He’d been told by Wansink that the other articles published from these surveys used the “default amount.”
(This second explanation was the only one that Cardello gave to BuzzFeed News. Asked if he was concerned about Wansink’s seemingly changing stories, he said, “No red flags. I recommend that you ask Dr. Wansink about details or any contradictions you see.”)
Brown continued to push back with Cardello. Some of those other surveys, Brown pointed out in a Nov. 18 email, had reportedly been sent to 1,600 people for $5 and 2,000 for $6 — not the “default amount” of 1,000 for $6. So again, he asked: What was the truth?
On Tuesday, Cardello wrote back: “I understand your concern about the recurring number of 770 respondents in multiple of Dr. Wansink’s papers.” However, he repeated, the journal was confident in its own study’s data, aside from the “minor errors” that will be corrected.
Brown had also asked Cardello if he could examine the original data for himself, but the editor declined.
It’s a request that Brown plans to keep making as he scrutinizes more of Wansink’s research.
“My default assumption of the work coming from this lab is that it’s not trustworthy, it’s not reliable, just based on the amount of retractions and corrections that have been coming,” he told BuzzFeed News. “I would like to see the data in the interest of transparency.”
Stephanie Lee is a senior technology reporter for BuzzFeed News and is based in San Francisco.