Document Sheds Light on Investigation at Harvard

Gaye Gerard, Getty Images

The evolutionary psychologist Marc D. Hauser

Ever since word got out that a prominent Harvard University researcher was on leave after an investigation into academic wrongdoing, a key question has remained unanswered: What, exactly, did he do?

The researcher himself, Marc D. Hauser, isn't talking. The usually quotable Mr. Hauser, a psychology professor and director of Harvard's Cognitive Evolution Laboratory, is the author of Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong (Ecco, 2006) and is at work on a forthcoming book titled "Evilicious: Why We Evolved a Taste for Being Bad." He has been voted one of the university's most popular professors.

Harvard has also been taciturn. The public-affairs office did issue a brief written statement last week saying that the university "has taken steps to ensure that the scientific record is corrected in relation to three articles co-authored by Dr. Hauser." So far, Harvard officials haven't provided details about the problems with those papers. Were they merely errors or something worse?

An internal document, however, sheds light on what was going on in Mr. Hauser's lab. It tells the story of how research assistants became convinced that the professor was reporting bogus data and how he aggressively pushed back against those who questioned his findings or asked for verification.

A copy of the document was provided to The Chronicle by a former research assistant in the lab who has since left psychology. The document is the statement he gave to Harvard investigators in 2007.

The former research assistant, who provided the document on condition of anonymity, said his motivation in coming forward was to make it clear that it was solely Mr. Hauser who was responsible for the problems he observed. The former research assistant also hoped that more information might help other researchers make sense of the allegations.

It was one experiment in particular that led members of Mr. Hauser's lab to become suspicious of his research and, in the end, to report their concerns about the professor to Harvard administrators.

The experiment tested the ability of rhesus monkeys to recognize sound patterns. Researchers played a series of three tones (in a pattern like A-B-A) over a sound system. After establishing the pattern, they would vary it (for instance, A-B-B) and see whether the monkeys were aware of the change. If a monkey looked at the speaker, this was taken as an indication that a difference was noticed.

The method has been used in experiments on primates and human infants. Mr. Hauser has long worked on studies that seemed to show that primates, like rhesus monkeys or cotton-top tamarins, can recognize patterns as well as human infants do. Such pattern recognition is thought to be a component of language acquisition.

Researchers watched videotapes of the experiments and "coded" the results, meaning that they wrote down how the monkeys reacted. As was common practice, two researchers independently coded the results so that their findings could later be compared to eliminate errors or bias.

According to the document that was provided to The Chronicle, the experiment in question was coded by Mr. Hauser and a research assistant in his laboratory. A second research assistant was asked by Mr. Hauser to analyze the results. When the second research assistant analyzed the first research assistant's codes, he found that the monkeys didn't seem to notice the change in pattern. In fact, they looked at the speaker more often when the pattern was the same. In other words, the experiment was a bust.

But Mr. Hauser's coding showed something else entirely: He found that the monkeys did notice the change in pattern—and, according to his numbers, the results were statistically significant. If his coding was right, the experiment was a big success.

The second research assistant was bothered by the discrepancy. How could two researchers watching the same videotapes arrive at such different conclusions? He suggested to Mr. Hauser that a third researcher should code the results. In an e-mail message to Mr. Hauser, a copy of which was provided to The Chronicle, the research assistant who analyzed the numbers explained his concern. "I don't feel comfortable analyzing results/publishing data with that kind of skew until we can verify that with a third coder," he wrote.

A graduate student agreed with the research assistant and joined him in pressing Mr. Hauser to allow the results to be checked, the document given to The Chronicle indicates. But Mr. Hauser resisted, repeatedly arguing against having a third researcher code the videotapes and writing that they should simply go with the data as he had already coded it. After several back-and-forths, it became plain that the professor was annoyed.

"i am getting a bit pissed here," Mr. Hauser wrote in an e-mail to one research assistant. "there were no inconsistencies! let me repeat what happened. i coded everything. then [a research assistant] coded all the trials highlighted in yellow. we only had one trial that didn't agree. i then mistakenly told [another research assistant] to look at column B when he should have looked at column D. ... we need to resolve this because i am not sure why we are going in circles."

The research assistant who analyzed the data and the graduate student decided to review the tapes themselves, without Mr. Hauser's permission, the document says. They each coded the results independently. Their findings concurred with the conclusion that the experiment had failed: The monkeys didn't appear to react to the change in patterns.

They then reviewed Mr. Hauser's coding and, according to the research assistant's statement, discovered that what he had written down bore little relation to what they had actually observed on the videotapes. He would, for instance, mark that a monkey had turned its head when the monkey didn't so much as flinch. It wasn't simply a case of differing interpretations, they believed: His data were just completely wrong.

As word of the problem with the experiment spread, several other lab members revealed they had had similar run-ins with Mr. Hauser, the former research assistant says. This wasn't the first time something like this had happened. There was, several researchers in the lab believed, a pattern in which Mr. Hauser reported false data and then insisted that it be used.

They brought their evidence to the university's ombudsman and, later, to the dean's office. This set in motion an investigation that would lead to Mr. Hauser's lab being raided by the university in the fall of 2007 to collect evidence. It wasn't until this year, however, that the investigation was completed. It found problems with at least three papers. Because Mr. Hauser has received federal grant money, the report has most likely been turned over to the Office of Research Integrity at the U.S. Department of Health and Human Services.

The research that was the catalyst for the inquiry ended up being tabled, but only after additional problems were found with the data. In a statement to Harvard officials in 2007, the research assistant who instigated what became a revolt among junior members of the lab outlined his larger concerns: "The most disconcerting part of the whole experience to me was the feeling that Marc was using his position of authority to force us to accept sloppy (at best) science."

Update 3:47 p.m., August 20: A letter from Michael D. Smith, dean of the Faculty of Arts and Sciences at Harvard University, confirms allegations against Hauser, saying, "it is with great sadness that I confirm that Professor Marc Hauser was found solely responsible, after a thorough investigation by a faculty investigating committee, for eight instances of scientific misconduct under FAS standards." To read the full text of the letter, click here.

Comments

1. nordicexpat - August 19, 2010 at 03:54 am

My hope is that this episode will lead to a general practice where raw data is published along with the results.

2. latino - August 19, 2010 at 06:13 am

poor monkeys...

3. gringo_gus - August 19, 2010 at 07:21 am

"In fact, they looked at the speaker more often when the pattern was the same. In other words, the experiment was a bust."

There was a comic strip in the UK (the Cloggies) in the 70's. A subplot involved a bloke with alleged paranormal powers. To his objection, he was asked to predict the outcomes of 1000 coin tosses before each happened. He got it wrong. 100% of the time. He was pictured running off grinning as the experimenters realized that maybe their experiment wasn't a bust after all....

4. coahuiltejano - August 19, 2010 at 07:26 am

Harvard is being taciturn? No way! You mean in the same way that they were taciturn about the plagiarism of Alan Dershowitz?

5. mafjazz - August 19, 2010 at 07:43 am

What is the thesis of his new book? That we are designed by nature to have a tendency to be drawn toward the dark side? If so, is his new book more of an autobiography?

6. optimysticynic - August 19, 2010 at 08:04 am

What is extraordinary here is the courage and integrity of the students in the lab. Not all Harvard faculty may have honor, but apparently the students do.

7. richardtaborgreene - August 19, 2010 at 08:52 am

We all enjoy misfortunes of others in the indirect way of being glad this particular error was not done by us. Loyalty to truth beyond nation, grant, king, media, celebrity boat, was the call of those first scientific societies in Europe. Our age barely remembers truth and our media show it little and grudging respect. I for one believe we all or our children will pay an enormous price for disrespecting truth as much as we collectively do. I am sad about that, and what little I have done has not moved the ball much.

In THAT context, these research assistants have been loyal to truth and moved science a step forward, in our venal dirty media-drenched age. I am proud of them and of their insistence that truth be output from their personal involvements, lab, and work. Good job whoever hired and trained them. Good job for Harvard and us all they did by pushing for truth.

I am not particularly angry at this offending professor. We all are utterly drenched in a culture of personal ambition, greed, getting ahead, and lives given seriously to nothing beyond self and lifestyle. It is not surprising that a huge bunch of monkeys (human type) raised in such a disgusting stew of cultural tripe stray from important directions all the time. So though people like me may want to feel superior to him and gloat over the his-ness of all this and the lack of my-ness in it---finally, he gave in to a huge omnipresent nasty culture we all generate, and we all collectively have more than a little causal influence on his straying. Feeling superior to other people is a very poor way to correct system faults in the end.

8. iris411 - August 19, 2010 at 08:53 am

What concerns me the most is whether the current peer-review system is adequate to keep academic bullying at bay and to keep out such dishonesty.
In fields such as psychology and biology, at least one has the raw data to begin with. When it comes to the humanities ... how do we tell... where can we go for evidence...

9. aar8413 - August 19, 2010 at 09:08 am

So, a monkey looking at a speaker is taken as evidence that he noticed a change in tone?
What if he thought, "I wish they'd turn that damn thing off. It never changes."

10. rightwingprofessor - August 19, 2010 at 09:16 am

Richard,
How can you not be angry?! This professor is a crook and he's enjoying the tenured life at Harvard University. He should be in jail for committing fraud to secure federal research dollars that could be spent on someone who actually wants to pursue an honest research agenda.

11. cwinton - August 19, 2010 at 09:32 am

As reported, this is a simple case for which most institutions would have sacked the charlatan and taken affirmative steps at damage control, rather than circling the wagons as Harvard has done. I mean, after all, what we have is a self-proclaimed researcher who decided on the conclusions, fabricated the data, wrote it up and published it (who would reject a paper from someone at Harvard?), and used the Harvard imprimatur to obtain grant money to do more of the same. Reputations are hard won and easily lost. I think Harvard itself needs to reflect on that.

12. ksledge - August 19, 2010 at 10:01 am

I feel so bad for the students and mentees who have gone through his lab over the years. Bravo to those who organized this revolt against him even when it probably seemed politically expensive for them to do so at the time.

13. swish - August 19, 2010 at 10:09 am

Not saying this is so, but couldn't a couple of research assistants with a grudge set up this scenario and doctor the videorecordings? The fact that the deception would have been so blatant -- and that Hauser would have been so openly courting disaster by doing it right in front of assistants -- gives me at least a moment's pause.

If the experiment is repeated by unrelated researchers and the monkeys consistently behave as Hauser described, that'd be evidence in support of him.

14. worddancer - August 19, 2010 at 10:27 am

This looks like the perfect storm: the professor is a bully; the professor possesses bad judgment (even if he is not being patently dishonest, his brand of wishful thinking undermines any claim he can make to be an honest and accurate observer), and the university does not institute a prompt and transparent investigation.

And then we have the obvious and predictable consequences: students leave the field, and the general public regards professors (en masse) as charlatans and bullies.

As someone who has been in academia for almost 30 years--and as someone who cares a lot about what credence should be assigned to claims about the evolutionary basis of human morality--I am very, very chagrined by this debacle. I call upon Harvard to step up NOW and do the right thing, rather than continue to hunker down and protect the professor even when there are such patently high costs to the well-being of his graduate students, the field, and academia writ large.

And I profoundly hope that other readers of the CHE will do what I have done: write to the President of Harvard and the APA calling for a swift and responsible investigation and resolution.

15. dcoffice10 - August 19, 2010 at 10:43 am

@ Richard, if I read your comment correctly you seem to say that the professor has little personal responsibility but society and our culture are to blame? How is it that we as a society always look to blame someone or something else instead of taking personal responsibility?

Shame on this professor for abusing his position of power; he is supposed to be a leader in his field. And while you seem to think science has taken a step forward with this case, I would say one step forward and about 20 steps backwards.

16. amnirov - August 19, 2010 at 10:58 am

Seems like a fairly crappy experiment one way or another.

17. rosieredfield - August 19, 2010 at 11:05 am

Good scientific practice would require that all of the people doing the coding be 'blinded' ('earplugged'?) to the sounds the monkeys were hearing (e.g. viewing the recording only with the sound turned off). And because there might be other visual cues in the recording, the coding ideally should be done by people who weren't involved in carrying out the experiment.

All scientists should know that we can't completely trust our judgement when evaluating experiments that test our own hypotheses.

18. jrscholar - August 19, 2010 at 11:08 am

Bravo for the grad students for challenging the prof. And as to iris411's question about where we go in the humanities, this is why historians demand the use of footnotes. If someone questions our work, they can always follow the document trail. It was people doing exactly this that led to the Michael Bellesiles case, among others.

19. gahnett - August 19, 2010 at 11:11 am

dcoffice10:

One way that a culture can push an individual into bullying is through the continuation of standards that don't necessarily reflect the standards that should be set. This happens in biology all the time.

Consider the use of model organisms. Genetic/cellular analysis is much more feasible using a system like flies or worms, yet we have to use mice or related animals if we want a higher probability of understanding the basis of vertebrate problems. But mice take a lot longer to breed and the tissue examination is more cumbersome. So, to ensure the continuation of publication and such, you publish based on the accepted standards of the mouse group, but fly and worm people may object to the lack of cell resolution or whatever.

So, a higher mouse official would be complicit by not forcing the required standard of examination at the cell level on students who desire to learn at a higher resolution.

There is no "academic dishonesty" here but it does show an example of how an established culture can lead to bullying and the inherent, associated complexity of issues.

20. 11122741 - August 19, 2010 at 11:11 am

It's not so much that this professor cheated but that cheating is so prevalent today, particularly among senior professors and in psychology. Recall Cyril Burt's 20 years of fudging IQ data. Makes one wonder about other behaviorist data that has come out of Harvard in the past.

21. andyj - August 19, 2010 at 11:16 am

An experiment that fails to produce the hypothesized results is not a failed experiment, as suggested repeatedly in the article. The scientific method is about discriminating truth - something that really does exist in the sciences, however elusive. If the students are right, as they seem to be, Professor Hauser's experiment failed in its reporting only, not in the pattern of actual findings. On the face of it, it is not obvious that Harvard is covering things up.... they may be, but due process would produce the same "observed results" regarding their handling of the case as would a cover-up. Let the truth emerge from diligent and objective inquiry, both in science and social justice.

22. jr4040 - August 19, 2010 at 11:47 am

This sounds familiar; does anyone remember Eric T. Poehlman, who received a 1-year FEDERAL jail sentence for falsifying data regarding menopausal women in a $542,000 NIH grant? Hormones were given to women based on the data! He pled guilty and blamed it on the pressure of grant (external) funding... But he already had the funding and could not follow through with the research. He upset one of his grad students who questioned his data, so the student went right to the dean. So many researchers do not understand how powerful grad assistants/students are and need to give them more respect. If you have a grad student who is out to get you, then it's like walking on egg shells: don't slip up or they will report you. Even if you are not falsifying data.

http://ori.dhhs.gov/misconduct/cases/press_release_poehlman.shtml

This guy falsified data in a government grant as Poehlman did, he stole from the government, he needs to face federal charges and jail time. Especially if he made up some of the data that he said was "collected" using grant money.

Does anyone remember Jan Hendrik Schon? Same deal, no federal money though. Lost his PhD, awards, and credibility, but faced NO criminal charges because everything was funded by his employer Bell Labs.

http://en.wikipedia.org/wiki/sch%C3%B6n_scandal

Unfortunately, the only way to stop this is to conduct GCP (Good Clinical Practice) experiments and have an auditor verify the work. This is required for many NIH grants now, especially the ones with pharmaceuticals. Other publications are "at the author's word". Even if the raw data were submitted and reviewed by the editors, they could still be manipulated. Submitting and publishing raw data will do NOTHING to stop this!

23. jr4040 - August 19, 2010 at 12:00 pm

Good call, rosieredfield! It should have been blinded. That's what we call "scientific misconduct," and it falls back on Harvard. There have been A LOT of questionable methods and findings from some of the labs at Harvard, and I wonder how many of the articles get published.

Here is a proposed mentality. "I am at Harvard, I don't need to have solid and ethical methods because my research will be published anyway... It is from Harvard"

I don't think I am wrong here, and the fact that articles get published because of where they are from and not because of what they did or found is shameful to the academic community... Like everything else, it all comes down to politics, and Harvard does not want to lose its reputation and the ability to publish anywhere. My bet is they will keep this as low-key as possible.

Personally, I am happy that this went down at Harvard, because it should start opening the minds of reviewers and editors that not everything from Harvard (or other prestigious research programs) is gold.

24. 12080243 - August 19, 2010 at 01:06 pm

There are many differences between Harvard and the University of Southern Mississippi (USM). One is the integrity of its faculty and administrators. USM punishes faculty who report misconduct. See, "University and AACSB Diversity," published in the proceedings of the American Accounting Association Annual 2010 Meeting (http://commons.aaahq.org/post/3d4bfd4201), for a discussion of how USM punished a professor who brought to the attention of USM faculty and administrators, and then to the Association to Advance Collegiate Schools of Business (AACSB), faculty plagiarism. As a matter of fact, the College of Business at USM still publicly reports one of the plagiarized documents. The Academic Integrity Policy is on its website and was taken from Syracuse University "without proper citation"--a term used in an email by one of the plagiarizers we got through a freedom of information request. Note that Syracuse provided extensive citations for the sources of ideas and words of its Academic Integrity Policy but when USM's College of Business copied Syracuse's Academic Integrity Policy, its faculty and administrators did not copy Syracuse's citation list nor give credit to Syracuse. Since USM's Academic Integrity Policy was prepared for the reaccreditation process, the AACSB was among those who received the plagiarized policy. The AACSB was, therefore, advised of the plagiarism but it decided that the plagiarized Academic Integrity Policy did not violate AACSB standards. The AACSB refused to explain why.

Chauncey M. DePree, Jr., DBA
Professor
School of Accountancy
College of Business
University of Southern Mississippi
m.depree@usm.edu

25. drhypersonic - August 19, 2010 at 01:26 pm

Assuming the article is correct, it sounds like a pretty straightforward case of data-fudging. The problem with this goes well beyond the taxpayer funding lost on this experiment: there is also the impact that misleading results could have on the research of others, and the waste of resources that, at this time, are increasingly hard to come by. Kudos to the students who had the courage to stand up.

26. bekka_alice - August 19, 2010 at 02:04 pm

Totally agreed with optimysticynic - these students deserve gold research stars for their integrity. It says one has left psychology; I sincerely hope that's not because of this.

27. alan_kors - August 19, 2010 at 03:36 pm

Without prejudging the final decision, I'm struck by two things:

1. The integrity and courage of the graduate students (and the seeming rarity of these essential academic traits), not least in the humanities and social sciences;

2. The slowness of the process (three years since the document cited here was presented to Harvard?), which itself threatens negative sanctions on graduate students who appear to deserve everyone's gratitude.

Francis Bacon once wrote that a "failed" experiment sheds great light... if it is analyzed as "failed." That seems doubly true here.

28. wassall - August 19, 2010 at 03:40 pm

Chuck Berry said it best: Too Much Monkey Business!

29. tom__k - August 19, 2010 at 03:46 pm

"The document is the statement he gave to Harvard investigators in 2007."

I find it rather strange that the students/research assistants who claim to be involved in this case made a statement in 2007 for a paper that was published in 2002, and the experiments must have been done even earlier. Why would they wait at least 5 or 6 years to come forward? Surely, if this were a systematic pattern of misconduct, there would be more recent cases. Also, has anybody ever heard about a research assistant staying in a lab for 5 or 6 years? Who knows what Hauser really did, but this doesn't make any sense at all.

30. dboyles - August 19, 2010 at 03:48 pm

It's a wonder the monkeys were not fired for non-compliance.

31. dgle6511 - August 19, 2010 at 03:56 pm

@tom__k: This particular experiment wasn't related to the 2002 paper. As the last paragraph of the story says, the experiment was tabled. Presumably the research assistants' 2007 statement catalyzed an examination of Prof. Hauser's work going back several years.

32. tom__k - August 19, 2010 at 04:08 pm

@dgle6511

Shall we take this as evidence that there wasn't any widespread misconduct after all? If the investigation went back several years, at least to 2001 or 2002, surely they would have found more problems if this had been a widespread problem.

Also, the article mentions email exchanges between Hauser and the research assistant in question about that very paper, if I understand correctly. So a grad student and a research assistant reported in 2007 about misconduct that allegedly occurred in 2002 or before. That sounds a bit fishy.

33. 11182967 - August 19, 2010 at 04:13 pm

#17 is on the right track. I'm not an experimental scientist, but it seems obvious that in any experiment--but especially one whose results depend to such a great extent on human observation--there should be at least two independent observations (of the video, in this case) by separate individuals, neither of whom should be the creator of the experiment. Further, the evidence--the video--should be readily available to anyone who wished to assess it, and the actual experiment should be readily repeatable.

Hauser's experimental procedure seems so unscientific as to make it almost impossible to judge whether the claimed data are faked, the result of poor observation (by one observer or the other), or just the result of sloppy experimentation. This is the sort of work which, had I seen the equivalent in a student research paper, I would have returned with the comment that the work was not yet ready to be graded. Why would a responsible journal publish this stuff in the first place?

34. kyr3zei06 - August 19, 2010 at 04:42 pm

@#17 and #33

I assume that you haven't actually read any of the papers in question. Blind coding and checking interobserver reliability is pretty standard, in Hauser's lab as well as in any other animal lab as can be seen from the methods sections. That said, behavioral experiments (as well as neuroimaging/neurophysiological experiments) are inherently messy, especially with animals, but everybody knows this, and so far nobody has come up with better methods.

35. stockholm - August 19, 2010 at 05:06 pm

There is some confusion here. The study mentioned in this article is not referencing the retracted 2002 study. It seems to be a follow-up study with different stimuli and a different species.

36. tombartlett - August 19, 2010 at 05:17 pm

@stockholm Right. The data from the experiment mentioned in the article was never published (it was "tabled," as the last graf of the article says). It was similar to the 2002 Cognition paper, but it involved rhesus monkeys rather than cotton-top tamarins. So the research assistant's statement was about an experiment that had just recently happened -- not one that took place 5 years before.

37. cwinton - August 19, 2010 at 05:55 pm

I think we can infer that the retraction of the 2002 study indicates a pattern of similar misconduct dating back at least 10 years. Articles elsewhere in the press indicate that a great deal of Mr. Hauser's "research" is similarly problematic. It is very generous to describe his methods as sloppy since they practically reek of scientific misconduct. I feel very sorry for those who have assumed that because of his Harvard position, the integrity of his published results is beyond challenge and so have cited his assertions in their own work. Harvard cannot duck their own culpability, since they deliberately chose to drag the investigation out, literally for years, and have yet to be forthcoming about their findings. It also appears a few publishers need to reconsider how they review submissions such as Mr. Hauser's for publication.

38. skulkofsky - August 19, 2010 at 06:26 pm

Even if he did not knowingly falsify data (which I suspect is probably the case), what is described represents poor research methods. I use coding often in my research, and my standard practice is to have two coders independently code at least 25% of the data. There are a number of statistics that can be calculated to determine whether the coders are reliable; I am curious if those articles reported such statistics. The coders should be blind both to the hypothesis and, if possible, to any experimental condition. The sample to be coded by both coders should be chosen randomly, and not cherry-picked to be the "best" or "easiest" cases. So, at a minimum, Dr. Hauser is guilty of very sloppy science.
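
For readers unfamiliar with the reliability statistics mentioned in the comment above, here is a minimal sketch of one common measure, Cohen's kappa, which corrects the raw agreement between two coders for agreement expected by chance. The coder names, trial codes, and the cohens_kappa helper are hypothetical, written in Python purely for illustration; none of it is drawn from the Hauser lab's actual data.

# Minimal sketch: Cohen's kappa for two coders' trial-by-trial codes (hypothetical data).
def cohens_kappa(codes_a, codes_b):
    """Agreement between two coders, corrected for chance agreement."""
    assert len(codes_a) == len(codes_b) and codes_a, "need equal-length, non-empty code lists"
    n = len(codes_a)
    # Proportion of trials on which both coders recorded the same label.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance, given each coder's label frequencies.
    labels = set(codes_a) | set(codes_b)
    expected = sum((codes_a.count(lab) / n) * (codes_b.count(lab) / n) for lab in labels)
    if expected == 1.0:  # both coders used one identical label on every trial
        return 1.0
    return (observed - expected) / (1 - expected)

coder_1 = ["look", "look", "no", "no", "look", "no", "no", "look"]
coder_2 = ["look", "no",   "no", "no", "look", "no", "no", "look"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.75; values near 1 indicate strong agreement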

39. alleyoxenfree - August 19, 2010 at 09:24 pm

The new "star" system of professors demands that those profs be relentless stars. How else can donors in a celebrity culture be persuaded to donate to starred chair positions and buildings? As a culture, we've lost respect for learning and - with institutions like Harvard leading the way - have wholly given in to the notion of sexy, star professors (or journal editors, or presidents).

Meanwhile, those doing excellent, serious teaching and academic research, some of which produces negative results, some of which takes a good long time to pan out, some of which is basic research without immediate industry application - all those people are considered not cool enough for school.

This is the natural end result. Elevate people who play fast and loose but who are charming, and you get bad scholars who are "popular."

40. legalgibbon - August 20, 2010 at 12:17 am

"The method has been used in experiments on primates and human infants. Mr. Hauser has long worked on studies that seemed to show that primates, like rhesus monkeys or cotton-top tamarins, can recognize patterns as well as human infants do."

I just wanted to note that humans *are* primates. Perhaps a more accurate way of making these statements would be, "The method has been used in experiments on humans and other primates. . . . studies that show that non-human primates, like rhesus monkeys or cotton-top tamarins. . . ."

--legalgibbon

41. rbh_iii - August 20, 2010 at 01:10 am

There's one potential explanation of the differences between coders that relies on sloppy records rather than malice, and that is an asynchrony of some sort in the trial/event numbers across the several coders. That's not fraud, but rather incompetence.

However, that's a tough hypothesis to defend in light of the several retractions already done. That'd be an awful lot of sloppy work.

42. sallybogacz82 - August 20, 2010 at 02:09 am

I think this sorry tale shows how important it is for independent labs to replicate experiments. No one likes to do this, because everyone wants to do new and exciting research, and funding is directed towards doing new work. But I think it should be made part of the education of every doctoral student to take an experiment, such as Marc Hauser's, and attempt to replicate it. It would provide invaluable training for the graduate students and help to keep researchers honest.

43. 3224243 - August 20, 2010 at 07:26 am

@#24 - yeah, we've read your rant a number of times now. Stop posting already.

44. 11182967 - August 20, 2010 at 09:07 am

#34: Your point is well taken--this sort of experimentation is obviously less precise and controllable than what takes place in some other kinds of experimental context. And even if you're reading cloudy particle trails after the fact at CERN, accurate observation certainly takes training--the observers need to be well calibrated so they're looking for the same signals and recording them in the same manner--just as in holistic grading according to a rubric. But I've always been taught that when there are significant discrepancies there need to be additional (trained) observers to resolve inconsistencies or, if the inconsistencies persist, that the data must be scrapped--and that, in any case, the experiment must be publicly visible, if not repeatable. Maybe fMRI will provide some additional experimental opportunities for this sort of experimentation--although there will remain issues of observation and interpretation.

45. trendisnotdestiny - August 20, 2010 at 09:15 am

@ to all readers

This reminds me of David Callahan's book about how we are living in such a deeply embedded cheating culture (RichardTaborGreene's comment touches on this as well):

Sports: Steroids, Gambling, Performance Enhancers, NBA Referees, College Athletic Departments, Academic Grade Fixing and NFL Surveillance Tapes...

Business: Madoff, Enron, Goldman Sachs, Tyco, Blackwater-Xe, Phillip Morris/Altria, Arthur Anderson, Merrill Lynch, Massey Mines, BP, Exxon, Healthsouth, AIG, Lehman, Dow, Credit Suisse, Lockheed Martin, Boeing, Halliburton

And many others http://www.cheatingculture.com/aboutbook.htm

Journalism
Electronic Piracy
Academia
Law
Medicine
Pharma
Taxes
Workplace Theft

I am not trying to sell you on this book, and I have no relation to him; he is simply someone who has compiled an interesting narrative in the middle of the largest de-regulatory period in our history... Eroded rules and governance lead to cheating (see the S&L Crisis for how profitable this could be, or payday lenders)....

46. janacat - August 20, 2010 at 09:39 am

It is always fascinating, the speed with which academics -- who presumably rely on careful research when coming to conclusions in their own work -- condemn universities for not acting RIGHT NOW when (unproven) allegations arise against a student or faculty member. When it's at Harvard, it's a feeding frenzy. Anybody remember how the faculty at Duke shamed itself into demanding that a group of students and a coach be punished IMMEDIATELY for something no one knew they had done?

Marc Hauser may have screwed up. He may not have. Harvard is probably trying to find out, rather than playing to the cheap seats.

47. andyj - August 20, 2010 at 11:06 am

Thank you, #46. I can only hope that the rush-to-judgment postings which comprise the majority of comments here, as in the public response of most to the false accusations at Duke, do not represent the majority of those in the academy. If they do, our students are poorly served. Mr. Hauser may be guilty of misconduct, sloppiness, perceptual bias, or outright malicious fraud. We don't know. Thanks for the appeal to prudence, although I predict that it will be largely ignored or dissed in postings to follow.

48. _perplexed_ - August 20, 2010 at 11:57 am

I'm all for prudence, but Harvard officials were informed in 2007, and we are just learning of it now; and what we are being told is awful sketchy. Isn't it about time to release detailed information or at least explain what the holdup is?

49. smwoodson - August 20, 2010 at 05:03 pm

amnirov & rosieredfield have it right. Regardless of anything else, this is not good experimental design. Why was the experiment funded in the first place? And doesn't that say a whole lot about academe?

50. rgelman - August 20, 2010 at 05:27 pm

I am surprised that there is a lack of comment as to why the case is so serious. Fundamentally, the function of higher education is to share, pass on, and create knowledge - not for an institution or an individual to become famous (nice if they do for the right reasons, and then they deserve accolades).
Our institutions are charged with the task of guaranteeing that their participants take this as a fundamental principle for being part of this particular kind of group. This fundamental principle should govern our day-to-day activities.

We therefore must assume that students and non-students alike trust faculty, staff, and administrators to be honest about what is in the record, what is added to the record, and about access to how the record is and was created. It follows that plagiarism, cheating, fabrication, and failure to replicate, or to make it possible to do so, all count as moral violations. Importantly, moral violations break trust and often elicit anger. In the case in hand, there is enough in the record to make it clear that we are dealing with a moral violation. Worse yet, by someone who writes about morality!

51. jbeez - August 20, 2010 at 07:08 pm

Did everyone just gloss over gringo_gus' comment (#3)?

Perhaps the results are "wrong" because Hauser was looking for the "wrong" response!

Hauser's experiment expected the monkeys to notice CHANGES in the patterns, when, as the article points out, the most frequent response from the monkeys was to notice SIMILARITIES in patterns. They turned their head when the patterns were repeated. Hauser expected them to turn their heads when the patterns differed.

I'm beginning to wonder if this "controversy" is in error, and if Harvard is missing the obvious.

52. jirka - August 21, 2010 at 07:26 am

andyj: An experiment which does not confirm the hypothesis is not a failed experiment, but it is still less desirable for the researcher. If the monkeys did look more often at the speaker when the sound pattern changed, it would be proof that they can discriminate the patterns. A negative result does not prove that the monkeys are unable to discriminate sound patterns, since it can also mean that turning the head is not a valid measure of discrimination. It is not an interesting result and would have less chance of publication.
Btw., Dr. Hauser showed a regrettable lack of scientific curiosity:
If he had taken seriously the indication that the monkeys maybe look more often at a loudspeaker emitting a familiar sound, he might have obtained some interesting results.

53. marcomauas - August 21, 2010 at 11:51 am

What did he do? Why is he in trouble?
He probably didn't take too seriously what Freud discovered about morals.
It is called the Superego.
It is like Coca-Cola: the more you think and try to be a moral--moralist--person, the thirstier it gets. It is a positive feedback caused by the fact that morals have a connection with drive.
Freud didn't trust moralists, because he knew, from his clinic, that the farther you are from your desires/flesh/drive, the more your Superego is connected with them. Hypocrisy has a corporeal root, and it turns against your ego, supposed to be clean of any bad desire.
If you research morals, don't forget the Superego--your Superego. It means you are part of the "landscape", a part invisible to your moral eyes.

54. aserieux - August 21, 2010 at 11:56 am

Harvard wanted Dr. Hauser. Dr. Hauser wanted to shine for Harvard. He actually added a bit of luster to Harvard's image. If you don't get caught, all stakeholders are happy, right? This attitude seems to be ingrained in American culture. Think about it. Is this at all surprising?

55. es_jr - August 21, 2010 at 02:20 pm

This reminded me of when, back in the day, I was asked to write a piece of software for the Greek Olympic Boxing committee, and how (Olympic) boxing matches are scored: on each of the four sides of the ring there is a ref with a box with a button. If boxer A lands a punch and 3 out of 4 refs hit the button, boxer A scores a point. Apart from the fact that one ref may be a myopic octogenarian and another may have his eye on the cleavage around him instead of the action in the ring, the fact remains that in some cases 'replaying' the match with 4 different refs on the buttons may and will produce a different result.
Hardly science... unless one observes a clear statistical threshold.

56. khesriram - August 21, 2010 at 03:16 pm

I admire the graduate students' commitment to pursuing the truth, at so many levels ...

57. ellenhunt - August 21, 2010 at 05:23 pm

I went to the dean with a complaint about falsified data. The dean buried it. It appeared that he tried to use a rule on followup of informal versus formal complaints to let it die. I went to the ombudsman. We had a "meeting" and I was introduced to the "investigator". A year later I checked in and the ombudsman "lost the paperwork". It took them months to find it.

The report came back and was a whitewash. I documented that not one of the 10 witnesses I had cited in my detailed report had been consulted. Only the culprits and a chair who was buddies with them gave statements.

I wrote this up and appealed to the Chancellor. The Chancellor actually wrote me back saying that what had been done was appropriate. He said that it was not necessary to check with anybody else.

The chair went after my thesis advisor. I managed to graduate by a conspiracy worthy of an Agatha Christie novel between several professors, who formed a committee and approved my thesis, which was made up of several first-author publications of mine.

I am, to this day, revolted, disgusted, completely appalled at the Kafkaesque nightmare.

58. contrarianpundit - August 21, 2010 at 07:44 pm

Thank you, Professor Hauser, for proving that tamarin monkeys are more sophisticated than previously thought: they're able to fool Harvard professors into believing they have richer cognitive capacities than they actually do.

59. andrewthesmart1 - August 21, 2010 at 11:43 pm

No one should be surprised that Harvard dragged its feet about investigating this. Marc Hauser is a famous guy and probably generates a lot of value for the Harvard brand. Elite research universities are rackets. The approach taken by Harvard seems to be to try to sweep it under the rug until CRAP everybody is talking about it so we have to acknowledge it. Then Harvard has to take a sanctimonious stance pointing out its commitment to keeping the scientific record correct. Hauser may be guilty, and you don't get to his position without a healthy (and/or pathological) ego. And I do agree with the comments here that chalk some of this up to the insidious culture of ambition and achievement that has ruined science. In my opinion, this goes for any science, "hard" or "soft". Prestige, tenure, fame, wealth, "look how smart I am" - are these attributes anyone associates with science? Yet it's the pursuit of these things rather than knowledge that dominates academic culture. To me, this is a case where a corrupt system has to make an example of someone who violated the professed rules of the system in order to prove to everyone and itself that the system is not corrupt. The irony is that calling this "Hausergate" is more accurate than people realize, because in the original Watergate Richard Nixon was made out to be the evil one to divert attention away from the fact that the entire system was corrupt.

60. rleesmith18 - August 22, 2010 at 09:20 am

Are there any studies of the pervasiveness of this kind of cheating? It seems to be cropping up everywhere.

61. klymkowsky - August 22, 2010 at 12:26 pm

Is there any serious concern that this entire area of research is "too soft" to support serious conclusions about the mechanisms or evolution of consciousness?

62. nie_wieder - August 22, 2010 at 12:56 pm

A comment on Marc Hauser's status as a "popular" professor: I teach undergraduate and graduate students in a humanities department. Our liberal-arts faculty receives increasing criticism from the administration and the small but influential "education" program for being too concerned with research and course content and insufficiently concerned with "education" and "caring" about students. I was therefore interested to read the comments from Harvard students about Hauser on the "RateMyProfessors.com" website. He received a 4.2 for "overall quality" in comparison to the "professor average" for Harvard of 2.95; he also received the red-pepper icon signifying that he is considered "hot". One student wrote, "it is so rare to have a professor at harvard who engages with undergrads, and spends time with them, both working on research and concerned with education. hauser is a pleasure to be with." Of the 25 ratings, only one student noted his "lack of concern for rigorous empirical methods"; that comment was criticized as "conceited" by a student who continued, "Hauser is one of the few profs at harvard who is completely open to hanging out with students, is very approachable, and does freshman advising. not many like him at harvard." Given the growing importance attached to institutional evaluations by students of faculty and the growth of customer-oriented business models in academia, perhaps the Hauser affair deserves a place in discussions of the purpose of institutions of higher learning.

63. rogerthat - August 22, 2010 at 01:11 pm

This stuff goes on every day in every institution and profession.

I'm an advertising copywriter who has worked in a number of countries and has seen certain of my kind lift their careers to great heights entirely on BS, stolen ideas and scam ads (unusually creative ad campaigns that either never actually ran, or did run but the media buy was paid for by the ad agency/staffers that produced it, solely to win awards; most awards contests state that all submitted work must have actually run in order to qualify [clients tend to resist running highly creative ads, as evidenced by the wonderful work we're all subjected to daily]). If you win a bunch of international awards as an ad 'creative' you are guaranteed big money, thousands of sexy young admirers, paid speaking gigs at important industry events and rapid promotions. There are lots of such practitioners in our industry, as well as the coat-tailers, those who manage to get their names associated with award winning-work to which they've contributed nothing.

I have a friend who's finishing his ophthalmology degree and was hired on a year-long contract by an LA hospital. He told me of incompetent surgeons with whom he operated who buggered up virtually everything they touched. But because of their power to limit or destroy the careers of young doctors and others, they maintained their positions, despite inflicting regular and often irreparable harm on patients. He didn't dare report these persons for fear of being blackballed within the medical system, which is evidently very good at looking after its own.

The spookiest aspect of the whole Dishonesty-Incompetence Complex is the habit universities have of giving good grades to illiterates, slackers and cheaters because their parents are paying $200k for the punk's education and demand top results (not of their kids, but of the institutions). Those are the twits who end up out there designing buildings, bridges, nuclear facilities or running critical institutions on spurious qualifications and further infecting with their corruptions the entire system that propelled them to their positions of influence.

64. freeexpress - August 22, 2010 at 05:58 pm

Reminds me of the Andrei Shleifer affair. If you want more about how Harvard coddles its errant faculty, check this out:

http://www.thecrimson.com/article/2006/2/10/tawdry-shleifer-affair-stokes-faculty-anger/

"Harvard" and "corruption" seem to be converging in meaning.

65. arrive2__net - August 23, 2010 at 02:41 am

I think that jirka made a good point that getting at the truth is what research is about, "not supporting the hypothesis" is not failure, if indeed the data does not uphold the hypothesis. On the other hand if verifying the hypothesis is necessary to continue or justify a grant or publication, clearly the temptation to falsify will be there. If a researcher who is pursuing a research line falsifies some findings, the researcher really does themself a diservice since the need to get findings that aren't there will continue and get stronger.

If Harvard does have a valid case against Hauser, then I admire their integrity. I would trust an institution more if they have a history of enforcement. Still, it is good to see that Hauser does have the chance to speak out. You should hear from both sides.

Bernard Schuster
Arrive2.net

66. arrive2__net - August 23, 2010 at 02:44 am

... make that "disservice" and "themselves"..

67. cosmos1138 - August 23, 2010 at 07:51 am

rosieredfield - wow, thanks for pointing out good methodology - as an educational psychologist who switched from the hard sciences (immunology), it's nice to see good clear thinking in the humanities ...BRAVO!!

68. yossarian - August 23, 2010 at 12:08 pm

Having been at or associated with Harvard as a research scientist for almost 20 years, I can state with assurance that Harvard took the action that was in Harvard's self-interest as they and their lawyers very carefully calculated it. Ethics or integrity (scientific, academic or individual) played no role in their calculations (except insofar as the appearance of such values would be taken into account in their overall calculations). I've NEVER seen anyone connected with the Harvard administration "do the right thing" if their calculations indicated that it would cost them ANYTHING. As several commentators have pointed out above: why should we be surprised, given that we live in a totally amoral - at best - winner-takes-all society. I'd also like to add my recommendation of Callahan's book "The Cheating Culture," mentioned above. I remember his interviewing a guy who was widely considered the best American bicyclist ever who didn't dope himself. I can't remember his name, which is sad given the doping scandals with Armstrong and Landis - their names I know.
