
Monday, November 21, 2011

Scientific integrity

That a society trusts its scientific community to be truthful in all its manifestations is a given in any civilised society. We are all aware of scandals involving sometimes young and sometimes prominent scientists being disgraced for having falsified data in peer-reviewed publications in scholarly journals. That is the newsworthy end of this thorny issue, but beneath that headline-catching level lie some equally challenging issues of scientific integrity. The first of these relates to the belief of many MEPs that scientific advisers to the European Food Safety Authority (EFSA) should be, as George Smiley would have said, "Persil grade": clean as the driven snow, with no links to the food industry. That pressure has led to several high-level confrontations between EFSA and the European Parliament, plus a plethora of NGOs in the food area. Now, an expert in the biology of the lactating yak is unlikely to be considered for an EFSA panel, but someone with a life's investment in public health nutrition research is likely to be very attractive. The problem is that the lactating yak expert attracts zero interest from the food industry, while the lifetime devotee of public health nutrition cannot be free of food industry links. If their research is world class, everyone will want to talk with them, pay their travel to give talks, fund or co-fund their research and generally get to know such an expert. And if the EU, through its competitive research programme, funds this expert, then for sure there will be industry links, because such links are an absolute requirement of that funding. So the MEPs cannot have it both ways. If they want the best, they will have to accept that both the regulator and the regulated will visit the best.

Having a link with the food industry appears to suggest to MEPs that independent scientific thinking is more likely to be compromised. However, this standard does not appear to apply to NGOs. A scientist who is an active member of an environmental NGO could equally be compromised as a member of an advisory committee of an EU institution if the topic involved GM foods. And would a strict vegan be a truly independent chair of an expert group on a nutrient strongly linked to animal-based foods, such as iron or zinc? The simple solution here is to require that such potential conflicts of interest be declared, so that anyone reading a report involving such individuals knows the background of the experts. But that doesn't count for much with MEPs. NGOs are inherently good, whether they be environmental NGOs or vegan NGOs. It is only industry that seems to matter to the guardians of scientific integrity. Seems strange to me!

Finally, we burrow down to what is ultimately the most sinister aspect of scientific integrity, namely being honest in interpreting primary ("my discovery") or secondary ("your discovery") data. In a very important paper published in the International Journal of Obesity (2009), researchers at the University of Alabama tracked the manner in which primary data (my discovery, for example) are cited in secondary data (your reported discovery, citing my original findings in support). They chose a study that examined the link between the development of obesity and sugar-sweetened beverages (SSBs). The original primary data showed no statistically significant association between SSB intake and obesity. Often, authors of papers that report "negative" data grasp at straws of positivity, and in this case a subset analysis showed a suggestion of such a link. However, the subset analysis was always a sideshow, while the main event, the true objective of the study, showed zilch evidence linking SSBs and obesity. The Alabama researchers then tracked all the studies published in English that cited the paper. Of these, 84% inaccurately reported the primary data. That is, they chose to ignore the main and "true" conclusion of the paper and focused instead on the sideshow, the non-intended analysis, which did suggest that in some sub-groups there was a possible link between SSBs and obesity. This is a small snapshot, from a single paper, of a massive systematic bias among researchers toward the interpretation of data that suits their agenda.

Just about everybody reading this will recognise this bias in all spheres of human activity. However, for science, which purports to be built on the truth, this is a major problem. If scientists select from here and there to suit their agenda, then it's "sayonara" objectivity.
