
2 January 2014

How journals like Nature, Cell and Science are damaging science

The following excellent article appeared in The Guardian on Monday 9 December 2013, and it makes a number of accurate and important points. Many people in academia would agree with it, but few would dare say so openly. I reproduce it below verbatim; please note that the emphasis is mine. Note, in particular, how the author describes the concept of "impact factor" (a gimmick!), and how he urges scientists (including administrators) to seek the truth rather than fall victim to a fake-bonus culture. Indeed, one's achievements should NOT be measured as a function of impact factor and other silly criteria (like the h-index), but as a function of one's research and scholarship. Of course, this would mean extra work for the heads of university divisions, for the committees deciding on promotions, and for the funding agencies. But, hey, learning something about the work of the person you judge ain't that bad, right?


The incentives offered by top journals distort science, just as big bonuses distort banking

[Photograph: litter in the street. The journal Science has recently retracted a high-profile paper reporting links between littering and violence. Photograph: Alamy/Janine Wiedel]

I am a scientist. Mine is a professional world that achieves great things for humanity. But it is disfigured by inappropriate incentives. The prevailing structures of personal reputation and career advancement mean the biggest rewards often follow the flashiest work, not the best. Those of us who follow these incentives are being entirely rational – I have followed them myself – but we do not always best serve our profession's interests, let alone those of humanity and society.

We all know what distorting incentives have done to finance and banking. The incentives my colleagues face are not huge bonuses, but the professional rewards that accompany publication in prestigious journals – chiefly Nature, Cell and Science.

These luxury journals are supposed to be the epitome of quality, publishing only the best research. Because funding and appointment panels often use place of publication as a proxy for quality of science, appearing in these titles often leads to grants and professorships. But the big journals' reputations are only partly warranted. While they publish many outstanding papers, they do not publish only outstanding papers. Neither are they the only publishers of outstanding research.

These journals aggressively curate their brands, in ways more conducive to selling subscriptions than to stimulating the most important research. Like fashion designers who create limited-edition handbags or suits, they know scarcity stokes demand, so they artificially restrict the number of papers they accept. The exclusive brands are then marketed with a gimmick called "impact factor" – a score for each journal, measuring the number of times its papers are cited by subsequent research. Better papers, the theory goes, are cited more often, so better journals boast higher scores. Yet it is a deeply flawed measure, pursuing which has become an end in itself – and is as damaging to science as the bonus culture is to banking.

It is common, and encouraged by many journals, for research to be judged by the impact factor of the journal that publishes it. But as a journal's score is an average, it says little about the quality of any individual piece of research. What is more, citation is sometimes, but not always, linked to quality. A paper can become highly cited because it is good science – or because it is eye-catching, provocative or wrong. Luxury-journal editors know this, so they accept papers that will make waves because they explore sexy subjects or make challenging claims. This influences the science that scientists do. It builds bubbles in fashionable fields where researchers can make the bold claims these journals want, while discouraging other important work, such as replication studies.

In extreme cases, the lure of the luxury journal can encourage the cutting of corners, and contribute to the escalating number of papers that are retracted as flawed or fraudulent. Science alone has recently retracted high-profile papers reporting cloned human embryos, links between littering and violence, and the genetic profiles of centenarians. Perhaps worse, it has not retracted claims that a microbe is able to use arsenic in its DNA instead of phosphorus, despite overwhelming scientific criticism.

There is a better way, through the new breed of open-access journals that are free for anybody to read, and have no expensive subscriptions to promote. Born on the web, they can accept all papers that meet quality standards, with no artificial caps. Many are edited by working scientists, who can assess the worth of papers without regard for citations. As I know from my editorship of eLife, an open access journal funded by the Wellcome Trust, the Howard Hughes Medical Institute and the Max Planck Society, they are publishing world-class science every week.

Funders and universities, too, have a role to play. They must tell the committees that decide on grants and positions not to judge papers by where they are published. It is the quality of the science, not the journal's brand, that matters. Most importantly of all, we scientists need to take action. Like many successful researchers, I have published in the big brands, including the papers that won me the Nobel prize for medicine, which I will be honoured to collect tomorrow. But no longer. I have now committed my lab to avoiding luxury journals, and I encourage others to do likewise.

Just as Wall Street needs to break the hold of the bonus culture, which drives risk-taking that is rational for individuals but damaging to the financial system, so science must break the tyranny of the luxury journals. The result will be better research that better serves science and society.

26 June 2011

(Some of the) funny aspects of the Academy of Athens, II

I continue with another hilarious example concerning a member of the Academy of Athens: a theoretical physicist who is known, in Greece, to a large number of ordinary people, people who have nothing to do with physics or science. I've heard his name mentioned by taxi drivers, manicurists and air hostesses, among others. He is, according to these people, the greatest scientist of all time.

His name is Dimitri Nanopoulos. But how come everyone knows him? On what basis do these people know this academician? Why is he so popular? Is he a popularizer of science? Some time ago I tried to find out what was going on. I was very surprised to discover that a mainstream Greek newspaper had devoted a full page to an advertisement for a car (a Lexus), together with a picture of the aforementioned academician. Fair enough, I thought, he's trying to make some (more) money. But then I read the following phrase below his photo:
Professor Nanopoulos has achieved international reputation. Doing research mainly in Cosmology and High Energy Physics, he is considered today one of the four greatest theoretical physicists of all times.

Just as the distinguished theoretical physicist methodically "besieges" the next scientific revolution, so does Lexus constantly seek perfection.
So, let's see: Newton, Einstein, Maxwell, and Nanopoulos. What about Richard Feynman, Freeman Dyson, Lev Landau, Henri Poincaré, ... , ... ? Well, according to the Lexus advertisement, there is no doubt. The set must contain four people. One of them is Nanopoulos.

But where does this claim come from, and what does it mean? I looked further. According to Wikipedia,
He is one of the most regularly cited researchers in the world, cited more than 35,800 times across a number of separate branches of science.
So, perhaps, the phrase "greatest number of citations" has been translated into "greatest scientist". Is that so? Does the number of citations necessarily mean greatness? Yes, says Nanopoulos.

Shortly after the advertisement appeared in several Greek newspapers, 12 eminent Greek physicists wrote in a public letter:
[Nanopoulos] knows well that such comments border on the ridiculous, provocatively insult one's intelligence, and denigrate the Greek scientific community.
Moreover, the 12 scientists asked the President of Greece to be careful about appointing such a self-promoting person to positions of responsibility, such as the presidency of the Council of Research and Technology (and others).

Nanopoulos replied by characterizing the authors of the letter as "scientists" [i.e. scientists in quotes], and mentioned that people like Al Gore also advertise various products [yes, but Gore is not a scientist]. He also said:
Regarding my achievements in the domain of science, I attach my CV as well as a comparative table of my works and citations, without comment.
In the attached table, he lists the total number of citations to the 12 other scientists' papers (26,862) and compares it to the number of citations to his own papers (31,412). Therefore [he implies], I am better than the sum of all these other "scientists".

There is another comparison he makes, and this is ridiculous. It concerns the so-called h-index:
A scientist has index h if h of [his/her] Np papers have at least h citations each, and the other (Np − h) papers have at most h citations each.
This silly measure of success was devised several years ago and is taken seriously by lazy administrators, but not by scientists. It is well known (i) that citations alone do not measure one's greatness and (ii) that it is not too hard to boost one's citations by forming alliances. Moreover, not all citations are necessarily positive (I can cite a paper for its wrong results). However, not only has the h-index (and a variety of other indices) been glorified, but a "science" has also been formed around it, the so-called Bibliometrics or Scientometrics. For instance, it is not hard to find papers looking at statistics of indices and the "mathematics" of indices. The drive to summarize one's achievements by a single number has thus provided jobs to many other people, who can now write papers on citation indices, thereby increasing their own citations!
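For concreteness, here is a minimal Python sketch of how the h-index in the definition above is computed from a list of citation counts (my own illustration with made-up numbers, not anything taken from Hirsch's paper or from Nanopoulos):

def h_index(citations):
    # Sort citation counts in decreasing order; h is the largest rank r
    # such that the r-th most-cited paper still has at least r citations.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 2, 1]))  # prints 3: three papers have at least 3 citations each

Collapsing an entire body of work into one such number is, of course, exactly what the paragraph above objects to.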

A good critique of the lunacy around the h-index and other bibliometrical concepts is the paper "Citation Statistics", by Robert Adler, John Ewing and Peter Taylor, a report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS):
The drive towards more transparency and accountability in the academic world has created a "culture of numbers" in which institutions and individuals believe that fair decisions can be reached by algorithmic evaluation of some statistical data; unable to measure quality (the ultimate goal), decision makers replace quality by numbers that they can measure. This trend calls for comment from those who professionally “deal with numbers”— mathematicians and statisticians.
To summarize:
  1. A Greek academician, D. Nanopoulos, uses the h-index as a measure of his achievements. This can be witnessed on numerous web sites, in his talks, in his Wikipedia entry, in his letter to the President of Greece, etc.
  2. His having one of the greatest numbers of citations (and a big h-index) has been [presumably] translated into and equated with his being one of the four greatest physicists of all time.
  3. The car company Lexus has used this, presumably in cooperation with Nanopoulos, to advertise its car.
Something is fundamentally wrong with all this. Perhaps it is because Nanopoulos is a professor at a horrible place, Texas A&M, where the heat, the conservatism, the guns around you, and the pressure to be the biggest (it's Texas) can drive you crazy. Nanopoulos is also being advertised as "a constant claimant of a Nobel prize"... As I said, funny things happen at the Academy of Athens....



THE BOTTOM LINE

What measure theory is about

It's about counting, but when things get too large.
Put otherwise, it's about addition of positive numbers, but when these numbers are far too many.
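To make that one-liner slightly more concrete (my own gloss, not part of the original note): the defining property of a measure mu is countable additivity, i.e. the ability to add up infinitely many positive numbers consistently:

mu(A_1 ∪ A_2 ∪ A_3 ∪ ...) = mu(A_1) + mu(A_2) + mu(A_3) + ...   (for pairwise disjoint sets A_1, A_2, A_3, ...)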

The principle of dynamic programming

max_{x,y} [f(x) + g(x,y)] = max_x [f(x) + max_y g(x,y)]
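The identity holds because, for each fixed x, the term f(x) does not depend on y, so the maximization over y can be done inside the bracket first. A quick numerical sanity check in Python, with illustrative functions f and g of my own choosing, could look like this:

X = range(-5, 6)                       # candidate values for x
Y = range(-5, 6)                       # candidate values for y
f = lambda x: 3 * x - x * x            # an arbitrary illustrative function of x
g = lambda x, y: y - (x - y) ** 2      # an arbitrary illustrative function of (x, y)

joint  = max(f(x) + g(x, y) for x in X for y in Y)        # left-hand side
nested = max(f(x) + max(g(x, y) for y in Y) for x in X)   # right-hand side
print(joint, nested, joint == nested)                     # the two maxima agree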

The bottom line

Nuestras horas son minutos cuando esperamos saber y siglos cuando sabemos lo que se puede aprender.
(Our hours are minutes when we wait to learn and centuries when we know what is to be learnt.) --Antonio Machado

Αγεωμέτρητος μηδείς εισίτω.
(Those who do not know geometry may not enter.) --Plato

Sapere Aude! Habe Muth, dich deines eigenen Verstandes zu bedienen!
(Dare to know! Have courage to use your own reason!) --Kant