Publishers withdraw more than 120 gibberish papers

The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense. Over the past two years, computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronics Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbé, say that they are now removing the papers.

Just because something is published under the banner of “SCIENCE!” does not mean it is actually scientific. Via Publishers withdraw more than 120 gibberish papers : Nature News & Comment.

Causal Density In Statistical Models Yields Unreliable Results

When there are many factors that have an impact on a system, statistical analysis yields unreliable results. Computer simulations give you exquisitely precise unreliable results. Those who run such simulations and call what they do “science” are deceiving themselves.

Beware also of models that fit historical data (especially “corrected” or “adjusted” data) but do not provide accurate predictions. This applies to macroeconomics, climate change, the stock market, the social “sciences”, and other complex systems with high causal density. You have to be very, very careful that you aren’t fooling yourself with these models; and, as Feynman noted, “yourself” is the easiest person to fool. Via Causal Density is a Bear | askblog.
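The trap described above is easy to demonstrate in a few lines. This is a minimal sketch, not any real climate or economic model: the data, the trend, and the wiggles below are all made up for illustration. A polynomial that interpolates the "historical" record exactly (zero in-sample error) can still extrapolate wildly, which is why a perfect fit to the past says little about predictive skill.

```python
def interpolate(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

# Hypothetical "history": an underlying trend y = 2x plus small fixed wiggles
# standing in for noise or "adjustments".
xs = [0, 1, 2, 3, 4, 5, 6, 7]
wiggles = [0.3, -0.2, 0.1, -0.3, 0.2, -0.1, 0.3, -0.2]
ys = [2 * x + w for x, w in zip(xs, wiggles)]

# In-sample: the model reproduces every historical point exactly.
fit_error = max(abs(interpolate(xs, ys, x) - y) for x, y in zip(xs, ys))
print(f"max in-sample error: {fit_error:.2e}")

# Out-of-sample: two steps past the data, the "perfect" model lands hundreds
# of units away from the simple trend it was supposed to capture.
forecast = interpolate(xs, ys, 9)
print(f"forecast at x=9: {forecast:.1f} (trend value: {2 * 9})")
```

The point is not that polynomials are bad; it is that goodness-of-fit to history, by itself, is no evidence of predictive capability.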

Global Temps At Low End Of Model Predictions — The Models Might Be Wrong

Over the past 15 years air temperatures at the Earth’s surface have been flat while greenhouse-gas emissions have continued to soar. The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010. That is about a quarter of all the CO2 put there by humanity since 1750. And yet, as James Hansen, the head of NASA’s Goddard Institute for Space Studies, observes, “the five-year mean global temperature has been flat for a decade.”

Temperatures fluctuate over short periods, but this lack of new warming is a surprise. Ed Hawkins, of the University of Reading, in Britain, points out that surface temperatures since 2005 are already at the low end of the range of projections derived from 20 climate models (see chart 1). If they remain flat, they will fall outside the models’ range within a few years.

The mismatch between rising greenhouse-gas emissions and not-rising temperatures is among the biggest puzzles in climate science just now. It does not mean global warming is a delusion. Flat though they are, temperatures in the first decade of the 21st century remain almost 1°C above their level in the first decade of the 20th. But the puzzle does need explaining.

The mismatch might mean that—for some unexplained reason—there has been a temporary lag between more carbon dioxide and higher temperatures in 2000-10. Or it might be that the 1990s, when temperatures were rising fast, was the anomalous period. Or, as an increasing body of research is suggesting, it may be that the climate is responding to higher concentrations of carbon dioxide in ways that had not been properly understood before. This possibility, if true, could have profound significance both for climate science and for environmental and social policy.

All emphasis mine. I will add that if the models are wrong, the scientists defining those models have also been wrong (or at the very least they have had insufficient information to build models with sufficient predictive capability). Via Climate science: A sensitive matter | The Economist and PowerLine (where there is more commentary).

Thank Evolution For Your Alcohol Tolerance

The boozing gene can be traced back 10 million years to the common ancestor humans share with chimpanzees and gorillas, new research claims.

It is believed these ancient forebears were the first to pick up fruits fermenting on the ground after they developed a lifestyle away from the trees.

Individuals able to stomach the boozy fruit would have survived better in this new environment than those who could not, programming the ability into their descendants’ genetic codes.

The theory could explain why humans, chimps and gorillas are able to digest alcohol, while our tree-dwelling cousins like orangutans cannot.

“The cause of, and solution to, all life’s problems.” Via Scientists trace the boozing gene: Taste for drink ‘originated 10million years ago in common ancestor of humans and chimps’ | Mail Online.

It’s Not Enough To Have Data; You Also Need A Theory. Multiple Theories Can Fit The Same Data.

You want to find empirical studies that show free trade to be harmful to free-trading nations?  No problem; you can find them.  You want to find empirical studies that show government stimulus spending to be a sure-cure for what ails a slumping economy?  There are plenty of such data-rich studies out there.  You want to find empirical studies that show that violent crimes aren’t deterred by the death penalty?  Not a problem.  You want to find empirical evidence that increased rates of handgun ownership increase citizens’ likelihood of dying of gunshot wounds?  Many such studies are available.

You can also find plenty of empirical studies showing the opposite of what is shown by all of the above studies.  And these other studies are, as a group, no less carefully done than are the studies that they contradict.  And these other studies, also, are done by scholars no less credentialed and no less objective than are those scholars who produce the contrary findings.

That’s the reality of the social sciences.  It’s not an exercise in simple observation of simple and self-defining facts, only one or two of which change at any time.

Therefore, theory is important.  Among other roles, theory directs our attention to what patterns to look for, and helps us to better understand which empirical findings warrant more suspicion than others.  Obviously, theory should never be used as dogma to prevent our learning from careful empirical studies.  Nor, however, should well-accepted and coherent theories be tossed aside simply because a handful of people produce a few studies that are inconsistent with that theory – especially if other careful empirical studies support the theory.

So while it’s always a good instinct to ask “What do the data say?  What does history tell us about this matter?”, it’s just as scientifically naive to ridicule thoughtful discussion of theory (including discussion of pitfalls in interpreting data) by suggesting that the discussion is useless because it presents no data as it is to suggest that theory should never be subjected to empirical tests.

via Where Are My Data?!.

Mother Jones Is Wrong: Mass shootings are not trending upwards. Data included.

The figure below displays the number of mass shootings — incidents and victims — from 1976 through 2010. These reflect all mass shootings in which at least four victims were killed that had been reported to the FBI by local law enforcement authorities as part of the routine collection of crime statistics. Unlike the Mother Jones approach, these data do not exclude cases based on motive, location, or victim-offender relationship. They only exclude incidents in which fewer than four victims (other than the assailant) were killed, murders committed with a weapon other than a firearm, or isolated cases that may have occurred in jurisdictions that did not report homicide data to the FBI. Also, because of the usual time lag in crime reporting, the figures for 2011 and 2012 are not yet available.

According to these expanded figures, there have been, on average, nearly 20 mass shootings a year in the United States. Most, of course, were nowhere near as deadly as the recent massacres in Colorado and Connecticut that have countless Americans believing that a new epidemic is upon us and have encouraged healthy debate concerning causes and solutions. Notwithstanding the awful tragedies of this past year, there has been no upward trend in mass shootings.

What is abundantly clear from the full array of mass shootings, besides the lack of any trend upward or downward, is the largely random variability in the annual counts. There have been several points in time when journalists and other people have speculated about a possible epidemic in response to a flurry of high profile shootings. Yet these speculations have always proven to be incorrect when subsequent years reveal more moderate levels.

(Emphasis mine.) You mean Mother Jones picked their data to suit their narrative? Quelle horreur! Via Mass shootings not trending – James Alan Fox – Crime & Punishment blog – Boston.com.

The Temperament Of The Dissident

The dissident temperament has been present in all times and places, though only ever among a small minority of citizens. Its characteristic, speaking broadly, is a cast of mind that, presented with a proposition about the world, has little interest in where that proposition originated, or how popular it is, or how many powerful and credentialed persons have assented to it, or what might be lost in the way of property, status, or even life, in denying it. To the dissident, the only thing worth pondering about the proposition is, is it true? If it is, then no king’s command can falsify it; and if it is not, then not even the assent of a hundred million will make it true.

This explains *so much* about me. Via Dissident Of The Month « Chateau Heartiste.

The Economist Gives Up on Global Climate Treaties

The Economist also brings us big news on the “settled science” of climate change. A new study has found soot to be twice as bad for climate as was previously thought, making it the second most damaging greenhouse agent after CO2. This is actually good news for two reasons.

First, soot is easier to control than CO2, and targeting that kind of pollution provides lots of benefits that have nothing to do with climate change: it’s a dangerous pollutant and a health threat on its own. Second, controlling soot will seriously slow the speed of climate change. One of the study’s authors told the Economist that fully addressing the soot problem would strip half a degree from potential warming, buying politicians and scientists more time to make informed decisions.

This is where we’ve been for some time: the global approach to reducing CO2 emissions is a dead end, and while the overall science about the climate seems well established, there are some significant fiddly bits that haven’t yet been worked out. There may be more surprises like soot in the works, some good, some bad, but in any case the details, the timing, and the consequences of climate change are less clear than the overall arc, and the case for particular policies is often significantly weaker than the overall case that climate change is under way.

via The Economist Gives Up on Global Climate Treaties | Via Meadia.

The Climate Issues You Should *Really* Be Worried About: Solar Output Variation And Ice Ages

The average G-type star shows a variability in energy output of around 4%. Our sun is a typical G-type star, yet its observed variability in our brief historical sample is only 1/40th of this. When or if the Sun returns to more typical variation in energy output, this will dwarf any other climate concerns.

The emergence of science as a not wholly superstitious and corrupt enterprise is slowly awakening our species to these external dangers. As the brilliant t-shirt says, an asteroid is nature’s way of asking how your space program is doing. If we are lucky we might have time to build a robust, hardened planetary and extraplanetary hypercivilization able to surmount these challenges. Such a hypercivilization would have to be immeasurably richer and more scientifically advanced to prevent, say, the next Yellowstone supereruption or buffer a 2% drop in the Sun’s energy output. (Indeed, ice ages are the real climate-based ecological disasters and civilization-enders — think Europe and North America under a mile of ice). Whether we know it or not, we are in a race to forge such a hypercivilization before these blows fall. If these threats seem too distant, low probability, or fantastical to belong to the “real” world, then let them serve as stand-ins for the much larger number of more immediately dire problems whose solutions also depend on rapid progress in science and technology.
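The arithmetic in the passage's first paragraph is worth making explicit. This quick back-of-envelope check uses only the figures quoted above (the ~4% typical G-type variability and the 1/40th ratio); no other data is assumed:

```python
# Back-of-envelope check using only the passage's own numbers.
typical_g_type_variability = 0.04   # ~4% energy-output variability for a G-type star
observed_solar_fraction = 1 / 40    # the Sun's observed variability, as a fraction of typical

observed_solar_variability = typical_g_type_variability * observed_solar_fraction
print(f"observed solar variability: {observed_solar_variability:.1%}")
```

So the Sun's observed variability works out to about 0.1% of its energy output — which is why a return to even typical G-type behavior, let alone the 2% drop mentioned above, would dwarf other climate concerns.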

Via “2013 : WHAT *SHOULD* WE BE WORRIED ABOUT?” at Edge.org. There’s no direct link to the essay I’ve quoted; search for the essay title “Unfriendly Physics, Monsters From The Id, And Self-Organizing Collective Delusions” on that page.