From the mid-1990s we have developed and enhanced causal discovery software for learning causal Bayesian networks from sample data using an MML measure. We have now made linear CaMML available again after some years of absence, during which we focused exclusively on developing discrete CaMML. Discrete CaMML has been available for some years via Rodney O'Donnell's GitHub page, but we now link to it through our "Software" page. The user interfaces for these programs are not exactly user friendly; however, a GUI has been developed for discrete CaMML and should become available reasonably soon.
We are pleased to announce that Bayesian Intelligence will again be running a series of Bayesian network training workshops in Melbourne this year.
There will be six days of workshops, with the introductory BN training running in both April and June:
- April 4th: Introduction to BNs
- April 5th: More on BNs
- June 27th: Introduction to BNs
- June 28th: More on BNs
- Sep 26th: Programming BN solutions with Netica (the basics)
- Sep 27th: Programming BN solutions with Netica (advanced topics)
People are invited to register for any combination of the training days that best suits their background in BNs and their interests.
—Kevin B Korb
I was interviewed by ABC Newsline's Kesha West, and the result was edited into this video report on the future of Artificial Intelligence and the Technological Singularity.
—Kevin B Korb
I have a lot of respect for Crikey, the online Australian newsletter. They report on a lot of things other media outlets won't touch, especially bias and disinformation in those other media outlets. But the other day, while reading a piece by Bernard Keane on the inconsistency of Tony Abbott's rejection of a carbon tax, after his having previously advocated one, I read:
Insistence that the planet is not getting warmer — or, as Abbott until recently insisted, is getting slightly cooler — has become more difficult to maintain publicly, despite the faulty logic of linking weather to climate. (B Keane, Crikey, 5 Feb.)
It is certainly the case that weather is not the same as climate. This is pretty clearly revealed by the fact that many global warming deniers are weather forecasters, whereas hardly any climatologists are deniers. (NB: there are a lot more forecasters than climatologists!) But denying a link between weather and climate is simply absurd. The relation between climate, the prevailing weather in a region and season, and the weather itself, on any given occasion, is stochastic: the climate system, plus specific, highly variable, conditions together determine the specific weather. That establishes a kind of probabilistic dependency — i.e., a link — which is widely recognized in society.
For example, only ignorant people or fools now deny that smoking causes lung cancer. This is so despite the fact that many smokers never get lung cancer. Some of them die too soon from other causes, such as emphysema. But many smoke contentedly for decades with no sign of the cancer showing up. Lawyers for tobacco companies used to point this out in trying to make the case that specific complainants had no basis for complaint, because their lung cancers might have been amongst those caused by pesticide exposure or smog or a stray cosmic ray striking a susceptible cell in the lung. But these defences have been abandoned, with even tobacco companies accepting some culpability for the disease in specific cases. The situation is analogous with weather and climate change. Was Katrina specifically due to global warming? Well, obviously not in its totality; but global warming heating the Gulf of Mexico likely contributed to its intensity. Specific events will never be entirely attributable to a broad-scale change, because the broad-scale change will never be entirely responsible for a specific event in all its specificity. Denying a linkage on that basis, however, is a nonsense. An accumulation of extreme weather events, and a statistical assessment showing that the probability of their extremity without global warming is rapidly vanishing, will eventually silence those who claim weather tells us nothing about climate.
Probabilistic dependencies are real. The link they establish, in fact, is just that between a stochastic hypothesis and the evidence which confirms it. Denying such a link is tantamount to denying the statistical foundations of empirical science.
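The probabilistic point can be made concrete with a toy Bayesian update. The numbers below are invented purely for illustration (they are not climatological estimates): if an extreme season is merely somewhat more probable under warming than under a stable climate, then no single season proves anything, yet a run of them drives the posterior toward certainty.

```python
# Toy illustration only: the likelihoods are assumptions for the sketch,
# not real climatological estimates.

def posterior(prior, likelihood_h, likelihood_not_h):
    """Bayes' theorem: P(H|E) = P(E|H)P(H) / P(E)."""
    numerator = likelihood_h * prior
    evidence = numerator + likelihood_not_h * (1.0 - prior)
    return numerator / evidence

p_warming = 0.5                  # agnostic prior on the warming hypothesis
p_extreme_given_warming = 0.20   # assumed chance of an extreme season if warming
p_extreme_given_stable = 0.05    # assumed chance if the climate is stable

# Each observed extreme season is weak evidence; none is decisive on its own,
# but the accumulated run shifts the posterior substantially.
for year in range(1, 11):
    p_warming = posterior(p_warming, p_extreme_given_warming, p_extreme_given_stable)
    print(f"after {year} extreme seasons: P(warming) = {p_warming:.4f}")
```

A single update here moves the posterior only modestly, which is exactly the point: the dependency is probabilistic, so attribution of any one event remains uncertain, while the accumulated evidence becomes overwhelming.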
If you want to "sound reasonable" by making some concession or other to global warming deniers, then you should do so by reporting something that is factual, rather than counterfactual. You can point out that many deniers have good dress sense, or sometimes use grammatical sentences, for example. Buying into their dogma about weather versus climate change can all too easily turn into buying into their rejection of science.
—Kevin B Korb
The Institute of Mathematical Sciences (Chennai) held its 50th birthday party in Pondicherry, India, 4-8 January 2013, organized by Ronojoy Adhikhari and Rahul Siddharthan. This was a lively meeting attended by statisticians, physicists, biologists, climate scientists, computer scientists and others, united by an interest in applying Bayes' Theorem to all kinds of scientific problems — and divided, as usual, by the many possible interpretations of Bayes' Theorem. I look forward to the day(?) when Bayesians can find consensus, not over what in particular probabilities may be, but over the fact that they may be diverse things. Acknowledging objectivity needn't come at the price of abandoning subjectivity (see, e.g., David Lewis's "A subjectivist's guide to objective chance" in R. Jeffrey (ed.), Studies in Inductive Logic and Probability, vol. III, 1980).
In any case, there were many interesting presentations and discussions, including, among many others: Devinder Sivia (Oxford) presenting Bayesian methods of data analysis; Rajesh Rao (Washington) describing recent Bayesian models of brain function; Erik van Nimwegen (Basel) using Bayesian networks to predict protein contacts; and Balaji Rajagopalan (Colorado) analysing climate change with extreme value models. I gave talks on Bayesian network modeling, causal discovery of Bayesian nets, and discretization. Most of these were filmed and will be made available on YouTube. When that happens, I'll update this post.
The Science Slam final will be held in a few days in Cologne. The English-speaking world needs something like this as well! (See the list of countries at Wikipedia.) What is a Science Slam?
I translate the corresponding page from www.scienceslam.de:
What is a Science Slam?
The Science Slam offers students and researchers an opportunity to present their research projects in an entertaining 10-minute show on stage.
In contrast to a Poetry Slam, any sort of aid is allowed: PowerPoint, props or live experiments are all welcome. When the Science Slam ends, the audience decides which Slammer goes home the winner.
The aim of the Science Slam is to encourage scientists to present their work in a clear and easily understandable way. At the same time, these entertaining lectures for non-specialist audiences give people the chance to be infected by the slammers' enthusiasm for their projects. Although research is the focus, the scientific value of the lecture itself plays a subordinate role. Rather, the emphasis is on communication, and on showing the public what young scientists are devoting their energies to.
—Kevin B Korb
Bad science comes in a number of varieties, at least including the following:
- Sloppy science. This might include poor experimental design, poor measurements, slovenly reasoning, insufficient power in one's tests, failure to blind experimenters or subjects, etc. Presumably, the intentions are right, but the execution is wrong.
- Pseudo-science. This is fake science. The fakery may be intentional or unintentional. For example, cultists may intentionally generate some large-scale fantasy, while their followers unsuspectingly take it seriously. If the pseudo-scientific methods employed have the look and feel of science, then this is due to simulation or accident, not to the proper employment of scientific methods. For Karl Popper, demarcating real from pseudo-science was a kind of mission. He proposed a "falsificationist" criterion: theories which were (or could be) protected from any possible contrary evidence were non-scientific. Unfortunately, this could never quite be made to work; there are no logical limits to what can be defended, or not, since, as Quine put it, all of our ideas are tied together in a "Web of Belief" (Quine and Ullian, 1978). Still, Popper was certainly on to something: those, such as climate change deniers, who spin excuses and rationalizations no matter what the evidence, may be good propagandists, but they are not good scientists.
- Cheats. This is also fake science, but most likely aimed not at promoting a false story about the world, but rather a false story about the researcher.
Ben Goldacre's book Bad Science (Fourth Estate, 2009) treats miscreants and violators of scientific method primarily in the first two categories. Being a journalist (and an MD), he, perhaps naturally, focuses largely on the aberrations and violations perpetrated by journalists. On his account, they've done quite a lot of damage. For example, around 2005 there were repeated scandals in the UK concerning rampant MRSA in UK hospitals, but the findings were all traceable to a single lab, "the lab that always gives positive results". Apparently, journalists responded to that description with anticipatory salivation, rather than anxious palpitation. It's a ludicrous, and sad, story.
For newcomers to scientific or medical research, Goldacre's book is an entertaining, accessible introduction to a host of issues you will need to know about: experimental design, bias in statistics, cheating by pharmaceutical companies in research and in advertising, the silliness of homeopathy, how we fool ourselves into believing what we want to believe and what measures can be taken to minimize our own foolishness.
For those well versed in these kinds of issues, the book, while a good source of anecdotes, is just a little disappointing. It's important to provide accessible accounts of science and method, but Goldacre goes just a bit far in dumbing things down, in my opinion. Popular science writers should not be assuming that their readers are idiots. He proposes as his motto: "Things are a little more complicated than that". Indeed, they are. Still, on the whole, this is a good and positive contribution to the public understanding of science.
(17 Nov 2012) I think perhaps I was a bit too negative at the end of the note above. Goldacre's book can be seen as an extended plea for a more evidence-oriented treatment of science journalism and, in particular, as a protest against the view that science is just too complicated for ordinary folk to understand — a view he rightly condemns for promoting appeals to authority, rather than to evidence, in arbitrating scientific disputes. The result is a serious dumbing down of public policy debates, including a tendency to portray all sides of a scientific dispute as having equal support, because all sides can call upon any number of "experts". This message certainly needs to be spread. The quality of public debate about topics that concern science is very poor indeed.
— Ann E Nicholson
The proceedings of our recent workshop on applying Bayesian networks to real-world problems will be coming out soon (a preliminary version is available for on-line viewing here). The workshop was co-located with the 28th Conference on Uncertainty in Artificial Intelligence (UAI 2012), on Catalina Island, California on August 18, 2012.
Bayesian networks are by now a well-established technology for reasoning under uncertainty, supported by numerous mature academic and commercial software tools. They are being applied in many domains, for example, environmental and ecological modelling, bioinformatics, medical decision support, many types of engineering, robotics, military applications, financial and economic modelling, education, forensics, emergency response, and surveillance. This workshop solicited submissions describing real-world applications, whether as stand-alone BNs or as BNs embedded in larger software systems. We suggested that authors address the practical issues involved in developing the applications, such as knowledge engineering methodologies, elicitation techniques, defining and meeting client needs, validation processes and integration methods, as well as software tools to support these activities.
The resultant workshop included presentations on a good variety of applications, including oil drilling, managing river catchments, analysing HIV mutations, gang violence, and understanding students' reading comprehension. Many of the applications responded to a workshop theme by illustrating models of temporal reasoning, using dynamic Bayesian networks (DBNs), continuous-time Bayesian networks (CTBNs) and partially observable Markov decision processes (POMDPs).
The workshop demonstrated an active and growing community of modellers taking what were until recently research techniques for Bayesian modelling and applying them to solving a diverse range of important problems in the wider community.
The Australasian Bayesian Network Modelling Society (ABNMS) will meet in Wollongong in the last week of November 2012. For details see their conference website. Submissions of abstracts are due by the end of August.
— Colin Howson
A Bayesian evaluation of the evidence, old and new, for the existence of a God of the sort the Abrahamic religions postulate reveals that there really isn't any: on the contrary, such evidence as can be found is very strongly against such a being. In my new book Objecting to God (Cambridge, 2011) I employ Bayesian probability to counter many 'pro-God' arguments in the recent literature, particularly those bits of it discussing the alleged extreme improbability of fine-tuning and the development of complex life-forms. In particular, the "Anthropic Argument" for the existence of a God is no more compelling than its underlying "Anthropic Principle", which I show to be fallacious.
Not only do the Abrahamic religions lack any credible evidential foundation, but their influence is largely malign, embodying codes of ethics both primitive and repressive. In my book I argue, on the contrary, for a humanitarian ethics based on a more modern version of Aristotle's notion of eudaimonia. Another novel feature of the book is that it draws a parallel between the logico-mathematical paradoxes of the late nineteenth and early twentieth centuries and the ancient theological paradoxes arising from the notion of an omniscient, omnipotent, perfectly good deity. I show how Tarski's celebrated theorem(s) on the indefinability of truth refutes the postulate of omniscience. I also present a critical discussion of Richard Dawkins's well-known attempt to prove that the hypothesis of God is itself extremely improbable.
Colin Howson is a Professor of Philosophy at the University of Toronto and Emeritus Professor in the Philosophy Department at the London School of Economics. For a more detailed and careful presentation of these ideas read his book Objecting to God (Cambridge University Press, 2011).