I enjoyed the conference ‘Circling the Square’ (20–22 May 2014), organised by Reiner Grundmann and colleagues from the Science, Technology and Society Priority Group at Nottingham University. Bringing together academics from the natural and social sciences (and others), the conference explored how scientific knowledge is (or should be) used for policy. Some reactions have been collated by Brigitte Nerlich on the Making Science Public blog.
There were many facets to the discussion, but here I will make just a few observations. As a former natural scientist now attempting to become a social scientist, I appreciated the refreshingly frank (and generally good-natured) exchanges on the different world views across the natural/social science divide (or continuum).
Each ‘side’ at times felt mis-characterised by the other. Some of the stereotypes of natural scientists that emerged in the conference were highlighted in Oxford biophysicist Sylvia McLain’s talk and described in her blog: ‘Scientists are poor communicators’, ‘…answer the wrong questions’, ‘…are self-indulgent’. Andy Williams (Cardiff) gave an excellent keynote talk on the worrying tendency for university press releases to hype the underlying science, with the complicity of the scientists involved (possibly driven by the need for ‘impact’, as I discuss further below). On the other hand, physicist Philip Moriarty (Nottingham) was scathing about how sociologists write to make simple concepts complex (see Warren Pearce’s blog post on this). Steve Rayner (Oxford) noted that social science is also split across the descriptive/interpretive divide (‘mindless bean-counters’ versus ‘fuzzy arm-wavers’) – the bean-counters having more affinity with the natural sciences and the arm-wavers with the humanities.
An interesting discussion emerged on the differing views of ‘what facts are’. Financial mathematician Tim Johnson (Heriot-Watt) brought some clarity. He noted that scientists often say ‘it’s a fact’ when they mean ‘it’s a model’. He suggested that the financial crisis may have been partly caused by over-reliance on models as ‘facts’. The ‘facts’ useful for policy may not be the ‘facts’ that science provides.
One thing that united the participants (apart from the outstanding quality of the conference catering) was criticism of the ‘impact agenda’ and its pernicious influence on ‘curiosity-driven’ research, particularly in (UK) research council funding applications. Brian Collins (Director of the Centre for Engineering Policy, UCL, and former Chief Scientific Adviser to the Department for Business, Innovation and Skills) appeared to suggest that funding for a telescope might be justified by potential downstream healthcare advances.
“#circlesq Collins suggest justifying public money for astronomy by its contributions to health care. Hmmmm.” — Roger Pielke Jr. (@RogerPielkeJr) May 20, 2014
Where do we go from here? The value of the conference lay partly in exposing people, through some outstanding speakers, to concepts they were unfamiliar with, and in starting to break down some of the misconceptions. More lofty outcomes may well flow in the future – perhaps building on Reiner Grundmann’s table of what academia, media, ‘civil society’ and policy-makers expect from each other – which may not be what each group wishes, or is able, to provide.