The Values Added by Professional Journalists – and Collaboration with Academics

As a student of, and advocate for, digital citizens of the Fifth Estate, I have been seriously interested in journalism studies. So I welcomed the opportunity to attend a symposium organized by the School of Media and Communication at Leeds University by virtue of being a Visiting Professor at the School this year. It was entitled ‘Distinctive Roles for Public Service Journalism in Challenging Times’. The event brought practitioners, mainly from the BBC, together with academics, for a set of well-chosen topics, outlined below. The symposium adhered to the Chatham House Rule, so I can’t attribute quotes to individuals, but I will try to capture some of the ways in which the discussions stimulated my own thinking about ‘public service journalism’ in the Internet Age.

Held on 27 November, the one-day event was organized by Professor Stephen Coleman at Leeds and Ric Bailey, from the BBC, who is also a Visiting Professor at Leeds. I presume Ric played a strong role alongside Stephen in bringing in speakers from the BBC, and he moderated the entire day of discussion. This academic-practitioner collaboration was key to the day’s success.

The symposium began with a presentation by Joanna Carr, Head of Current Affairs at the BBC, who covered key challenges facing public service broadcasters. This was followed immediately by a panel led by Joanna and John Corner, a Visiting Professor in the School of Media and Communication at the University of Leeds, formerly based at Liverpool University, on the challenges of reporting and explaining complex issues covered by the media, such as ‘austerity’, climate change, or Brexit. The presentation and panel drove home for me some of the key themes of the entire day – mainly around the thought and craft that professional journalists put into their strategies for putting audiences at the heart of their work.

I approached this panel with some skepticism about complexity as an issue. First, my own academic colleagues too often lament that their work is too complex to convey in a more accessible way. But they nevertheless come up with engaging titles for their books and abstracts for their articles, so it is not impossible to simplify. Complexity is not an acceptable excuse for being unclear. Secondly, I can never forget an editor of a prestigious news magazine once telling me that her instruction to her writers was to ‘simplify and then exaggerate’. I’m simplifying, but nevertheless her phrase worried me. Simplification might be a central problem facing journalism.

However, this panel won me over to the challenges facing good journalists. It drove home the degree to which leading journalists are truly focused on reaching their audiences with coverage that is both engaging and understandable. As one speaker reminded us: “You can’t force people to eat their greens”, or to listen to the news coverage.

So the ‘craft skills’ that journalists bring to the table in selecting, defining, and communicating stories are a huge contribution to the public. What one panelist referred to as ‘BBC simplification’ is not simplifying and exaggerating to gain readers or viewers, but simplifying to deliver a public service. Journalists seek to avoid ‘elite speech’ (even though some well-regarded journalists believed in speaking to elites rather than the mass public), and they do not simply report what the subjects of the news say freely: they structure and sequence the flow of complex stories and determine what needs to be ‘dug out’ through good interviewing, often conducted in a highly politicized space. Their efforts are clearly aimed at adding value to the news, not simply reporting it.

There was an interesting discussion of the differences in complexity across issues, such as Brexit versus climate change. Some complex issues are abstract and don’t have the ‘lightning flashes’ that make some events, such as a crash, relatively easier to report. It also seemed to me that some issues are complicated but rest on well-known fundamentals, such as climate change, while others, like Brexit, are impossible to know precisely because they are still unfolding, with unpredictable futures – what the former US Secretary of Defense, Donald Rumsfeld, famously called ‘known unknowns’.[1]

The second panel focused on data journalism, kicked off by Professor Chris Anderson of Leeds, who spoke about some of the continuities and discontinuities that data journalism brings to traditional journalistic practices. John Walton, who leads the BBC’s data journalism team, followed with an overview of their work. Chris focused more on the discontinuities, but I kept thinking of data journalism as a continuation and growing sophistication of a long tradition of journalists valuing data. Social scientists are often advised to provide some percentages in their press releases to increase the likelihood of a story being picked up. But today, the best news organizations are developing more sophisticated teams within their own organizations, like the BBC data journalism team, to locate and analyze data that can create news items, often in collaboration with others. Of course, the same trend towards more collaborative, team-based research is evident across the social and data sciences in academia as well.[2]

C. W. Anderson, from his Blog

After lunch, Professor Jay Blumler gave a brief talk that identified some of the new challenges facing investigative journalism. He surveyed the changing context of journalism as well as the enduring value of journalistic roles, such as exposing wrongs, before providing a litany of challenges facing investigative journalism, such as when the targets of investigative journalism are overwhelmed and find it difficult to reply in a timely and comprehensive manner. He also argued for journalists more explicitly considering the social implications of their work, such as the degree to which investigative reporting might lead politicians and other public figures to consider themselves ‘sitting ducks’ for the media. What impact will this have on the willingness of individuals to step into the public arena? His talk was followed by responses and additional input from Gail Champion, Editor of the BBC programme File on 4, and Phil Abrams, who gave impressive examples of stories that got things right, and a few where they ‘got things wrong’ but learned from them.

Jay Blumler and Bill, 2018

This panel was followed by one focused on the enduring challenge of moving journalism beyond its centre of gravity in the London/Westminster bubble, such as with the decision to locate the new Channel 4 headquarters in Leeds. Professor Katy Parry led off this panel, followed by Tim Smith, Regional Head of the BBC for Yorkshire, and Andrew Sheldon, Creative Director of True North TV. I found it amazing that the politics of broadcasting in the UK remains so focused on the nations and regions, such as in respect to the distribution of production and original content. The BBC and other major broadcasters in the UK have such national prestige that the locations of new headquarters, such as Channel 4’s recent decision to build in Leeds, can be very significant in attracting talent outside the London bubble. But even more interesting to me was the degree to which the Internet and social media, as well as on-demand streaming video, were not viewed as a threat to broadcasting in the UK, as they would be in the US. In fact, examples arose of Netflix investing in UK content and production skills.

The final summary panel featured the symposium’s academic organizer, Professor Stephen Coleman, who nicely captured and built on the key themes of the day. His remarks were followed by a panel-led discussion. Stephen emphasized the motives of what he called ‘public service journalism’ by comparing public service media organizations to public universities, such as Leeds, where there are legitimate demands for a commitment to justice, accountability, and a civic – citizen – orientation.

Gillian Bolsover & Stephen Coleman, Leeds University

This was of course a friendly and receptive audience for journalists. Nevertheless, I was left more convinced than ever that public service broadcasting is alive and well in the UK through the BBC and other public service broadcast journalism, and that collaboration between practitioners and academics, as orchestrated on the day, adds real value to both.

Notes

[1] https://academic.oup.com/jxb/article/60/3/712/453685

[2] Dutton, W. H., and Jeffreys, P. (eds) (2010), World Wide Research: Reshaping the Sciences and Humanities. Cambridge, MA: The MIT Press.

 

A Metric for Academics: A Personal Suggestion

Every year in the US, and at various intervals in other countries, academics must pull together what they have done to provide administrators with the data required for their indicators of performance. Just as metrics provided baseball teams with a new tool for more systematically choosing players, based on their stats, as portrayed in the popular film Moneyball, so universities hope to improve their performance and rankings by relying more on metrics rather than the intuitions of faculty. Metrics are indeed revolutionizing the selection, promotion, and retention of academics, and of units within universities. Arguably, they already have done so. The recruitment process increasingly looks at various scores and stats about any given candidate for any academic position.

Individual academics can’t do much about it. And increasingly, the metrics will be collected without the academic even doing any data gathering, as data on citations, publications, and teaching ratings get generated in the course of being an academic. Academic metrics are becoming one more mountain of big data ready for computational analysis.

I am too senior (old) to be worried about my own metrics. They are not great, but they are as good as they will ever be. My concern is most often with administrators tending to count everything that can be counted, rather than trying to develop indicators that get to the heart of academic performance. Of course, this is extremely difficult, since academics seldom agree on the rating of their colleagues. A scholar who is a superstar to one academic is conceptually dead from another academic’s perspective. The judgments of scholars vary so dramatically that this controversy is itself one of the many factors driving academia towards more indicators, or hard evidence, of performance. At least by counting what can be counted, there is some harder evidence that might be indicative of what we try to measure – quality.

So what can we count? It varies by university, but I’ve been in universities that count publications, of course, but every kind of publication, from refereed journal articles to blogs. And each of these might be rated, such as by the status of the journal in which an article appears, or the prestige of the publisher of a book. But that is only the beginning. We count citations, conference papers, talks, committees, awards, and more. Therefore, we perennially worry about whether we published enough in the right places, and did enough of anything that is counted.

In the UK, there has been an effort to measure the impact of an academic’s work. There have been entire conferences and publications devoted to what could be meant by impact and how it could be measured. Arguably, this is a well-intentioned move toward measuring something more meaningful. Rather than simply counting the number of publications (output), why not try to gauge the impact (outcomes) of the work? It is just that it is difficult to reliably and validly measure impact, given that the lag between academic work and its impact can be years or decades. Take Daniel Bell’s work on the information society, which had a huge impact that went well beyond what might have been expected in the immediate aftermath of the publication of The Coming of Post-Industrial Society. Nevertheless, indicators of impact will inevitably be added to the growing number of other indicators, even though universities will spend an unbelievable amount of time trying to document this metric.

In this environment, because I am senior in academia, I sometimes get asked how a colleague should think about these metrics. Where should they publish? How many articles should they publish? To which publisher should they submit their book? It goes on and on.

I try to give my opinion, but my most general response, when I feel like it will be accepted as advice and not criticism, is to focus on contributing something new to your field. Rather than think about numbers, think about making a contribution to how people think about your field.

This must go beyond the topic of one’s research. It is fine to know what topics or areas an academic works in, but what has he or she brought to that field? Is it a new way of doing research on a topic, a new concept for the area, or a new way of thinking about the topic?

In sum, if an academic’s career were considered by another academic familiar with their work, could they say that the person had made an original, non-trivial contribution to the study of their field? This is very subjective and difficult to answer, which may be why administrators move to hard indicators. Presumably, if someone has made an important new contribution, their work will be published and cited more than that of someone who has not. That’s the theory.

However, the focus on contributing new ideas can give academics a more constructive motivation and an aim to guide their work. Rather than feeling that your future is based on getting x number of journal articles published, you make publication a means to a more useful end: furthering progress in your field of study. If you accomplish this, the numbers, reputation, and visibility of your work will take care of themselves. What would be a new contribution to your field? That is exactly the right question.