Wonderful to see a chapter by me, Frank Hangler, and Ginette Law, entitled ‘Broadening Conceptions of Mobile and Its Social Dynamics’ in Chan, J. M., and Lee, F. L. F. (2017), Advancing Comparative Media and Communication Research (London: Routledge), pp. 142-170. It arrived at my office today.
The volume evolved out of an international conference held in 2015 to mark the 50th anniversary of the School of Journalism and Communication at the Chinese University of Hong Kong. But the paper’s origins date back to a project I did during my last months at Oxford in 2014, and early in my tenure at MSU, as Principal Investigator, with Ginette and Frank, of a project called ‘The Social Shaping of Mobile Internet Developments and their Implications for Evolving Lifestyles’, supported by a contract from Huawei Technologies Co. Ltd to Oxford University Consulting. This led first to a working paper done jointly with colleagues from Oxford University and Huawei: Dutton, William H., Law, Ginette, Groselj, Darja, Hangler, Frank, Vidan, Gili, Cheng, Lin, Lu, Xiaobin, Zhi, Hui, Zhao, Qiyong, and Wang, Bin, Mobile Communication Today and Tomorrow (December 4, 2014), a Quello Policy Research Paper, Quello Center, Michigan State University. Available at SSRN: http://ssrn.com/abstract=2534236 or http://dx.doi.org/10.2139/ssrn.2534236
The project moved me into a far better understanding and appreciation of the significance of mobile, but also of its varied and evolving definitions. Before this paper, I was skeptical of academic work centered on mobile, as I considered it one area of Internet studies. By the end of the project, however, I had become convinced that mobile communication is a useful and complex area for research, policy and practice, complementary to Internet studies. In the working paper, we forecast the disappearance of the mobile phone device, which seemed far-fetched when we suggested it to Huawei, but is now becoming a popular conception. So look forward to a future in which that awkward scene of people walking along staring at their mobiles comes to an end, in a good way.
This paper illustrates the often circuitous route of academic work from conception to publication, which is increasingly international and collaborative. So thanks to the editors, my co-authors, Oxford Consulting, and Huawei for your support and patience. Academic time is another world. But it was all worth doing and the wait.
On my last trip to China, I met with a former social science colleague at Tsinghua University, Professor JIN Jianbin, who had received a new research grant to study public perspectives on science, such as around research on genetically modified crops. Our conversation about genetically modified organisms (GMOs) quickly touched on a variety of other issues, such as public acceptance of research on climate change, an area in which sizeable proportions of the public in China, the US and other nations often dismiss, if not distrust, scientific opinion.
Of course, some level of public distrust of scientific authorities is not new. I recall some famous work by political scientists in the US who studied the politics of conspiracy theories around the fluoridation of water, which was prominent across American communities from the 1950s, but which – surprisingly – carries on to this day. So while it is not new, distrust of the political motivations behind scientific opinion is arguably growing.
Some indicators have suggested that diffuse public support for scientific institutions is not declining. However, there is some limited and more recent evidence that universities and academics are being perceived as more partisan. And anecdotally, science is increasingly questioned as biased by researchers who are claimed to be in the pockets of the sponsors of their research, illustrated by controversies over pharmaceutical research.
Such assaults on the integrity of science have led universities and research institutions to place a higher priority on the prevention and detection of conflicts of interest arising in the conduct of research. Finally, symptoms of this growing distrust seem evident in the divisions over a rising number of issues, with GMOs, climate change, vaccinations, and evolution among the more prominent. Perhaps the controversies surrounding science simply reflect the many issues that have broad public implications, such as for the digital economy or public health, while issues such as the moon landing were more removed from immediate public impact on the redistribution of resources.
The bad news is that these controversies are likely to slow progress, such as on efforts to reduce man-made climate change. In some cases these controversies are dangerous, such as when they lead parents not to vaccinate their schoolchildren.
However, there might be some positive outcomes here, if not good news. One positive outcome of this developing problem might be that scientists will place a greater priority on better explaining their work to a wider public. Already, the study of science communication is a burgeoning field around the world, illustrated by new research being launched by my colleague JIN Jianbin, Professor of Journalism and Communication at Tsinghua University in Beijing. And an increasing number of research councils and foundations stress the importance of public outreach.
Of course, scientists explain their research findings and their implications as a matter of practice. Not to be forgotten or dismissed is perhaps the most effective albeit long-term form of science communication, which is teaching in colleges and universities. Yet there are questions about whether top scientists, whatever their field, are as closely involved in teaching as they could be. For example, my former university, the University of Southern California, placed a priority on putting top senior scholars into the entry level undergraduate courses, which I thought was brilliant, but which is exceptional.
But arguably, most communication about scientific issues remains focused on peer-to-peer rather than public-facing communication. Peer-to-peer communication is conducted through journal publications and academic conferences and presentations. And when public-facing, it is often limited to top-down or what I have called ‘trickle-down’ science, with scientists expecting their publications to be read and interpreted by others, and not by themselves – the primary researchers.
However, and here I could be wrong, it seems that the worst possible development might be what I see as a trend toward scientific persuasion, often based on appeals to authority and scientific consensus, or on lobbying, such as through petitions, rather than on effective communication of research. Any scientist is quick to dismiss, or place less credence in, appeals to authority. Why should the public be different? Where is the evidence? And once scientists move into the role of lobbyist, petitioner, or activist, they diminish their credibility as scientists or researchers. Surely this kind of context collapse, when a scientist becomes political, or a doctor runs for political office, invites the public to view scientists and academics as partisan political actors rather than scientific actors, and to see them in ways that parallel other political actors and lobbyists.
How can scientists explain their work to a larger public? First, they need to recognize the value of effectively communicating their work to a broader public. This aim is rising across academia, such as in research councils insisting that research include components on outreach, and in academic quality being judged increasingly by its impact. Unfortunately, this can sometimes drift into a tick-box exercise in budgeting for conferences and seminars involving business, industry and government, while serious efforts to communicate with the general public interested in the topic need to be tackled directly. Academics need to guard against this tick-box mentality.
Another concern is that this need for public outreach might simply lead to a greater focus on media coverage, getting the press to pick up stories on a scientist’s research. There is nothing wrong with this – universities love such coverage, and it can be helpful – but news coverage is generally overly simplistic, too often misleading, and can add to the problems confronting good scientific communication. Researchers need to hold journalists and the media more accountable, and address inaccuracies or overly simplified messages in the press, cable news shows, and mass media.
Another, possibly more effective and only recently practical, approach is to communicate directly with the public. Join the conversation. Write reports on your research findings that are understandable to those in the educated public who might be seriously interested in your work now or in the future. You can reach opinion leaders in your areas of research, and thereby foster effective two-step flows of communication to the general public. Don’t worry about a mass audience, but aim to reach a targeted audience of those with a serious interest in your topic. When they search online for information about your topic, make sure that accessible presentations of your research will be found.
Unfortunately, too many academics are taught not to join the conversation, and to avoid blogging or writing for a general audience. Instead, they are taught to focus more than ever on only reaching the top peer-reviewed journals in their field and being read and cited by their peers. As noted above, this too often leads to a weak form of trickle-down science, which is not in the long-term interest of the scientific enterprise.
We should question this conventional wisdom in academia. Personally, I don’t believe there is a necessary risk to scientific publishing by also trying to communicate to a more general audience. That is what teachers do, and when researchers try to teach and communicate with their students, they can find problems with their arguments, and ways to improve how they convey their ideas.
So – scientists – offer up your best ideas to the public, not as if to your peers, but to smart and educated individuals who do not know about your work – or even why it is relevant. Some of my most meaningful experiences with communication about my research have been exactly when I – focused on Internet studies – sat next to a physicist or mathematician over a meal who asked me about my research, and vice versa. What am I working on? Why is it important? If we can do this over lunch or dinner, we can do it for a larger public online.
Perhaps this is more difficult than it sounds, but we need to accept the challenge. Arguably, the scientific challenge of the 21st century is effective communication to the larger public.
I’ve argued on this blog that the idea of enabling the press to ask questions from outside the White House Press Office, in fact, outside the Washington DC Beltway, was a good idea. Some anecdotal evidence is being reported that the strategy is working. USA Today reported that over 13 White House press briefings, Sean Spicer has taken questions ‘from 32 outside-the-Beltway outlets’. This is a great example of using the Internet to enable more distributed participation. The Washington press is obviously defensive when people complain about the ‘media bubble’ in the briefing room, but the potential for what was once called ‘pack journalism’ is real, and location matters. Geographically distributing contributions is symbolically and materially opening the briefings up to more diversity of viewpoints and issues.
Inevitably, more voices mean more competition among the journalists in asking questions. But there are already too many journalists in the room, and it is not clear to me why it is fair to give more access to the outlets that can afford to station staff in Washington DC. That said, the Skype seats will always be the cheap seats, and less likely to get their turn in the question and answer sessions.
Every year in the US, and at various intervals in other countries, academics must pull together what they have done to provide administrators with the data required for their indicators of performance. Just as metrics provided baseball teams with a new tool for more systematically choosing players, based on their stats, as portrayed in the popular film Moneyball, so universities hope to improve their performance and rankings by relying more on metrics rather than the intuitions of faculty. Metrics are indeed revolutionizing the selection, promotion, and retention of academics, and units within universities. Arguably, they already have done so. The recruitment process increasingly looks at various scores and stats about any given candidate for any academic position.
Individual academics can’t do much about it. And increasingly, the metrics will be collected without the academic even doing any data gathering, as data on citations, publications, and teaching ratings get generated in the course of being an academic. Academic metrics are becoming one more mountain of big data ready for computational analysis.
I am too senior (old) to be worried about my own metrics. They are not great, but they are as good as they will ever be. My concern is most often with administrators tending to count everything that can be counted, rather than trying to develop indicators that get to the heart of academic performance. Of course, this is extremely difficult since academics seldom agree on the rating of their colleagues. A scholar who is a superstar to one academic is conceptually dead from another academic’s perspective. So this controversy is one of many factors driving academia towards more indicators or hard evidence of performance. The judgments of scholars vary so dramatically. At least by counting what can be counted, there is some harder evidence that might be indicative of what we try to measure – quality.
So what can we count? It varies by university, but I’ve been in universities that count publications, of course, but every kind of publication, from refereed journal articles to blogs. And each of these might be rated, such as by the status of the journal in which an article appears, or the prestige of the publisher of a book. But that is only the beginning. We count citations, conference papers, talks, committees, awards, and more. Therefore, we perennially worry about whether we published enough in the right places, and did enough of anything that is counted.
In the UK, there has been an effort to measure the impact of an academic’s work. There have been entire conferences and publications devoted to what could be meant by impact and how it could be measured. Arguably, this is a well-intentioned move toward measuring something more meaningful. Rather than simply counting the number of publications (output), why not try to gauge the impact (outcomes) of the work? It is just that it is difficult to reliably and validly measure impact, given that the lag between academic work and its impact can be years or decades. Take Daniel Bell’s work on the information society, which had a huge impact that went well beyond what might have been expected in the immediate aftermath of his publication of The Coming of Post-Industrial Society. Nevertheless, indicators of impact will inevitably be added to the growing number of other indicators, even though universities will spend an unbelievable amount of time trying to document this metric.
In this environment, because I am senior in academia, I sometimes get asked how a colleague should think about these metrics. Where should they publish? How many articles should they publish? To which publisher should they submit their book? It goes on and on.
I try to give my opinion, but my most general response, when I feel like it will be accepted as advice and not criticism, is to focus on contributing something new to your field. Rather than think about numbers, think about making a contribution to how people think about your field.
This must go beyond the topic of one’s research. It is okay to know what topics or areas an academic works in, but what has he or she brought to that field? Is it a new way for doing research on a topic, a new concept for the area, or a new way of thinking about the topic?
In sum, if an academic’s career were considered by another academic familiar with their work, could that person say the academic had made an original, non-trivial contribution to the study of their field? This is very subjective and difficult to answer, which may be why administrators move to hard indicators. Presumably, if someone has made an important new contribution, their work will be published and cited more than someone who has not. That’s the theory.
However, the focus on contributing new ideas can give academics a more constructive motivation and an aim to guide their work. Rather than feeling that your future is based on getting x number of journal articles published, you make publication a means to a more useful end: furthering progress in your field of study. If you accomplish this, the numbers, reputation, and visibility of your work will take care of themselves. What would be a new contribution to your field? That is exactly the right question.
The 6th ACM Web Science Conference will be held 23-26 June 2014 on the beautiful campus of Indiana University, Bloomington. Web Science continues to focus on the study of information networks, social communities, organizations, applications, and policies that shape and are shaped by the Web.
The WebSci14 program includes 29 paper presentations, 35 posters with lightning talks, a documentary, and keynotes by Dame Wendy Hall (U. of Southampton), JP Rangaswami (Salesforce.com), Laura DeNardis (American University), and Daniel Tunkelang (LinkedIn). Several workshops will be held in conjunction with the conference on topics such as altmetrics, computational approaches to social modeling, the complex dynamics of the Web, the Web of scientific knowledge, interdisciplinary coups to calamities, Web Science education, Web observatories, and cybercrime and cyberwar. Conference attendees will have an opportunity to enjoy the exhibit Places & Spaces: Mapping Science, meant to inspire cross-disciplinary discussion on how to track and communicate human activity and scientific progress on a global scale. Finally, we will award prizes for the most innovative visualizations of Web data. For this data challenge, we are providing four large datasets that will remain publicly available to the Web Science community.
I have agreed to co-chair the next Web Science Conference, Web Science 2014, which will be held at Indiana University. The lead chairs are Fil Menczer and his group at Indiana University, and Jim Hendler at Rensselaer Polytechnic Institute, who is one of the originators of the Semantic Web. The dates are 23-26 June 2014.
My mission is to help bring social scientists and humanities scholars to this conference to ensure that it is truly multi-disciplinary, and also to help encourage a more global set of participants, attracting academics not only from Europe but worldwide.
For those who are not quite sure of the scope and methods of Web Science, let me recommend a chapter in my handbook by Kieron O’Hara and Wendy Hall, entitled ‘Web Science’, pp. 48-68 in Dutton, W. H. (2013) (ed.), The Oxford Handbook of Internet Studies. Oxford: Oxford University Press. The core of the Web Science community sometimes views this as a field or discipline in its own right, while I would define it as a topic or focus within a broader, multidisciplinary field of Internet Studies.
In any case, I will be adding to this blog over the coming months as the conference planning progresses, but please consider participating. Information about the conference is posted at: http://websci14.org/#
Professor & Presidential Chair in Information Studies
University of California, Los Angeles
Oliver Smithies Visiting Fellow and Lecturer
Balliol College, University of Oxford
Scholars are expected to publish the results of their work in journals, books, and other venues. Now they are being asked to publish their data as well, which marks a fundamental transition in scholarly communication. Data are not shiny objects that are easily exchanged. Rather, they are fuzzy and poorly bounded entities. The enthusiasm for “big data” is obscuring the complexity and diversity of data and of data practices across the disciplines. Data flows are uneven – abundant in some areas and sparse in others, easily or rarely shared. Open access and open data are contested concepts that are often conflated. Data are a lens to observe the rapidly changing landscape of scholarly practice. This talk is based on an Oxford-based book project to open up the black box of “data,” peering inside to explore behavior, technology, and policy issues.