Science communication has no broadly accepted definition. nSCI takes a very broad view of the field, and has identified eight areas of science that rely heavily on effective communication and that can be targeted with improved communication. These areas are divided into two groups: those that involve science discovery tools and dynamics, and those that involve improving science understanding. nSCI’s “discovery” focus areas include issues such as research collaboration (and access), informatics, study design, and tech transfer. Generally speaking, this is where most of nSCI’s current work is focused (although in practice there is much overlap between issues, and between the discovery “bucket” and the “understanding” bucket). “Understanding” areas include issues such as science writing, STEM education, science marketing, and public policy. nSCI believes that science needs better communication tools and practices in these areas to help realize the full potential of research and to make faster advances in science education, science policy and other areas where science and society intersect.
The science understanding category is vastly more developed than the science discovery category, and has made great strides over the last 10 years. Science understanding is where most of the activity exists in the science communication space, from STEM teaching funders, organizations, and resources, to networks of informal science educators (for an excellent starting point, see the website of the Center for the Advancement of Informal Science Education at informalscience.org). By comparison, the science discovery category is less coherent and cohesive. And generally speaking, there are few solid connections between these two categories; efforts to transplant the lessons learned about effective science writing into the world of research journals, for instance, are rare.
Close collaboration between scientists trained in several different fields has become essential in many areas of scientific research. Interdependence, joint ownership and collective responsibility for data and data analysis are needed on many modern research teams, even those housed in the same wing of the same institution. Set against this need are well-established challenges to collaboration: distance, divergent interests and goals, misaligned incentives, lack of trust, organizational and legal barriers, access barriers (such as subscription costs and intelligibility), and more. Some research teams have met these challenges and are far ahead of the collaboration curve; others lag far behind. In general, there’s a lot of room for improvement. Databases that share research findings in a timely and usable fashion are not common, and research collaboration tools and best practice guidelines are still novel.
Informatics is a budding and crucial field. For now, almost every institution and area of scientific research defines informatics somewhat differently, but the primary focus is always on technology: how computers can help us discover more. Some of the specific challenges currently being tackled by informatics include integrating widely differing data formats in research datasets, standardizing data formats for future research collaboration efforts, identifying and pulling more data into repositories, architecting designs for data storage, retrieval and use, and developing new tools that can help analyze and sift through increasingly unmanageable volumes of “big data.” Despite the current attention being paid to these technology-centered challenges, more focus is also needed on the big picture. Informatics isn’t just about computer systems, but about our human ability to peer into research, make insights and connections, and integrate new and helpful perspectives: in short, to do a better job of communicating. Flooding more advanced computer systems with more data will no doubt lead to remarkable breakthroughs, but being able to intuit patterns, applications, and trends by integrating and comparing the right sets of data, and sharing tools and best practices between disciplines, could lead to grand new applications, solutions, and discoveries.
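To make the format-integration challenge concrete, here is a minimal sketch in Python. All field names, units, and values are hypothetical, invented purely for illustration: two labs record the same measurement under different schemas (Fahrenheit under one field name, Celsius under another), and a small mapping function harmonizes them into one shared schema before comparison.

```python
# Hypothetical example: two datasets describing the same quantity
# with different field names and units (temperatures in F vs. C).
lab_a = [{"sample": "s1", "temp_f": 98.6}, {"sample": "s2", "temp_f": 100.4}]
lab_b = [{"id": "s3", "temp_c": 37.5}, {"id": "s4", "temp_c": 39.0}]

def to_common(record):
    """Map a record from either lab's schema onto one shared schema (Celsius)."""
    if "temp_f" in record:
        return {"sample_id": record["sample"],
                "temp_c": round((record["temp_f"] - 32) * 5 / 9, 2)}
    return {"sample_id": record["id"], "temp_c": record["temp_c"]}

# Once harmonized, the two datasets can finally be compared side by side.
merged = [to_common(r) for r in lab_a + lab_b]
```

Trivial as the conversion is, real research datasets multiply this mapping step across hundreds of fields, vocabularies, and unit conventions, which is why agreed-upon standard formats matter so much.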
Research study design is one of the best-developed areas of science. Legions of experts have compiled and refined years of best practice guidelines on the proper design, conduct and analysis of research studies, covering everything from human subject protection to statistical methods. However, as the expectations of our information society continue to evolve at breakneck speed, holes have developed in these best practice frameworks. Communication is one such shortcoming. Many studies with potentially far-reaching impact still allocate only a nominal budget for sharing findings and communicating them to the public, even for bare essentials like building a good website for the study, keeping it current, preparing policy briefs and press releases, and making important connections with other researchers in the field through social media, email, and other direct outreach (to the credit of researchers, conferences are also widely used as communication tools, but these are not adequate by themselves to reach beyond peer communities). Other studies might have ambitious enrollment plans for potentially life-saving treatments but an inadequate budget for participant recruitment and enrollment, and no on-staff expertise for writing and designing compelling outreach materials. Databases are another shortcoming of modern research studies. Like communication components, data collection and analysis aren’t usually designed with sharing in mind. Data is kept under lock and key until journal articles are published, and comparison with other research datasets is rarely a consideration, let alone a practical objective. Study selection itself is a third shortcoming: the issue of whether many research studies are even necessary. Publish-or-perish pressures may be producing a glut of studies that didn’t need to be done in the first place or that should have been done better.
In summary, designing studies with better communication and data components, increasing collaboration, and reducing the pressure to publish studies will all help improve current research study designs.
“Technology transfer,” as the term is normally used, encompasses issues focused on acquiring and licensing patents. It’s an important focus of many higher education institutions, which see more effective tech transfer programs (rightly or wrongly) as potential economic engines for their universities and local economies. Some tech transfer organizations have had more success than others. The most recent survey conducted by the Association of University Technology Managers (AUTM) indicates that 11 of its roughly 200 member universities accounted for more than half of all the licensing and royalty revenues generated from university patents in 2010. In addition, only 16% of tech transfer offices retained enough of the generated revenue to cover their ongoing costs, meaning that the vast majority of these offices run at a perpetual deficit. This lack of adequate funding is one reason why some tech transfer offices are more successful than others. Another reason is institutional capacity for communication-related functions: successful offices add value to ideas and have business, marketing, and communications expertise in-house or on-call to develop promising ideas and shepherd them into the marketplace. Yet another reason is communication itself: involved transfer processes, such as those in pharmaceuticals, require keen attention to communication between technical teams, between teams and regulators, between different organizations, and more. Focus is another issue that could be improved: tech transfer doesn’t need to concentrate only on patentable technology; with adequate investment and staffing it can and should also focus more broadly on science.
Marketing is a key component in the success of every public-facing enterprise. In science, marketing simply means communicating science clearly and effectively for the benefit of both science and the general public. Where does this apply? There are both internal and external applications of better marketing, some of which overlap. Internally, the old adage that the world will beat a path to your door if you invent a better mousetrap is unfortunately false. It always has been, but in the meritocracy of science this adage just seems like it should be true. Better communication tools mean more effective, timely, and cross-disciplinary collaboration, which might pave the road to more discovery. Straddling the internal and external, better marketing is also useful for more mundane but essential goals like reporting to donors and raising more funds for research. Externally, better communication is critical for everything from educating and influencing policymakers to launching successful tech transfer initiatives, enrolling participants in studies, getting kids interested in science, and more. There is much room for improvement in the current model of science marketing. This improvement is rooted in increasing the institutional resources, capacity and budgets for this kind of work, which will first require proving the cost, impact, and efficiency benefits of such an approach.
Effective science writing underpins everything in science communication. Unfortunately, writing is a field in which everyone feels like an expert, and changing an information “owner” culture like science into an editorial culture, as is normal in any public-facing enterprise (where specialists are ultimately responsible for crafting messages for specific goals and customer groups), can be very difficult. Academia, which considers its primary customers to be other scientists, directs its writing mostly toward journals. Access issues aside, this peer-to-peer writing is often so dense that it becomes unintelligible, even to other scientists. Improving access to more journal articles in more disciplines is a worthy goal, but understanding these articles can require a translator, not only because of the subject complexity but also because of the inaccessible writing style that has become the lingua franca of science journals. As science writing ventures into journalism, there is often a lack of understanding between scientists and journalists about what constitutes effective writing: finding the right and necessary balance between clarity and accuracy. And when science writing attempts to bypass journalists and go directly to the public, scientists and their institutions rarely have the in-house expertise to do an adequate job of communicating. Explanations for this dynamic vary, but the communication field’s lack of standing in research science may bear most of the blame, as well as most of the potential for reform.
Science policy in America is at the core of just about everything: water, energy, health, agriculture, conservation, climate, defense, and more. Many of these issues are linked and have implications at local, regional, national and global levels. Given the importance of good science policy, then, it’s critical to our future to reverse the American trend of politicizing science policy. It is no longer possible to formulate policies for the public good by drawing on a single set of scientific facts. Every camp has its own “experts” and “facts,” and the public is left to choose sides. This is an outgrowth of our information society, but it is also an outgrowth of poor science communication. Creating science policy in America that is more responsive to science begins with improving the communications infrastructure of science. Sound science is necessary for informed policymaking, but it is not by itself sufficient. Policy recommendations also need to include storylines and plausible options developed through collaboration with a wide range of stakeholders. And these new policy options aren’t the better mousetrap to which customers will naturally flock: options need to be presented in ways that reach their audiences, and they need to effectively rebut the science nonsense that can flourish in our sound-bite culture. Establishing more cross-cutting foundations for science policy is also important, including active collaborations between research institutions, corporations, infrastructure providers (transportation, energy, etc.), STEM educators, and more.
We’ve all heard the sobering news about STEM (science, technology, engineering and math) education: there aren’t enough qualified STEM teachers in our K-12 systems, most high school students graduate without an adequate background in math and science, and students who choose to study math, science and engineering in college have very high rates of attrition compared to other majors, meaning that most switch majors or don’t finish their degrees at all. nSCI believes that better science communication can help repair this situation: in schools, in the public sphere, in the public policy arena, and more. Science education doesn’t start in the classroom, after all, but with kids getting excited about science, and science policy works better when the public understands and is inspired by science and discovery. Building and maintaining this excitement and inspiration is slow-pitch softball for marketers: making textbooks more entertaining, making sure teachers are properly trained and educated and have the right kinds of support, making sure that science education has a strong hands-on component, and making sure that Congress follows through on its frequent promises to markedly increase funding for STEM education (commitments that have recently been picked up by private industry due to the increasingly urgent need for a more STEM-literate domestic workforce). STEM reform efforts also need to look inward, however: is there a more fundamental reason, unrelated to homework, teachers and funding, why kids don’t or can’t follow through with science? Examining the way science is communicated to students will help, as will examining the role, necessity, and impact of math education requirements and methods in science education, since math education is clearly the weakest link in college-level science education. This issue will be explored in detail in a future nSCI White Paper.