J Microbiol Biol Educ. 2023 Aug;24(2). PMC10443302

Developing Science Literacy in Students and Society: Theory, Research, and Practice

Nicole C. Kelp

a Department of Microbiology, Immunology, and Pathology, Colorado State University, Fort Collins, Colorado, USA

Melissa McCartney

b Department of Biological Sciences, Florida International University, Miami, Florida, USA

Mark A. Sarvary

c Investigative Biology Teaching Laboratories, Cornell University, Ithaca, New York, USA

Justin F. Shaffer

d Department of Chemical and Biological Engineering, Colorado School of Mines, Golden, Colorado, USA

Michael J. Wolyniak

e Department of Biology, Hampden-Sydney College, Hampden-Sydney, Virginia, USA

The subject of scientific literacy has never been more critical to the scientific community and to society in general. As opportunities to spread misinformation increase with the rise of new technologies, it is critical for society to have at its disposal the means for ensuring that its citizens possess the basic scientific literacy necessary to make critical decisions on topics like climate change, biotechnology, and other science-based issues. As the Guest Editors of this themed issue of the Journal of Microbiology and Biology Education, we present a wide array of techniques that the scientific community is using to promote scientific literacy in both academic and nonacademic settings. The diversity of the techniques presented here gives us confidence that the scientific community will rise to the challenge of ensuring that our society will be prepared to make fact-based and wise decisions that will preserve and improve our quality of life.

Scientific literacy can be defined in multiple ways, from how an individual processes scientific facts and concepts and interprets scientific data to how a community collectively interacts with scientific knowledge and processes. Scientific literacy skills are incredibly important for people to develop: whether they are trained scientists or not, people encounter issues pertaining to science frequently in their daily lives. In modern times, people are continually exposed to news stories about climate change, energy production, health, exercise, and medicine, not to mention the coronavirus disease 2019 (COVID-19) pandemic. By investigating scientific literacy skill development and designing classroom or outreach activities to promote scientific literacy skills, we as science educators can help improve student and societal scientific literacy, which can lead to more informed decision-making by individuals and societies. Whether the students in our classrooms are science majors or not, it is critical for them to develop science literacy skills and promote science literacy in their communities.

As scientists and science educators, we are passionate about promoting the science literacy of both our students and our society. The 2023 JMBE themed issue on “Scientific Literacy” examines this concept from multiple angles, from theoretical frameworks to research on the impact of literacy interventions to practical tools for developing scientific literacy in diverse groups of learners. Here, we analyze a portion of these articles, sorted by major themes in scientific literacy that are represented in this special issue.

Theoretical frameworks that inform the development of scientific literacy

At the core of all great research studies is a theoretical framework. However, for a topic as complex as scientific literacy, how do you determine which framework is appropriate for your particular take on scientific literacy? Tenney et al. identified three learning theories (information processing, constructivism, and sociocultural theory), discussed conceptualizations of science, technology, engineering, and math (STEM) literacy, and offered insightful perspectives on how to conceptualize what it means to be STEM literate (1). If you envision science literacy to be larger than these three theories, extending outside of the classroom, Elhai helps readers redefine science literacy at the community level (2). These perspectives may be valuable to others as they work to conceptualize what science literacy means in their own circumstances.

Scientific literacy in the context of microbiology, cell biology, molecular biology, immunology, disease ecology, and other disciplines

We cannot forget that science is at the heart of science literacy, and part of science literacy is knowing basic science. In line with current scientific challenges related to public health, Ricci et al. (3) and Mixter et al. (4) provide ideas for teaching and learning about infectious disease, through art (as described by Ricci et al.) and through system-level change and collaboration among novice and experienced educators, professional societies, and policymakers (as described by Mixter et al.). Access to microbiology is also expanded, with Newman et al. presenting a novel card-sorting task involving visual literacy skills (5) and Joyner and Parks outlining how to develop a public data presentation and an epidemiological model based on current events (6).

Pedagogical practices, including effective classroom tools

Undergraduates are prosumers, consuming and producing scientific information at the same time. This requires assignments that build on each other in a scaffolded, multistep format. Joyner and Parks present a multicourse approach using modern pedagogical methods to promote communication, data literacy, and information literacy in STEM students (6). Similarly, Rholl et al. describe how they let students engage with current events in interactive, multiweek activities that increase student motivation and agency (7). Sarvary and Ruesch describe how undergraduates can be taught through a multistep framework to become critical consumers of scientific evidence in a single laboratory session; these authors have used and assessed a variety of active-learning methods over the past decade to help students find, evaluate, comprehend, and cite scientific information (8). With the rise of social media, undergraduates need to be taught how to responsibly share information on these constantly changing platforms. The Social Media Reflection assignment has been successfully used in both lower- and upper-level courses, helping students assess scientific claims and fight misinformation (9).

Student understanding of the nature of science, quantitative literacy skills, and science communication

There is an intricate connection between how students understand scientific concepts, the scientific process, and primary scientific data and how they are able to communicate about these topics with each other, with those outside the scientific community (10), and in their future careers (11, 12). This is critically important for our science students who will interact with patients as future health professionals (11). One way in which students can engage with a mix of scientific facts, processes, and data is via the primary scientific literature. Developing the skills to read and understand the primary scientific literature is difficult (13). Authors in this special issue present how student skills in analyzing data in the primary scientific literature can be improved via graphical abstract assignments (14) and annotations (15, 16), as well as by engaging in peer review (17). Beyond developing their own understanding of the science, engaging with the primary scientific literature is important for our students, as they can utilize the literature as a tool for science communication with nonscientist audiences (18). Conversely, we can utilize popular texts intended for the public in our science classrooms in order to promote science literacy and new insights about socio-scientific issues (19). In addition to specific forms of literature, empathetic and relational conversations about science are another tool by which students can build both their knowledge of the science and their abilities in science communication (10).

Community science literacy and outreach

Science literacy skills are required for everyday decision-making and are often applied by nonscientists. These nontechnical audiences are able to understand scientific evidence using primary literature (18) and develop interest in science using art (3). Attitudes toward science and trust in scientists became especially important during the COVID-19 pandemic. Mixter et al. discuss immune literacy at the individual and societal levels and call for system-level change to build this important skill not only in classrooms but also in the community (4). Service learning and community engagement can help with this effort (10).

Impacts on learning and assessment in the classroom or the community

By creating students and communities that are more scientifically literate, we can set the table for increased opportunities for these groups to learn and understand science, to translate that knowledge into making positive changes in society, and to potentially join the STEM workforce. Several articles (3, 4, 7, 16) consider new approaches for using scientific literacy as a vehicle for enhancing student appreciation for specific STEM fields. Other articles focus on ways that instructors can better assess the progress that students are making toward developing both stronger levels of overall scientific literacy and mastery of particular course material (8, 13). Finally, when students gain practice in argumentation about authentic ethical issues in research, they are better prepared to collaboratively engage with diverse communities about these challenging issues (12). There is a dynamic conversation taking place within the science education community on ways to translate increases in scientific literacy into gains in overall learning objectives in a variety of STEM disciplines. This conversation promises to continue to evolve best practices for reaching this goal among both traditional students and “citizen scientists” in society.

Inclusive approaches and removal of barriers to scientific information

The scientific community is becoming more cognizant of the need to consider equity and inclusion and incorporate them into strategies for improving scientific literacy in the classroom and across society. This issue explores the use of laboratory course elements as drivers of equity-based STEM education (20) as well as the development of empathetic communication skills as an effective means of reaching all members of the community regardless of their previous experiences with science and potential exposure to scientific misinformation (10). Within the classroom, different research groups are exploring how to develop literacy-based assignments that either use unconventional and more accessible means to bring new students into an exploration of science (3, 14) or provide learning support tools that make engagement with scientific literature more accessible to all (18). A society cannot improve its overall level of scientific literacy without finding ways of making scientific knowledge accessible to all of its members, and the work presented in this issue provides a variety of approaches toward this goal.

This issue could not come at a more critical juncture in our society, as the scientific community struggles to find ways to battle both disinformation campaigns about how science is done and presented and the preconceived notion held by many that science is inaccessible to the masses. Issues such as climate change, vaccination, and environmental conservation cannot be solved by a scientifically illiterate society. As science continually evolves, so must our understanding of how best to communicate science across ever-changing platforms and audiences. It is our hope that the ideas presented in this issue will inspire both the current scientific community and future generations of scientists and teachers to continually work to make science as accessible, learnable, and exciting as possible to citizens of all ages and backgrounds.

ACKNOWLEDGMENTS

We are grateful to the Journal of Microbiology and Biology Education for advancing this vital scientific literacy conversation and for allowing us the opportunity to help assemble this exciting collection of current work on this topic. We hope that these articles will engender in you the desire to join the conversation and help to keep moving our understanding of best scientific literacy practices forward.

We declare that there are no conflicts of interest with respect to the contents of this article.

The views expressed in this article do not necessarily reflect the views of the journal or of ASM.

Developing a Test of Scientific Literacy Skills (TOSLS): Measuring Undergraduates’ Evaluation of Scientific Information and Arguments

  • Cara Gormally
  • Peggy Brickman

Address correspondence to Cara Gormally or Peggy Brickman.

*Georgia Institute of Technology, School of Biology, Atlanta, GA 30322


Department of Plant Biology, University of Georgia, Athens, GA 30602

Department of Educational Psychology and Instructional Technology, University of Georgia, Athens, GA 30602

Life sciences faculty agree that developing scientific literacy is an integral part of undergraduate education and report that they teach these skills. However, few measures of scientific literacy are available to assess students’ proficiency in using scientific literacy skills to solve scenarios in and beyond the undergraduate biology classroom. In this paper, we describe the development, validation, and testing of the Test of Scientific Literacy Skills (TOSLS) in five general education biology classes at three undergraduate institutions. The test measures skills related to major aspects of scientific literacy: recognizing and analyzing the use of methods of inquiry that lead to scientific knowledge and the ability to organize, analyze, and interpret quantitative data and scientific information. Measures of validity included correspondence between items and scientific literacy goals of the National Research Council and Project 2061, findings from a survey of biology faculty, expert biology educator reviews, student interviews, and statistical analyses. Classroom testing contexts varied both in terms of student demographics and pedagogical approaches. We propose that biology instructors can use the TOSLS to evaluate their students’ proficiencies in using scientific literacy skills and to document the impacts of curricular reform on students’ scientific literacy.

INTRODUCTION

Science educators, scientists, and policy makers agree that development of students’ scientific literacy is an important aim of science education. Scientific literacy has been defined in multiple ways, all of which emphasize students’ abilities to make use of scientific knowledge in real-world situations (American Association for the Advancement of Science [AAAS], 1990, 2010; Bybee, 1993; Maienschein et al., 1998; Millar et al., 1998; DeBoer, 2000). For example, the National Research Council (NRC) defines scientific literacy as the ability to “use evidence and data to evaluate the quality of science information and arguments put forth by scientists and in the media” (NRC, 1996). Project 2061 (AAAS, 1993) and the Programme for International Student Assessment describe scientific literacy as “the capacity to use scientific knowledge to identify questions and to draw evidence-based conclusions in order to understand and help make decisions about the natural world and the changes made to it through human activity” (Organisation for Economic Co-operation and Development, 2003). These two definitions are the framework for our working concept of scientific literacy.

Individuals use scientific information in many real-world situations beyond the classroom, in ways ranging from evaluating sources of evidence used in media reports about science to recognizing the role and value of science in society to interpreting quantitative information and performing quantitative tasks (Cook, 1977; Jenkins, 1990; Uno and Bybee, 1994; Koballa et al., 1997; Ryder, 2001; Kutner et al., 2007). Achieving scientific literacy for all is a core rationale for science coursework as part of general education (Gen Ed) requirements for undergraduates (Meinwald and Hildebrand, 2010). In response to calls for reform and alignment with science education standards, instructors of these Gen Ed science courses have focused increasingly on students’ development of scientific literacy skills, including quantitative literacy (Quitadamo et al., 2008; Chevalier et al., 2010; Marsteller et al., 2010; Colon-Berlingeri and Borrowes, 2011; Brickman et al., 2012). Coincident with this shift is an interest in finding ways to assess students’ development of scientific literacy skills, especially in the context of Gen Ed courses (Labov, 2004; DeHaan, 2005).

To date, several biology concept inventories have been developed to assess students’ conceptual knowledge (Anderson et al., 2002; Garvin-Doxas and Klymkowsky, 2008; Smith et al., 2008; Shi et al., 2010; Tsui and Treagust, 2010). However, similar progress has lagged in the realm of evaluating students’ scientific literacy skills as defined by the NRC standards (NRC, 1996). Researchers have yet to agree upon a single set of measurable skills critical for scientific literacy, beyond unanimously agreeing that these skills must include conceptual understanding, as well as views about science and society (Bauer et al., 2007). In a recent study surveying more than 150 life sciences faculty from a variety of institutions, faculty identified problem solving/critical thinking, oral and written communication, and the ability to interpret data as the three most important skills students should develop before graduation (Coil et al., 2010). However, these skills require further articulation into measurable constructs in order to effectively evaluate students’ mastery of these skills.

Several instruments have been developed to assess individual aspects of scientific literacy skills, but no single instrument measures all skills. Two surveys frequently used for international comparisons of scientific literacy include questions about non–lab-based science process skills, such as defining science, and items measuring vocabulary and basic content knowledge (Lemke et al., 2004; Miller, 2007). General scientific reasoning instruments were developed specifically to test cognitive skills related to critical thinking and reasoning (Lawson, 1978; Facione, 1991; Sundre, 2003, 2008; Sundre et al., 2008; Quitadamo et al., 2008). However, for the average instructor, too much time and money may be required to utilize multiple instruments to measure all these skills. Situational factors, such as large-enrollment courses, may also hamper the utility of these instruments. For example, an instrument recently developed by White et al. (2011) challenges subjects to confront issues of quality, credibility, and interpretation of scientific research using open-ended responses to conclusions from individual research studies, but this instrument may be challenging to use in large-enrollment courses. The lack of a readily accessible instrument for assessing students’ proficiency in evaluating scientific arguments and sources of evidence as described by the NRC may serve as a barrier to evaluating curricular reform (NRC, 1996).

Like other faculty who emphasize scientific literacy skills in classroom instruction, we had no ready means for evaluating the impact of a curricular reform in a large Gen Ed course. Several recent studies of students’ science literacy skill development in reformed biology courses have relied on multiple-choice exam questions (Fencl, 2010) or a pre- and postintervention test (Chevalier et al., 2010) as a means of assessment. In both cases, the test's psychometric properties were unknown. Relying on tests such as these for evaluation purposes presents limitations for generalizing findings. To avoid this, we sought to develop a practical and psychometrically sound test for use across undergraduate introductory science courses. This test was designed to be freely available, as well as quick to administer and score.

We describe here the development process of the Test of Scientific Literacy Skills (TOSLS), as well as results from its use in Gen Ed biology courses at several different institutions. The instrument consists of 28 multiple-choice questions that are contextualized around real-world problems, for example, evaluating the reliability of an Internet site containing scientific information or determining what would constitute evidence to support a fitness product's effectiveness. The TOSLS development process included articulating the skills critical for scientific literacy, examining the validity of the instrument through student interviews and biology educator expert reviews, pilot testing, subsequent examination of psychometric properties, and finally, classroom testing of the finalized instrument in multiple, different biology courses (Table 1).

Overview of TOSLS development process

1. Examined literature on existing instruments to identify scientific literacy skills
2. Conducted faculty survey to articulate what encompasses scientific literacy skills
3. Developed and administered a pilot assessment based on defined skills
4. Revised assessment based on item analyses and feedback from student interviews
5. Examined instrument validity through additional student interviews and biology faculty evaluations
6. Evaluated finalized instrument for item difficulty, item discrimination, and reliability
7. Administered instrument in multiple contexts to demonstrate utility and measured learning gains

INSTRUMENT DEVELOPMENT

Overview of the TOSLS development process: instrument validity.

Development of the TOSLS was an iterative process informed by the development process of recent instruments, including the Introductory Molecular and Cell Biology Assessment (Shi et al., 2010), the Quantitative Reasoning Test and Scientific Reasoning Test (Sundre, 2003, 2008; Sundre et al., 2008), and CLASS Biology (Semsar et al., 2011). Establishing instrument validity was an important part of the development process. Validity determines the extent to which the instrument measures what it purports to measure (American Educational Research Association, 1999). We used multiple means to determine instrument validity, focusing on measures of content validity and construct validity (Osterlind, 2010). Content validity is the extent to which the instrument measures all facets of a given social construct, in this case, skills essential for scientific literacy. Measures of content validity included building on national reports and a faculty survey about skills essential for scientific literacy and utilizing expert biology faculty evaluations. Construct validity involves statistical analyses to evaluate item validity and relationships between instrument items. Measures of construct validity included test item analyses, internal test consistency, and expert biology faculty evaluations of instrument items.
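The test item analyses and internal-consistency measures mentioned here follow standard classical-test-theory formulas. The sketch below is a minimal illustration, not the authors' actual analysis: it computes item difficulty (proportion correct), corrected item-total point-biserial discrimination, and reliability from a made-up binary-scored response matrix. The paper does not specify which reliability coefficient was used; KR-20, a common choice for dichotomous items, is an assumption here.

```python
from math import sqrt

# Illustrative binary-scored responses (rows = students, columns = items).
# These numbers are invented for the sketch; they are not TOSLS data.
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
]

def item_difficulty(resp):
    """Proportion of students answering each item correctly (higher = easier)."""
    n = len(resp)
    return [sum(row[j] for row in resp) / n for j in range(len(resp[0]))]

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def item_discrimination(resp):
    """Corrected item-total (point-biserial) correlation for each item."""
    totals = [sum(row) for row in resp]
    result = []
    for j in range(len(resp[0])):
        item = [row[j] for row in resp]
        rest = [t - i for t, i in zip(totals, item)]  # total minus the item itself
        result.append(pearson(item, rest))
    return result

def kr20(resp):
    """Kuder-Richardson 20 reliability for dichotomously scored items."""
    k, n = len(resp[0]), len(resp)
    p = item_difficulty(resp)
    pq = sum(pi * (1 - pi) for pi in p)
    totals = [sum(row) for row in resp]
    mt = sum(totals) / n
    var_t = sum((t - mt) ** 2 for t in totals) / n  # population variance
    return (k / (k - 1)) * (1 - pq / var_t)

diffs = item_difficulty(responses)
disc = item_discrimination(responses)
rel = kr20(responses)
print(diffs, disc, rel)
```

In a development cycle like the one outlined in Table 1, statistics of this kind are what flag pilot items for revision or removal, for example items with very low discrimination.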

Content Validity

Ensuring inclusion of major facets of scientific literacy skills.

We began by identifying key definitions provided in education policy documents and reviews in order to define the major facets of scientific literacy for this instrument (AAAS, 1993, 2010; National Academy of Sciences, 1997; Organisation for Economic Co-operation and Development, 2003; Sundre, 2003; Picone et al., 2007; Holbrook and Rannikmae, 2009; Bray Speth et al., 2010). We uncovered recommendations that addressed skills such as understanding communications about science in the public domain; dealing with issues of scientific uncertainty; and collecting, evaluating, and interpreting data (Millar, 1996; Ryder, 2001). We also heeded recent reports that recommend incorporating quantitative concepts into undergraduate introductory science courses, since quantitative literacy provides a common language across scientific disciplines (NRC, 2003; Bialek and Botstein, 2004; Gross, 2004; Kutner et al., 2007; Karsai and Kampis, 2010). Students need to develop a broad set of skills to approach scientific phenomena quantitatively (NRC, 2003), as well as to apply basic quantitative concepts in their daily lives (Kutner et al., 2007). Quantitative literacy, as defined for adults’ daily lives by the National Assessment of Adult Literacy, is the “knowledge and skills required to perform quantitative tasks (i.e., to identify and perform computations, either alone or sequentially, using numbers embedded in printed materials),” which may include calculating a percentage to figure a tip at a restaurant or the amount of interest on a loan (Kutner et al., 2007). Using this literature review, we identified skills related to two major categories of scientific literacy skills: 1) skills related to recognizing and analyzing the use of methods of inquiry that lead to scientific knowledge, and 2) skills related to organizing, analyzing, and interpreting quantitative data and scientific information. We articulated the skills as measurable outcomes, herein referred to as TOSLS skills (Table 2).
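The everyday quantitative tasks cited in the NAAL definition, figuring a tip or the interest on a loan, come down to simple percentage arithmetic; a quick sketch with made-up numbers:

```python
# Everyday quantitative-literacy tasks of the kind the NAAL definition
# describes: a restaurant tip and simple interest on a loan.
# The dollar amounts and rates are illustrative only.

def tip(bill, percent):
    """Tip as a percentage of the bill, rounded to cents."""
    return round(bill * percent / 100, 2)

def simple_interest(principal, annual_rate_percent, years):
    """Simple (non-compounding) interest accrued over the term."""
    return round(principal * annual_rate_percent / 100 * years, 2)

print(tip(48.00, 20))               # 20% tip on a $48 bill -> 9.6
print(simple_interest(1000, 5, 2))  # 5% simple interest on $1,000 over 2 years -> 100.0
```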

Categories of scientific literacy skills

Each skill is listed with its TOSLS question numbers, an explanation of the skill, and examples of common student challenges and misconceptions.

I. Understand methods of inquiry that lead to scientific knowledge

1. Identify a valid scientific argument (questions 1, 8, 11)
   Skill: Recognize what qualifies as scientific evidence and when scientific evidence supports a hypothesis.
   Challenges/misconceptions: Inability to link claims correctly with evidence and lack of scrutiny about evidence; “facts” or even unrelated evidence considered to be support for scientific arguments.

2. Evaluate the validity of sources (questions 10, 12, 17, 22, 26)
   Skill: Distinguish between types of sources; identify bias, authority, and reliability.
   Challenges/misconceptions: Inability to identify accuracy and credibility issues.

3. Evaluate the use and misuse of scientific information (questions 5, 9, 27)
   Skill: Recognize a valid and ethical scientific course of action and identify appropriate use of science by government, industry, and media that is free of bias and of economic and political pressure to make societal decisions.
   Challenges/misconceptions: Prevailing political beliefs can dictate how scientific findings are used; all sides of a controversy should be given equal weight regardless of their validity.

4. Understand elements of research design and how they impact scientific findings/conclusions (questions 4, 13, 14)
   Skill: Identify strengths and weaknesses in research design related to bias, sample size, randomization, and experimental control.
   Challenges/misconceptions: Misunderstanding randomization contextualized in a particular study design; general lack of understanding of elements of good research design.

II. Organize, analyze, and interpret quantitative data and scientific information

5. Create graphical representations of data (question 15)
   Skill: Identify the appropriate format for the graphical representation of data given a particular type of data.
   Challenges/misconceptions: Scatter plots show differences between groups; scatter plots are best for representing means, because the graph shows the entire range of data.

6. Read and interpret graphical representations of data (questions 2, 6, 7, 18)
   Skill: Interpret data presented graphically to make a conclusion about study findings.
   Challenges/misconceptions: Difficulty in interpreting graphs; inability to match patterns of growth (e.g., linear or exponential) with graph shape.

7. Solve problems using quantitative skills, including probability and statistics (questions 16, 20, 23)
   Skill: Calculate probabilities, percentages, and frequencies to draw a conclusion.
   Challenges/misconceptions: Guessing the correct answer without being able to explain basic math calculations; statements indicative of low self-efficacy (“I'm not good at math”).

8. Understand and interpret basic statistics (questions 3, 19, 24)
   Skill: Understand the need for statistics to quantify uncertainty in data.
   Challenges/misconceptions: Lack of familiarity with the function of statistics and with scientific uncertainty; belief that statistics prove data are correct or true.

9. Justify inferences, predictions, and conclusions based on quantitative data (questions 21, 25, 28)
   Skill: Interpret data and critique experimental designs to evaluate hypotheses and recognize flaws in arguments.
   Challenges/misconceptions: Tendency to misinterpret or ignore graphical data when developing a hypothesis or evaluating an argument.

Faculty Survey.

Because expert agreement provides strong support for content validity, we sought to verify the consistency of the skills we articulated through our literature review with the opinions of faculty teaching Gen Ed courses. Alignment between these two sources would support the claim that we included major facets of scientific literacy and, in addition, would provide evidence of utility for faculty beyond our own courses. To determine the degree of consistency, we designed an online survey to elicit feedback from faculty teaching Gen Ed biology courses nationwide (included in the Supplemental Material). Specifically, we asked faculty to list the three most important skills for scientific literacy and to rate the importance of the skills required for students to be considered scientifically literate (described in Table 2). Finally, we asked these faculty whether they currently teach and assess these skills. We sent this survey to life science faculty and postdocs, using email listservs from professional organizations (National Association of Biology Teachers, Association of Biology Laboratory Educators, and Society for the Advancement of Biology Education Research, among others) and textbook publishers (John Wiley & Sons and McGraw-Hill). Survey respondents (n = 188) hailed from throughout the United States and represented a wide variety of higher education institutions, with 34% from private colleges and universities, 20% from public 2-yr colleges, 20% from public state universities, 17% from public research universities, 5% from public state colleges, and 4% from public regional universities. The majority of faculty respondents (78%) teach at least some students who are nonscience majors. Of these faculty, 40% teach Gen Ed biology courses composed solely of nonscience majors, while 38% teach courses composed of both science and nonscience majors. The remaining faculty participants teach a mix of science majors, including biology majors (12%), and courses for biology majors only (10%).

All three coauthors individually read and classified the survey responses into categories. Through discussion, we clarified and consolidated the categories we identified. Finally, one coauthor (M.L.) classified each response into the agreed-upon categories; all three coauthors discussed uncertainties as they arose in the classification process. The three most important skills that faculty listed for Gen Ed biology students to demonstrate scientific literacy strongly corresponded to our TOSLS skills. Of all skills cited by faculty respondents, the most frequent responses were related to understanding the nature of science (NOS; 15.44%), with responses such as “understand what serves as evidence in science” and “differentiate between science and non-science,” which align with skill 1: identifying a valid scientific argument (Table 2). Similarly, faculty identified skills related to other aspects of NOS, with the second, third, and fourth most frequent responses closely corresponding with skill 4: understand elements of research design and how they impact scientific findings/conclusions (15.09%); skill 2: evaluate the validity of sources (13.21%); and skill 3: evaluate the use and misuse of scientific information (8.58%), respectively. Although there has been an emphasis recently on the importance of quantitative literacy, only 12.87% of all responses aligned with quantitative and graphing skills (skills 5, 6, 7, 8, and 9). Responses categorized as specific content knowledge accounted for more responses than any one other skill described (21.1%).

Respondents were asked to rate the importance, on a scale from 1 (unimportant) to 5 (very important), of undergraduates in Gen Ed biology courses developing each of the nine TOSLS skills, as well as whether they currently taught and assessed the skills ( Figure 1 ). When prompted with the skill, faculty rated the importance of teaching quantitative skills equal to that of NOS skills. The majority of faculty agreed that the TOSLS skills are important for scientific literacy. A large majority of faculty report that they currently teach all of these skills (≥58.7% for every skill except skill 8: understanding and interpreting basic statistics, which only 44.9% of faculty report teaching). However, faculty report assessing their students’ proficiency in these skills at lower rates than they report teaching them (≥57.5% for most skills, with the exceptions of skill 8 and skill 3, each assessed by only 40.1% of faculty, and skill 2, assessed by 49.1%). (All skills are described in Table 2 .)

Figure 1.

Figure 1. Percentage of faculty who rated these skills (described in Table 2 ) as important to very important (4–5 out of a 5-point scale), and percentage who currently teach and assess these skills ( n = 167 faculty participants teaching a Gen Ed course).

Construct Validity

Item Development Built from Student Challenges.

Assessment items are most valuable if they can assist in documenting students’ initial confusions, incomplete understandings, and alternative conceptions ( Tanner and Allen, 2005 ). Therefore, we began our development of test items by reviewing studies that documented common student challenges in addressing problems relating to our set of scientific literacy skills ( Table 2 ). We focused primarily on reviewing literature concerning postsecondary education. We would direct interested readers to review literature on student misconceptions at the K–12 level as well, since undergraduates may continue to hold these misconceptions. Many TOSLS skills involved recognizing and analyzing methods of inquiry that lead to scientific knowledge. Students must be able to critique scientific experiments, data, and results in order to make decisions about the ill-structured problems common to science. This, in its entirety, can be thought of as analyzing the strength of evidence-based arguments. We utilized the findings that students have trouble both formulating claims backed by evidence and providing reasoning for claims ( Bray Speth et al. , 2010 ), as well as linking claims to specific evidence ( Cho and Jonassen, 2002 ), to begin construction of short-answer questions in these areas.

Critiquing the quality of sources of evidence is also an integral part of analyzing the strength of scientific arguments. The Internet has revolutionized access to scientific information for the average person and at the same time has exacerbated the need to critically evaluate these sources. In fact, 40% of U.S. Internet users report obtaining most of their scientific information from the Internet, and 87% of users report having searched online about science at least once ( Horrigan, 2006 ). Students of all ages (primary, secondary, and higher education) encounter difficulties when evaluating the relevance and reliability of Web information ( MaKinster et al. , 2002 ; Brand-Gruwel et al. , 2009 ). Left to their own devices, very few Internet users check the source and date of the information they find ( Fox, 2006 ).

Credibility issues, such as recognizing conflicts of interest, affiliations, and expertise in sources of evidence, are also challenging for students. Even when introduced to Web evaluation criteria and asked to rank the quality of sites, students have difficulty evaluating sites for credibility and accuracy, instead using surface markers, such as currency, author, and amount and type of language used ( Britt and Aglinskas, 2002 ; Walraven et al. , 2009 ). Students often think the number of authors on a publication increases the credibility, thinking that each author adds independent corroboration of results ( Brem et al. , 2011 ). And students rarely venture beyond the initial site for independent corroboration, instead using surface markers, such as dates of posting and presence of details and percentages as evidence of accuracy ( Brem et al. , 2011 ). Students with low topic knowledge are more likely to trust poor sites and fail to differentiate between relevant and irrelevant criteria when judging the trustworthiness of sources ( Braten et al. , 2011 ). For this reason, we also included in our pilot assessment short-answer items that asked students to evaluate the quality of information from online sources, such as websites.

The TOSLS includes skills needed to interpret numerical information ( Table 2 ). This is also an integral part of acquiring functional scientific literacy, because scientific claims are often supported by quantitative data ( Steen, 1997 ). Students have difficulty representing quantitative data on graphs, including labeling axes correctly and choosing the appropriate type of graph to display particular kinds of findings ( Bray Speth et al. , 2010 ). Students also have difficulty summarizing trends from data with variation, interpreting the biological meaning of a slope of a line, and interpreting graphs with interactions ( Preece and Janvier, 1992 ; Bowen et al. , 1999 ; Picone et al. , 2007 ; Colon-Berlingeri and Borrowes, 2011 ). For these reasons, we constructed multiple-choice items based on common student responses. For example, we adapted short-answer graphing questions used by Picone et al. (2007) into multiple-choice questions and provided students with short-answer questions that asked them to interpret information from graphs commonly seen in media reports found in periodicals such as the New York Times . We suggest that interested readers explore curricular resources at the National Institute of Mathematical and Biological Sciences (2012), as well as a recent report describing best practices for integrating mathematics in undergraduate biology ( Marsteller et al. , 2010 ).

Pilot Item Testing.

At the beginning of the Summer 2010 semester, we piloted items probing the challenges described above with students in Concepts in Biology, a Gen Ed biology course at a large research university in the southeast ( n = 80). We administered two isomorphic test forms, each containing nine short-answer questions and 14 multiple-choice questions contextualized around authentic real-world problems, such as evaluating the trustworthiness of information found from various sources (including a fictitious website) or interpreting data on meat consumption trends over the past 20 yr from a New York Times article.

Following the test administration, we analyzed students’ written responses and constructed multiple-choice responses from frequently occurring answers to the nine short-answer questions. The practice of developing distracters, wrong answers that students frequently choose, from students’ own words is a well-established strength of concept inventory development ( Sadler, 1998 ; D'Avanzo, 2008 ). We also recruited student volunteers for audiotaped cognitive interviews ( n = 2). We conducted cognitive interviews across three iterations of the instrument-development process to aid in item refinement. Cognitive interviewing is a method used to elucidate whether respondents comprehend and respond to items in the way that researchers intend ( Willis, 2005 ). We used this method to help identify unexpected problems in the wording of questions prior to expanding its use. Interviews were conducted by two graduate student research collaborators using an interview protocol that included asking student interviewees to identify unknown terminology and confusing wording in each question, as well as think-alouds in which students were asked to give their reasoning for answering questions. The graduate student research collaborators transcribed the audiotaped interviews to summarize issues raised by interviewees. Two coauthors (P.B. and M.L.), along with the two graduate student research collaborators, listened to and discussed the audiotaped interviews. Responses to interviews were used to inform further test revision. At the end of the Summer 2010 semester, the revised test forms were administered, each with 23 multiple-choice questions. We analyzed test forms for item difficulty, reliability, and test equivalence. Unreliable items (defined as point biserial correlation scores below 0.15) were revised or removed.

During Fall 2010 and Spring 2011, we piloted the further revised multiple-choice assessments in two large-enrollment Gen Ed biology courses (Fall: Organismal Biology, n = 340; Spring: Concepts in Biology and Organismal Biology, n = 498 pre, n = 378 post), administering each form pre- and postcourse. After each administration of the instrument, we examined the performance of each test question based on such indicators as item difficulty and item discrimination. We also examined the quality of the distracters for each item, looking for nondistracters (i.e., distracters that were chosen by five or fewer students) and poorly discriminating distracters. Well-written distracters should be selected more often by students with less knowledge in the domain of interest compared with those students selecting the correct answer. Conversely, poorly discriminating distracters are selected by a large number of high performers, and do not differentiate effectively among students with high and low scores on the overall test. These distracters may be poorly written or unintentionally challenging. In this study, distracters were considered to be poor discriminators when the overall test score mean for the students choosing the distracter was equal to or above the mean score for students choosing the correct answer. Poorly performing test questions were revised or removed prior to subsequent administrations.
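The distracter screening just described can be expressed compactly in code. The following is a minimal sketch with hypothetical data and an illustrative function name (the study does not specify its analysis tooling): a "nondistracter" is a wrong answer chosen by five or fewer students, and a "poor discriminator" is a wrong answer whose choosers' mean total score is at or above the mean for students choosing the key.

```python
from collections import defaultdict

def flag_distracters(choices, totals, key, min_count=6):
    """Screen one item's answer options.
    choices: each student's selected option for this item.
    totals:  each student's overall test score.
    key:     the correct option."""
    by_option = defaultdict(list)
    for choice, total in zip(choices, totals):
        by_option[choice].append(total)

    key_scores = by_option[key]
    key_mean = sum(key_scores) / len(key_scores)

    nondistracters, poor_discriminators = [], []
    for option, scores in by_option.items():
        if option == key:
            continue
        if len(scores) < min_count:          # chosen by five or fewer students
            nondistracters.append(option)
        elif sum(scores) / len(scores) >= key_mean:  # choosers outscore key choosers
            poor_discriminators.append(option)
    return nondistracters, poor_discriminators

# Hypothetical item: 'A' is the key; 'B' choosers outscore 'A' choosers,
# and only two students picked 'C'.
choices = ["A"] * 10 + ["B"] * 8 + ["C"] * 2
totals = [20] * 10 + [25] * 8 + [10] * 2
print(flag_distracters(choices, totals, key="A"))  # (['C'], ['B'])
```

In this toy example, option C would be revised or removed as a nondistracter, and option B would be flagged because high-scoring students are drawn to it.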

Finally, in Summer 2011, we condensed the assessment to one form with 30 multiple-choice questions, and administered the assessment only in the beginning of the semester ( n = 70). One coauthor (P.B.) conducted an audiotaped focus group interview with students ( n = 5), to determine their reasoning through each question. The interview was transcribed. Focus group findings were used to revise distracter choices in two major ways. First, we revised answer choices to represent true misconceptions rather than confusing wording. Second, we removed terminology such as “peer review” and “unbiased,” which students said clued them in to answer a question correctly without a deep understanding. In total, the assessment was piloted and revised through five semester-long cycles. Example items are shown in Table 3 , and the complete test and the test with the answer key are included in the Supplemental Material.

Example questions contextualized around real-world issues

Expert Faculty Evaluation

Expert evaluation was critical to ensure construct and content validity. We utilized several rounds of expert faculty evaluation of the items. During Fall 2010, five expert biology educators took both isomorphic test forms. In addition to answering the test questions, they provided comments on comprehension, relevance, and clarity. We used their comments to further revise items and to determine whether items should be removed from the instrument for subsequent rounds during the Fall 2010, Spring 2011, and Summer 2011 semesters. Once the instrument was in its final form, faculty experts in biology education, external to the project, were recruited by email to evaluate the assessment during Summer 2011 ( n = 18). Criteria for expertise included teaching introductory biology at the university level to Gen Ed students and participation in one of two national professional development programs: Faculty Institutes for Reforming Science Teaching or the National Academies Summer Institute on Undergraduate Education in Biology at the University of Wisconsin. Experts evaluated each question for scientific accuracy, commented on question understandability ( Table 4 ), and answered each question themselves ( Table 5 ). This set of evaluations guided final instrument development, serving in particular as a means to identify questions and answer items that required revision or removal and guiding the reclassification of items according to skills measured.

Summary of expert responses to the three queries about the 28 TOSLS questions

                                                        Number of questions (of 28) by level of
                                                        agreement among experts (n = 18)
Subject of query                                          >90%    >80%    >70%
The information given in this question is
  scientifically accurate.                                  23       5       0
The question is written clearly and precisely.              15       8       5
After taking a college Gen Ed science course,
  students should be able to answer this question.          19       7       2
Mean pre- and posttest scores of students from each course with calculated t value and effect size, as well as scores from biology faculty experts

                                               Mean % correct (SE)                        Internal consistency(a)
Course                                         Pretest        Posttest       t(b)   Effect size   Pretest   Posttest
Project-based nonmajors at public
  research university                          61.71 (1.05)   70.76 (0.96)  10.51*     0.83        0.734     0.758
Traditional nonmajors at public
  research university                          58.33 (0.99)   65.45 (0.92)   9.65*     0.48        0.718     0.713
Private research university                    84.63 (1.30)   84.95 (1.34)   0.32      0.03        0.581     0.632
Midsized state college                         44.29 (1.70)   42.50 (1.56)   1.22      0.12        N/A       N/A
Biology majors at public research university   61.72 (0.71)   67.13 (0.75)   7.65*     0.33        0.682     0.761
Biology experts                                N/A            91.43 (0.98)   N/A       N/A         N/A       N/A

a Pre- and posttest internal consistency is shown.

b *p < 0.05 (indicates significant gains).

Student Interviews

During Fall 2011, student volunteers were solicited immediately following the pre- and postadministration of the instrument for think-aloud cognitive interviews ( Willis, 2005 ). We selected students representing the diversity of the class, using information they provided about their major, gender, age, and experience in science courses. This included similar numbers of men and women from a variety of majors (education, humanities, business, math, and social sciences). A doctoral research assistant in math and science education experienced in interviewing techniques conducted individual hour-long, semi-structured interviews with 16 undergraduates ( n = 10 at the beginning of the semester and n = 6 at the end of the semester). Each student volunteer was given a copy of the TOSLS and was asked to reflect and verbally articulate the reasoning process he or she used to answer each question. The interviews were audiotaped and transcribed by the graduate assistant. Two coauthors (C.G. and P.B.) followed a systematic approach to determine what characteristics would constitute correct reasoning for each skill set of questions, attempting to determine all components that would define correct reasoning. Together, they analyzed each student response, focusing on responses provided for correct answers to the multiple-choice questions. Any discrepancies were discussed until a consensus was reached. At this preliminary stage, three general types of student responses were identified: responses that provided correct reasoning, either describing why the student chose the correct multiple-choice answer and/or why the student excluded other answers; responses that were too vague to determine whether they provided correct reasoning; and responses indicating incorrect reasoning. Responses that were too vague (e.g., “It seems to be the only one”) were noted but excluded from further analysis. 
Using this rubric of three general student responses for each skill set of questions, the raters coded the full data set. Any student response that could not be classified according to correct reasoning as defined by the rubric was subject to discussion about whether the list should be amended. Through this iterative process of synchronous rating and discussion, the rubric was refined (see the Supplemental Material). Finally, a single rater (C.G.) coded all the responses, using the rubric, and determined the frequencies of responses in all categories. In total, students answered 81.25% of the multiple-choice questions correctly. In terms of the reasoning students provided for the correctly answered multiple-choice questions, 4.4% of responses were vague and 94.5% of responses provided correct reasoning.

Statistical Characterization

After multiple rounds of pilot testing, individual and focus group interviews of student think-alouds, and expert reviews, we administered the final version of the TOSLS to students taking Concepts in Biology, an introductory biology class for nonmajors taught using a traditional lecture-based format (referred to herein as the “traditional nonmajors” course; n = 296). The psychometric properties of pre- and postsemester administrations of the TOSLS included item difficulty ( Figure 2 ), item discrimination ( Figure 3 ), and test reliability ( Crocker and Algina, 2008 ; Osterlind, 2010 ).

Figure 2.

Figure 2. (a) Pre- and postmeasures of item difficulty, with results from the nonscience majors in the lecture-based section and (b) the project-based section of Concepts in Biology in Fall 2011 (* p < 0.05 difference between pre- and posttest scores). Questions are grouped according to skills ( Table 2 ).

Figure 3.

Figure 3. Pre- and postmeasures of item discrimination from Fall 2011. Findings from lecture-based and projects-based sections are shown combined.

Item difficulty measures the proportion of the total sample that answered a question correctly. Item difficulties range from 0 to 1.0, with larger values representing “easier” test items. Individual item difficulties ranging from 0.30 to 0.80 are acceptable, particularly when difficulties are symmetrically distributed across a test ( Feldt, 1993 ). The average item difficulty for the TOSLS was 0.59 on the pretest and 0.68 on the posttest ( Figure 2 ). Item difficulties ranged from 0.32 to 0.88 on the pretest and 0.33 to 0.91 on the posttest.
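Because item difficulty here is simply the proportion of examinees answering an item correctly, it can be computed directly from scored response data. A minimal sketch with hypothetical 0/1 data:

```python
# Each row is one student; each column is one item (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
]

def item_difficulties(responses):
    """Proportion of students answering each item correctly.
    Values range from 0 to 1.0; larger values indicate an 'easier' item."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [sum(row[j] for row in responses) / n_students
            for j in range(n_items)]

print(item_difficulties(responses))  # [0.75, 0.75, 0.25, 0.75]
```

Against the 0.30 to 0.80 guideline cited above, the third item in this toy data set (difficulty 0.25) would be flagged as too hard.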

Item discrimination indices quantify how well a test question differentiates among students with high and low scores on the overall test. Students with well-developed scientific literacy skills, for example, should be more likely to answer test items correctly than students with poorly developed skills. Item discrimination scores for the TOSLS were calculated using corrected point biserial correlations. Item discrimination scores below 0.20 indicate that the item poorly differentiates among students with high and low abilities ( Ebel, 1965 ). The average item discrimination for the TOSLS was 0.26 on the pretest and 0.27 on the posttest ( Figure 3 ). Item discrimination indices ranged from 0.05 to 0.36 on the pretest and from 0.09 to 0.41 on the posttest.
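A corrected point biserial correlation relates each student's score on one item (0/1) to the total score with that item removed, so the item does not correlate with itself. A minimal sketch on hypothetical 0/1 response data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

def corrected_point_biserial(responses, item):
    """Correlate scores on one item with the rest-of-test total (the item
    itself is excluded from the total, hence 'corrected')."""
    item_scores = [row[item] for row in responses]
    rest_totals = [sum(row) - row[item] for row in responses]
    return pearson_r(item_scores, rest_totals)

responses = [  # rows = students, columns = items (hypothetical data)
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
]
r = corrected_point_biserial(responses, 0)
print(round(r, 3))  # well above the 0.20 cutoff, so item 0 discriminates
```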

The overall reliability of the TOSLS was explored by examining the internal consistency of the test. Internal consistency estimates indicate the degree to which a group of items measure the same construct. We used the Kuder-Richardson 20 formula, a measure of internal consistency appropriate for use with binary data. Internal consistency estimates above 0.70 are considered acceptable, and values above 0.8 are considered to reflect good test reliability ( Cronbach, 1951 ). The internal reliability of the TOSLS was 0.731 and 0.748 on the pretest and posttest, respectively ( Table 5 ). These scores fall within the acceptable range of reliability. An exploratory factor analysis, a principal components analysis with a Varimax rotation, indicated that one factor rather than two or more factors best accounted for the variance in the data. These results indicate that the tested skills are related and that it is meaningful to view a student's score on the TOSLS as a measure of his or her scientific literacy skills.
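The Kuder-Richardson 20 coefficient can be computed directly from dichotomous response data. A minimal sketch with hypothetical data (the standard KR-20 formula, not the study's own code):

```python
def kr20(responses):
    """Kuder-Richardson 20 internal consistency for 0/1 item data:
    KR-20 = (k / (k - 1)) * (1 - sum(p_j * q_j) / var(total)),
    where p_j is the proportion correct on item j, q_j = 1 - p_j,
    and var(total) is the (population) variance of total scores."""
    n = len(responses)       # number of students
    k = len(responses[0])    # number of items
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    sum_pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)

responses = [  # rows = students, columns = items (hypothetical data)
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
]
print(round(kr20(responses), 3))  # 0.462 -- below the 0.70 benchmark
```

With only three toy items the coefficient is low; longer tests with related items, like the 28-item TOSLS, yield higher values.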

Instrument Administration and Measurement of Learning Gains

During Fall 2011, the multiple-choice question assessment was administered pre- and postsemester at three types of undergraduate institutions: a public research university, a private research university, and a midsized state college ( Table 6 ). We chose to administer the instrument at several different institutions, with pedagogical approaches ranging from primarily lecture-based to reformed learner-centered courses. In administering the TOSLS, we sought to demonstrate the test's utility across multiple contexts, as well as to determine the sensitivity of the TOSLS to detect differences in learning gains. The assessment was administered in two different courses at a large public research university (very high research activity), with a primarily residential student body, located in the southeastern United States. One section of Concepts of Biology, a Gen Ed biology course, was taught primarily through lecturing (traditional nonmajors course), while the other section of the course was taught using a project-based applied-learning (PAL) curriculum (project-based nonmajors course; described in Brickman et al. , 2012 ). The assessment was also administered in a second course at the public research university, Principles of Biology I, an introductory lecture-based course for biology majors (referred to herein as the “biology majors” course). At a large private research university (very high research activity), with a highly residential student body, located in the Midwest, the assessment was administered in Introduction to Environmental Biology, a Gen Ed biology course taught using PAL, required for environmental biology majors but only an elective credit for biology majors. Finally, the assessment was administered in Principles of Biology, a primarily lecture-based Gen Ed biology course at a midsized state college, a masters-granting medium-sized public college, located in the southeast, with a primarily nonresidential student body.
Institutions were chosen by convenience sampling through word-of-mouth at research conferences and responses to our faculty survey. Consequently, we were able to implement the TOSLS in Gen Ed courses at different types of institutions and to work with faculty interested and committed to using the TOSLS for one semester in their courses.

Demographics of courses from each institution

                                        Public research university    Public research    Private research    Midsized state
                                        nonmajors                     university         university          college
                                        Project-based  Traditional    majors             nonmajors           nonmajors
n                                       290            296            544                50                  80
Male (% of sample)                      38.3           26.4           40.1               32                  28.78
GPA                                     3.49 (0.472)   3.53 (0.415)   3.27 (0.452)       3.62 (0.42)         Not reported
Major (% of sample)
  Social sciences                       29.6           29.4           3.5                36                  15
  Humanities                            12.0           10.8           1.2                12                  3.8
  Sciences                              16.2           16.6           78.1               40                  21.3
  Mathematics                           3.1            3.0            0.3                0                   0
  Business                              24.1           19.9           0.8                0                   12.5
  Journalism                            6.2            10.6           0.2                0                   0
  Education                             6.5            9.5            1.0                2                   17.5
  Agriculture                           2.4            0              4.6                0                   3.8
  Engineering                           0              0              0                  2                   0
  Undecided/not reported                0              0              0                  0                   6.25
Number of college-level science
  courses completed (average)           1.46 (1.49)    1.12 (1.29)    2.43 (1.44)        2.48 (1.86)         0.53 (0.98)

Mean differences in pre- and posttest scores were examined using paired-sample t tests for each class. Effect sizes, which quantify the magnitude of mean differences in standardized terms, were also calculated (Cohen’s d = t[2(1 − r)/n]^(1/2)) ( Dunlap et al. , 1996 ; Andrews et al. , 2011 ; Table 5 ). Paired-sample t tests indicated that posttest scores were significantly higher than pretest scores for the three classes (i.e., project-based, traditional, and biology majors) at the public research university. Examination of effect sizes revealed that learning gains were large in magnitude for the project-based nonmajors class, approaching medium in magnitude for the traditional nonmajors class, and small in magnitude for the biology majors class ( Table 5 ). There was no significant difference between pre- and posttest scores at either the private research university or the midsized state college. Effect sizes for learning gains were negligible for these classes. It should be noted, however, that although students from the private research university did not demonstrate significant learning gains on the TOSLS over the course of the semester, they outscored all other classes on the pretest and posttest. It is also important to note that learning gains for midsized state college students may not be reflective of the gains possible across an entire semester, as the pre- and posttests were administered at the midsized state college only 8 wk apart, as opposed to 16 and 14 wk between pre- and posttest administration at the public research university and the private research university, respectively. Consequently, our ability to compare learning gains from the midsized state college course with courses cross-institutionally is limited. 
These results may reflect differences in students’ scientific literacy development that are attributable to academic development, prior science learning, and student composition at different calibers of institutions.
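The effect-size formula cited from Dunlap et al. (1996) converts a paired-samples t statistic into Cohen's d using the pre-post correlation r and sample size n. A minimal sketch with illustrative values (not taken from the study):

```python
from math import sqrt

def cohens_d_paired(t, r, n):
    """Cohen's d for a paired (pre/post) design, per Dunlap et al. (1996):
    d = t * sqrt(2 * (1 - r) / n), where t is the paired-samples t statistic,
    r is the pre-post correlation, and n is the number of paired observations."""
    return t * sqrt(2 * (1 - r) / n)

# Illustrative values only: t = 10.0, r = 0.5, n = 100.
print(cohens_d_paired(10.0, 0.5, 100))  # 1.0 -- a large effect by convention
```

This correction matters because a paired t statistic grows with the pre-post correlation; dividing it back out keeps d comparable across designs.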

We used an analysis of covariance (ANCOVA) to determine whether there was a significant difference among the three public research university classes in learning gains on the TOSLS, using pretest scores as the covariate. Data from the private research university and the midsized state college were excluded from this analysis, because the assumption of homogeneity of variance was violated when those data were included. Results from the ANCOVA yielded a significant main effect for class ( F = 11.380, p < 0.001). Learning gains on the TOSLS were quantified by estimated marginal means ( Weber, 2009 ). We chose to quantify learning gains using estimated marginal means rather than normalized learning gains, as the latter misrepresent or exaggerate gains when students are at extreme ends of the achievement spectrum in their pretest performance ( Weber, 2009 ; Andrews et al. , 2011 ). Estimated marginal means represent the estimated learning gains for each class after statistically controlling for the effect of pretest scores. These calculated means are appropriate given the ANCOVA. Post hoc pairwise comparisons with Bonferroni corrections indicated that the students in the project-based nonmajors class made significantly higher learning gains than the students in both the traditional nonmajors class and the biology majors class ( Figure 4 ). There was not a significant difference in learning gains between the traditional nonmajors class and the biology majors class. It should be noted that students in the project-based nonmajors course demonstrated significant improvement from pre- to posttest on 10 of the 12 questions in which students from the traditional nonmajors course also made improvements ( Figure 2 ). Additionally, students in the project-based nonmajors course made improvement on eight additional questions in skills 1, 4, and 6.
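Estimated marginal means in an ANCOVA of this kind can be understood as each group's posttest mean adjusted to the grand pretest mean using the pooled within-group regression slope. The following is a simplified sketch of that adjustment with hypothetical data (the study's actual analysis presumably used a statistics package, and a full ANCOVA also yields the F test and pairwise comparisons not shown here):

```python
def adjusted_post_means(groups):
    """groups: {name: (pre_scores, post_scores)}. Returns each group's
    posttest mean adjusted to the grand pretest mean using the pooled
    within-group slope -- the classic ANCOVA adjusted (marginal) mean."""
    sxy = sxx = 0.0
    all_pre = []
    for pre, post in groups.values():
        mean_pre = sum(pre) / len(pre)
        mean_post = sum(post) / len(post)
        sxy += sum((x - mean_pre) * (y - mean_post) for x, y in zip(pre, post))
        sxx += sum((x - mean_pre) ** 2 for x in pre)
        all_pre.extend(pre)
    slope = sxy / sxx                      # pooled within-group regression slope
    grand_pre = sum(all_pre) / len(all_pre)
    return {name: sum(post) / len(post) - slope * (sum(pre) / len(pre) - grand_pre)
            for name, (pre, post) in groups.items()}

# Hypothetical data: group B starts higher on the pretest, so its raw
# posttest advantage shrinks once pretest differences are controlled for.
groups = {
    "A": ([1, 2, 3], [2, 3, 4]),
    "B": ([3, 4, 5], [5, 6, 7]),
}
print(adjusted_post_means(groups))  # {'A': 4.0, 'B': 5.0}
```

In the toy data, B's raw posttest advantage over A is 3 points, but only 1 point remains after adjusting both groups to the same pretest baseline.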

Figure 4.

Figure 4. Estimated marginal mean learning gains for each course, controlling for pretest scores. Letters indicate significantly different learning gains among courses ( p < 0.05).

Implications for Teaching and Learning

Opportunities for developing skills such as argumentation and scientific reasoning are important and yet often missing from science education efforts ( Newton et al. , 1999 ; Norris et al. , 2008 ; Osborne, 2010 ). We have developed this TOSLS instrument as a valid means to readily assess the impact of science, technology, engineering, and mathematics (STEM) education reform efforts on students’ development of these scientific literacy skills. In a recent survey of faculty, a lack of support, described as too few teaching assistants and a shortage of assessment tools, was identified as one obstacle to teaching science process skills ( Coil et al. , 2010 ). The TOSLS is a freely available, multiple-choice instrument that can be readily administered and scored in large-enrollment Gen Ed courses. The instrument contains items designed to specifically measure constructs related to using “evidence and data to evaluate the quality of science information and arguments put forth by scientists and in the media” ( NRC, 1996 ). In particular, this instrument is responsive to the priorities of biology faculty, as the results from surveying biology faculty throughout the United States were critical to defining these skills. Instrument development was informed by relevant literature and multiple rounds of testing and revision to best reflect the common challenges in students’ development of scientific literacy skills.

An interesting finding that emerged in the process of developing the TOSLS is the disconnect between the value instructors place on scientific literacy, their teaching of these skills, and their assessment of students’ skill proficiency. At least 65.8% of faculty surveyed rated each of the nine skills as “important” to “very important” for scientific literacy. Similarly, most faculty reported that they teach and assess these skills ( Figure 1 ; skills described in Table 2 ). However, when asked in an earlier open-ended question to state the three most important skills students need to develop for scientific literacy, many responses were related to biology content knowledge, rather than skills. This dissonance between what many faculty say they do and classroom reality has been documented by others and may be indicative of such concerns as the need to cover content and lack of time or expertise to develop and incorporate opportunities for skill development ( Coil et al. , 2010 ; Andrews et al. , 2011 ; Ebert-May et al. , 2011 ).

Because the TOSLS is sensitive enough to detect pre- to postsemester learning gains, its use may highlight the need to change or develop classroom activities that provide opportunities for students to develop the skills necessary to be scientifically literate citizens. This focus on developing scientific literacy skills is a major component in the push for reform in university STEM education, particularly in Gen Ed courses ( Quitadamo et al. , 2008 ; Chevalier et al. , 2010 ; Coil et al. , 2010 ; Hoskins, 2010 ). We used the TOSLS to evaluate the impact of a reformed Gen Ed biology course on student learning at a large public research university. Interestingly, we found that nonmajors students in our reformed classroom (project-based, Table 5 and Figure 4 ) made significantly greater learning gains than students in the traditional lecture-based course, even outperforming students in the biology majors course. Students in the project-based nonmajors course made greater gains than students in the traditional lecture-based course in several skill areas: skill 1 (question 1), skill 4 (questions 4 and 13), and skill 6 (questions 2, 6, 7, and 18) ( Table 2 ). Students in the traditional nonmajors lecture-based course showed improvement in only two skill areas in which project-based students did not: skill 2 (question 22) and skill 9 (question 28). We are using the TOSLS to measure longer-term gains as we follow a subset of these students in subsequent courses.

We propose that instructors can use the TOSLS to identify the gap between their intentions to teach scientific literacy skills and students’ skill proficiency. In particular, using the TOSLS may spur greater alignment of learning objectives, classroom activities, and assessments. The TOSLS is also informative in revealing student challenges and alternative conceptions in using scientific literacy skills. Instructors may use the TOSLS as a diagnostic tool in the beginning of the semester to reveal the extent of students’ literacy development. Class results may guide instructional planning. Additionally, instructors could tailor study suggestions for individual students’ skill development when using the TOSLS as a diagnostic assessment.

An exploratory factor analysis indicates that the TOSLS instrument measures only one construct or trait rather than factors made up of our different scientific literacy skills, since one factor rather than two or more factors best accounted for the variance in the data. Common educational and psychological tests (Iowa Test of Basic Skills, Stanford Achievement Test) strive for unidimensional assessment ( Osterlind, 2010 ). Because the instrument measures just this single construct, one can assume that responses of students to the test items reflect progress along a scale for scientific literacy.

We envision that the TOSLS may be administered to inform classroom teaching and learning practices in a variety of ways. The TOSLS can be given, in its entirety, as an in-class pretest at the start of the course, using either the paper-based version or the Web-based version (currently in testing), and again as a posttest at the end of the course. Administration of the assessment in class is a means to motivate students to take the test seriously, and the variability in the amount of time spent completing the assessment is minimized.

Implications for Research

On the basis of our classroom testing of the TOSLS across course types and institutions, we expect that the TOSLS may serve as a useful assessment for other applications, including cross-institutional comparisons, evaluation of student learning over time, and programmatic assessment of Gen Ed curricula. However, comparisons between courses or different institutions are reliable only when the assessment is administered the same way. Further, the TOSLS items may be useful to instructors and other researchers as models for developing additional assessment items of their own. In particular, the student challenges and misconceptions reviewed herein may be useful to inform additional assessment questions.

Limitations

Although we administered the TOSLS to a variety of students across multiple institutions, the test was developed using analysis of items administered to Gen Ed students in biology courses at a large research university. The TOSLS shows promise for use across introductory undergraduate science courses, because our instrument-development process included alignment with STEM education policy guidelines (AAAS, 1993; National Academy of Sciences, 1997; Organisation for Economic Co-operation and Development, 2003; AAAS, 2010); however, more research is needed to explore the validity of the instrument for use with science disciplines beyond biology. Additional trials with the TOSLS may be warranted to fully clarify the utility of the test for different students under different situations. Many of the items required a degree of critical thinking and reading comprehension that may be lacking in some students; the lower proficiency observed in state college students may reflect this challenge. Alternatively, the lower gains observed in state college students may indicate that more time is needed between the pre- and posttest for students to develop these skills. Finally, the observed gains in scientific literacy skills for Gen Ed and science majors at a large research university were not measured for time periods greater than one semester; longitudinal studies with these students could be very informative. Tests of the TOSLS under different situational factors may help address these questions.

Twenty years ago, educators could not have foreseen the rise of the Internet and the profound change in access to scientific information. Not surprisingly, most formal educational settings have lagged in integrating information evaluation criteria into existing curricula (Kuiper et al., 2005). The TOSLS questions were designed to address both these practical evaluation skills and the scientific literacy skills needed by the general public. As new resources and access to scientific information change over time, policy documents inevitably follow with suggestions for incorporating these skills into educational settings. We hope that faculty will use this test to enhance how they respond to these recommendations.

The complete TOSLS is included in the Supplemental Material. We encourage instructors interested in using the TOSLS to contact the corresponding authors with requests for additional information. We also appreciate feedback on findings and comments for revisions for future versions of the TOSLS.

Institutional Review Board Protocols

Permissions to use pre- and posttest data and student demographics and to conduct student interviews, survey of biology faculty, and expert faculty evaluations were obtained (exempt, administrative review status: protocol nos. 2011-10034-0, -1, -2, -3) from the University of Georgia Institutional Review Board. Permissions to use pre- and posttest data and student demographics were obtained (protocol no. A00001392) from the Austin Peay Institutional Review Board. Permissions to use pre- and posttest data and student demographics were obtained (expedited status: protocol no. 201108240) from the Washington University in St. Louis Institutional Review Board.

Potential conflict of interest: All authors contributed to developing the TOSLS, as well as the project-based applied learning curriculum described in the manuscript.

ACKNOWLEDGMENTS

The authors acknowledge continuing support and feedback from the University of Georgia Science Educators Research Group. Diane Ebert-May, the external evaluator for a National Science Foundation grant-funded project supporting this work, provided critical comments throughout instrument development, as did Erin Dolan, Shawn Glynn, and Jenny Knight. Carly Jordan, Virginia Schutte, Sarah Jardeleza, and Greg Francom developed early versions of some instrument items and provided valuable commentary about items through the iterative process of revising items.

• American Association for the Advancement of Science (AAAS) (1990). Science for All Americans, New York: Oxford University Press.
• AAAS (1993). Benchmarks for Science Literacy, New York: Oxford University Press.
• AAAS (2010). Vision and Change: A Call to Action, Washington, DC.
• American Educational Research Association (1999). Standards for Educational and Psychological Testing, Washington, DC: American Psychological Association.
• Anderson DL, Fisher KM, Norman GJ (2002). Development and evaluation of the Conceptual Inventory of Natural Selection. J Res Sci Teach 39, 952-978.
• Andrews TM, Leonard MJ, Colgrove CA, Kalinowski ST (2011). Active learning NOT associated with student learning in a random sample of college biology courses. CBE Life Sci Educ 10, 394-405.
• Bauer MW, Allum N, Miller S (2007). What can we learn from 25 years of PUS survey research? Public Underst Sci 16, 79-95.
• Bialek W, Botstein D (2004). Introductory science and mathematics education for 21st-century biologists. Science 303, 788-790.
• Bowen GM, Roth W-M, McGinn MK (1999). Interpretations of graphs by university biology students and practicing scientists: toward a social practice view of scientific representation practices. J Res Sci Teach 36, 1020-1043.
• Brand-Gruwel S, Wopereis I, Walraven A (2009). A descriptive model of information problem solving while using Internet. Comput Educ 53, 1207-1217.
• Braten I, Stromso HI, Salmeron L (2011). Trust and mistrust when students read multiple information sources about climate change. Learn Instr 21, 180-192.
• Bray Speth E, Momsen JL, Moyerbrailean GA, Ebert-May D, Long TM, Wyse S, Linton D (2010). 1, 2, 3, 4: infusing quantitative literacy into introductory biology. CBE Life Sci Educ 9, 323-332.
• Brem SK, Russell J, Weems L (2011). Science on the Web: student evaluations of scientific arguments. Discourse Process 32, 191-213.
• Brickman P, Gormally C, Francom G, Jardeleza SE, Schutte VGW, Jordan C, Kanizay L (2012). Media-savvy scientific literacy: project-based instruction to develop critical evaluation skills. Am Biol Teach 74, 374-379.
• Britt MA, Aglinskas C (2002). Improving students' ability to identify and use source information. Cogn Instr 20, 485-522.
• Bybee RW (1993). Reforming Science Education: Social Perspectives and Personal Reflections, New York: Teachers College Press.
• Chevalier CD, Ashley DC, Rushin JW (2010). Acquisition and retention of quantitative communication skills in an undergraduate biology curriculum: long-term retention results. J Coll Sci Teach 39, 64-70.
• Cho K-L, Jonassen DH (2002). The effects of argumentation scaffolds on argumentation and problem solving. Educ Technol Res Dev 50, 5-22.
• Coil D, Wenderoth MP, Cunningham M, Dirks C (2010). Teaching the process of science: faculty perceptions and an effective methodology. CBE Life Sci Educ 9, 524-535.
• Colon-Berlingeri M, Borrowes PA (2011). Teaching biology through statistics: application of statistical methods in genetics and zoology courses. CBE Life Sci Educ 10, 259-267.
• Cook WD (1977). Adult Literacy Education in the United States, Newark, DE: International Reading Association.
• Crocker L, Algina J (2008). Introduction to Classical and Modern Test Theory, Mason, OH: Cengage Learning.
• Cronbach L (1951). Coefficient alpha and the internal structure of tests. Psychometrika 16, 297-334.
• D'Avanzo C (2008). Biology concept inventories: overview, status, and next steps. Bioscience 58, 1-7.
• DeBoer GE (2000). Scientific literacy: another look at its historical and contemporary meanings and its relationship to science education reform. J Res Sci Teach 37, 582-601.
• DeHaan RL (2005). The impending revolution in undergraduate science education. J Sci Educ Technol 14, 253-269.
• Dunlap WP, Cortina JM, Vaslow JB, Burke MJ (1996). Meta-analysis of experiments with matched groups or repeated measures designs. Psychol Methods 1, 170-177.
• Ebel R (1965). Measuring Educational Achievement, Englewood Cliffs, NJ: Prentice Hall.
• Ebert-May D, Derting TL, Hodder J, Momsen JL, Long TM, Jardeleza SE (2011). What we say is not what we do: effective evaluation of faculty professional development programs. Bioscience 61, 550-558.
• Facione PA (1991). Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment, Millbrae, CA: California Academic Press.
• Feldt LS (1993). The relationship between the distribution of item difficulties and test reliability. Appl Meas Educ 6, 37-48.
• Fencl HS (2010). Development of students' critical-reasoning skills through content-focused activities in a general education course. J Coll Sci Teach 39, 56-62.
• Fox S (2006). Online Health Search 2006, Pew Charitable Trust.
• Garvin-Doxas K, Klymkowsky MW (2008). Understanding randomness and its impact on student learning: lessons learned from building the Biology Concept Inventory (BCI). CBE Life Sci Educ 7, 227-233.
• Gross LJ (2004). Interdisciplinarity and the undergraduate biology curriculum: finding a balance. Cell Biol Educ 3, 85-87.
• Holbrook J, Rannikmae M (2009). The meaning of scientific literacy. Int J Environ Sci Educ 4, 275-288.
• Horrigan J (2006). The Internet as a Resource for News and Information about Science. www.pewtrusts.org/our_work_report_detail.aspx?id=21142 (accessed 13 January 2011).
• Hoskins SG (2010). "But if it's in the newspaper, doesn't that mean it's true?" Developing critical reading and analysis skills by evaluating newspaper science with C.R.E.A.T.E. Am Biol Teach 72, 415-420.
• Jenkins EW (1990). Scientific literacy and school science education. School Sci Rev 71, 43-51.
• Karsai I, Kampis G (2010). The crossroads between biology and mathematics: the scientific method as the basis of scientific literacy. Bioscience 60, 632-638.
• Koballa T, Kemp A, Evans R (1997). The spectrum of scientific literacy. Sci Teach 64, 27-31.
• Kuiper E, Volman M, Terwel J (2005). The Web as an information resource in K-12 education: strategies for supporting students in searching and processing information. Rev Educ Res 75, 285-328.
• Kutner M, Greenberg E, Jin Y, Boyle B, Hsu Y, Dunleavy E (2007). Literacy in Everyday Life: Results from the 2003 National Assessment of Adult Literacy (NCES 2007-480), Washington, DC: National Center for Education Statistics, U.S. Department of Education.
• Labov JB (2004). The challenges and opportunities for improving undergraduate science education through introductory courses. Cell Biol Educ 3, 212-214.
• Lawson AE (1978). The development and validation of a classroom test of formal reasoning. J Res Sci Teach 15, 11-24.
• Lemke M, Sen A, Pahlke E, Partelow L, Miller D, Williams T, Kastberg D, Jocelyn L (2004). International Outcomes of Learning in Mathematics Literacy and Problem Solving: PISA Results from the U.S. Perspective, Washington, DC: National Center for Education Statistics.
• Maienschein J, et al. (1998). Scientific literacy. Science 281, 917.
• MaKinster JG, Beghetto RA, Plucker JA (2002). Why can't I find Newton's third law? Case studies of students' use of the Web as a science resource. J Sci Educ Technol 11, 155-172.
• Marsteller P, de Pillis L, Findley A, Joplin K, Delesko J, Nelson K, Thompson K (2010). Toward integration: from quantitative biology to mathbio-biomath? CBE Life Sci Educ 9, 165-171.
• Meinwald J, Hildebrand JG (2010). Science and the Educated American: A Core Component of Liberal Education, Cambridge, MA: American Academy of Arts and Sciences.
• Millar R (1996). Towards a science curriculum for public understanding. School Sci Rev 77, 7-18.
• Millar R, Osborne J, Nott M (1998). Science education for the future. School Sci Rev 80, 19-24.
• Miller JD (2007). The impact of college science courses for non-science majors on adult science literacy. Paper presented at the Critical Role of College Science Courses for Non-Majors, San Francisco, CA, February 18, 2007.
• National Academy of Sciences (1997). Introducing the National Science Education Standards, Washington, DC: Center for Science, Mathematics, and Engineering Education.
• National Institute of Mathematical and Biological Sciences (2012). NIMBioS website. http://nimbios.org/education (accessed 25 June 2012).
• National Research Council (NRC) (1996). National Science Education Standards, Washington, DC: National Academies Press.
• NRC (2003). BIO2010: Transforming Undergraduate Education for Future Research Biologists, Washington, DC: National Academies Press.
• Newton P, Driver R, Osborne J (1999). The place of argumentation in the pedagogy of school science. Int J Sci Educ 21, 553-576.
• Norris SP, Phillips LM, Smith ML, Guilbert SM, Stange DM, Baker JJ, Weber AC (2008). Learning to read scientific text: do elementary school commercial reading programs help? Sci Educ 92, 765-798.
• Organisation for Economic Co-operation and Development (2003). The PISA 2003 Assessment Framework—Mathematics, Reading, Science and Problem Solving Knowledge and Skills.
• Osborne J (2010). Arguing to learn in science: the role of collaborative, critical discourse. Science 328, 463-466.
• Osterlind SJ (2010). Modern Measurement: Theory, Principles, and Applications of Mental Appraisal, 2nd ed., Upper Saddle River, NJ: Pearson.
• Picone C, Rhode J, Hyatt L, Parshall T (2007). Assessing gains in undergraduate students' abilities to analyze graphical data. Teach Issues Exp Ecol 5 (July 2007): Research #1. http://tiee.esa.org/vol/v5/research/picone/abstract.html (accessed 3 May 2010).
• Preece J, Janvier C (1992). A study of the interpretation of trends in multiple curve graphs of ecological situations. School Sci Math 92, 299-306.
• Quitadamo IJ, Faiola CL, Johnson JE, Kurtz MJ (2008). Community-based inquiry improves critical thinking in general education biology. CBE Life Sci Educ 7, 327-337.
• Ryder J (2001). Identifying science understanding for functional scientific literacy. Stud Sci Educ 36, 1-44.
• Sadler PM (1998). Psychometric models of student conceptions in science: reconciling qualitative studies and distractor-driven assessment instruments. J Res Sci Teach 35, 265-296.
• Semsar K, Knight JK, Birol G, Smith MK (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE Life Sci Educ 10, 268-278.
• Shi J, Wood WB, Martin JM, Guild NA, Vicens Q, Knight JK (2010). A diagnostic assessment for introductory molecular and cell biology. CBE Life Sci Educ 9, 453-461.
• Smith MK, Wood WB, Knight JK (2008). The Genetics Concept Assessment: a new concept inventory for gauging student understanding of genetics. CBE Life Sci Educ 7, 422-430.
• Steen LA (1997). Why Numbers Count: Quantitative Literacy for Tomorrow's America, New York: College Entrance Examination Board.
• Sundre D (2003). Assessment of Quantitative Reasoning to Enhance Educational Quality. Paper presented at the American Educational Research Association Meeting, Chicago, IL, April 2003.
• Sundre D (2008). The Scientific Reasoning Test, Version 9 (SR-9) Test Manual, Harrisonburg, VA: Center for Assessment and Research Studies.
• Sundre DL, Thelk A, Wigtil C (2008). The Quantitative Reasoning Test, Version 9 (QR-9) Test Manual, Harrisonburg, VA: Center for Assessment and Research Studies.
• Tanner K, Allen D (2005). Approaches to biology teaching and learning: understanding the wrong answers—teaching toward conceptual change. Cell Biol Educ 4, 112-117.
• Tsui C-Y, Treagust D (2010). Evaluating secondary students' scientific reasoning in genetics using a two-tier diagnostic instrument. Int J Sci Educ 32, 1073-1098.
• Uno GE, Bybee RW (1994). Understanding the dimensions of biological literacy. Bioscience 71, 553-557.
• Walraven A, Brand-Gruwel S, Boshuizen HPA (2009). How students evaluate information and sources when searching the World Wide Web for information. Comput Educ 52, 234-246.
• Weber E (2009). Quantifying student learning: how to analyze assessment data. Bull Ecol Soc Am 90, 501-511.
• White B, Stains M, Escriu-Sune M, Medaglia E, Rostamnjad L, Chinn C, Sevian H (2011). A novel instrument for assessing students' critical thinking abilities. J Coll Sci Teach 40, 102-107.
• Willis GB (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design, Thousand Oaks, CA: Sage.
  • Using a Cross-disciplinary Teaching Approach to Attenuate the Void: Building Educators and Researchers at a Historically Black College/University 18 Jan 2021
  • Pengembangan Instrumen Asesmen Berbasis Literasi Sains Tentang Materi Sistem Ekskresi dan Sistem Pernapasan 14 January 2021 | Jurnal Ilmiah Pendidikan Profesi Guru, Vol. 4, No. 1
  • Implementation of e-module endocrine system based on problem based learning (PBL) to improve scientific literacy and cognitive learning outcome 1 Jan 2021
  • Student’s science literacy about Covid-19 1 Jan 2021
  • Development of bird mobile apps and its effectiveness in bird literacy and scientific reasoning of ornithology students at Universitas Negeri Malang 1 Jan 2021
  • Development scientific literacy through STEM project in biology classroom: A mixed method analysis 1 Jan 2021
  • RICOSRE: An innovative learning model to promote scientific literacy 1 Jan 2021
  • The effect of problem-based learning module containing research result to improve students’ scientific literacy 1 Jan 2021
  • Utilization of online data resources by final year student in completing final year project on biochemistry and biotechnology 1 Jan 2021
  • Factors Influencing Willingness to Share Health Misinformation Videos on the Internet: Web-Based Survey 9 December 2021 | Journal of Medical Internet Research, Vol. 23, No. 12
  • Revising of the Integrating Scientific Literacy Skills Scale (ISLS) with Rasch Model Analysis 15 December 2020 | Journal for the Education of Gifted Young Scientists, Vol. 8, No. 4
  • Jerry Timbrook ,
  • Jon C. Herron , and
  • Ross Nehm, Monitoring Editor
  • Another look at the core concepts of physiology: revisions and resources 1 Dec 2020 | Advances in Physiology Education, Vol. 44, No. 4
  • The Biochemical Literacy Framework: Inviting pedagogical innovation in higher education 18 August 2020 | FEBS Open Bio, Vol. 10, No. 9
  • The impact of student misconceptions on student persistence in a MOOC 18 December 2019 | Journal of Research in Science Teaching, Vol. 57, No. 6
  • Competencies and success measured by net income among Hungarian HE graduates 27 Jul 2020 | Education + Training, Vol. 63, No. 3
  • Teens learning epidemiology? A cohort study on epidemiology instruction for high school youth 1 Jul 2020 | Annals of Epidemiology, Vol. 47
  • Evaluating the Leuven Research Skills Test for 11th and 12th Grade 22 February 2019 | Journal of Psychoeducational Assessment, Vol. 38, No. 4
  • ‘I’m probably just gonna skim’: an assessment of undergraduate students’ primary scientific literature reading approaches 24 May 2020 | International Journal of Science Education, Vol. 42, No. 9
  • The development of module based on scientific literacy: geometric optics 1 Jun 2020 | Journal of Physics: Conference Series, Vol. 1567, No. 4
  • Are secondary school graduates prepared for the studies of natural sciences? – evaluation and analysis of the result of scientific literacy levels achieved by secondary school graduates 15 April 2019 | Research in Science & Technological Education, Vol. 38, No. 2
  • “I'm just not that great at science”: Science self‐efficacy in arts and communication students 29 November 2019 | Journal of Research in Science Teaching, Vol. 57, No. 4
  • Development and Validation of students’ Worksheet Based on Guided-Inquiry to Improve Students’ Scientific Literacy Skills of Junior High School on Straight Motion Concept 1 Feb 2020 | Journal of Physics: Conference Series, Vol. 1467, No. 1
  • Information Literacy and Science Communication in Undergraduate Courses That Connect Chemistry to Sustainability 11 August 2020
  • Profile and factors affecting students’ scientific literacy of senior high schools 1 Jan 2020
  • Scientific literacy profile of science and non-science students in senior high schools in Malang 1 Jan 2020
  • A Modified CREATE Intervention Improves Student Cognitive and Affective Outcomes in an Upper-Division Genetics Course 1 Jan 2020 | Journal of Microbiology & Biology Education, Vol. 21, No. 1
  • A Single, Narrowly Focused CREATE Primary Literature Module Evokes Gains in Genetics Students’ Self-Efficacy and Understanding of the Research Process 1 Jan 2020 | Journal of Microbiology & Biology Education, Vol. 21, No. 1
  • Influence of training in inquiry-based methods on in-service science teachers’ reasoning skills 13 July 2019 | Chemistry Teacher International, Vol. 1, No. 2
  • Investigating disciplinary context effect on student scientific inquiry competence 6 December 2019 | International Journal of Science Education, Vol. 41, No. 18
  • April A. Nelms and
  • Miriam Segura-Totten
  • Erin L. Dolan, Monitoring Editor
  • Scientific Literacy of Secondary School Students’ Related the Global Warming Subject 1 Nov 2019 | Journal of Physics: Conference Series, Vol. 1387, No. 1
  • Urgensi Pendidikan STEM Terhadap Literasi Sains Mahasiswa Calon Guru Sekolah Dasar 30 October 2019 | Proceedings of The ICECRS, Vol. 2, No. 1
  • Practicality and effectiveness of physics teaching materials based on contextual through inquiry to increase studentsscience literacy 9 November 2019 | Journal of Physics: Conference Series, Vol. 1317
  • Justin F. Shaffer ,
  • Julie Ferguson , and
  • Kameryn Denaro
  • Peggy Brickman, Monitoring Editor
  • Little intervention, big results: intentional integration of information literacy into an introductory-level biology lab course 17 July 2018 | Journal of Biological Education, Vol. 53, No. 4
  • Employers’ ratings of importance of skills and competencies college graduates need to get hired 10 Jun 2019 | Education + Training, Vol. 61, No. 5
  • Instrument Development in Measuring the Scientific Literacy Integrated Character Level of Junior High School Students 1 Jun 2019 | Journal of Physics: Conference Series, Vol. 1233, No. 1
  • Quantifying critical thinking: Development and validation of the physics lab inventory of critical thinking 31 May 2019 | Physical Review Physics Education Research, Vol. 15, No. 1
  • The Figure of the Day: A Classroom Activity to Improve Students' Figure Creation Skills in Biology 1 May 2019 | The American Biology Teacher, Vol. 81, No. 5
  • Chemistry of Sustainable Products: Filling the Business Void in Green-Chemistry Curricula 15 February 2019 | Journal of Chemical Education, Vol. 96, No. 4
  • Validation analysis of physics teaching materials based on contextual through inquiry to increase student’s science literacy 1 May 2019 | Journal of Physics: Conference Series, Vol. 1185
  • Analysis of Student Concepts Understanding in Solving Scientific Literacy on the Topic of Momentum and Impulse 15 March 2019 | Journal of Physics: Conference Series, Vol. 1155
  • Annotated primary scientific literature: A pedagogical tool for undergraduate courses 9 January 2019 | PLOS Biology, Vol. 17, No. 1
  • Challenge based learning to improve scientific literacy of undergraduate biology students 1 Jan 2019
  • Aakanksha Angra and
  • Stephanie M. Gardner
  • Jennifer Knight, Monitoring Editor
  • Scientific reasoning ability does not predict scientific views on evolution among religious individuals 17 March 2018 | Evolution: Education and Outreach, Vol. 11, No. 1
  • Student performance on the Test of Scientific Literacy Skills (TOSLS) does not change with assignment of a low-stakes grade 3 July 2018 | BMC Research Notes, Vol. 11, No. 1
  • Characterizing lab instructors’ self-reported learning goals to inform development of an experimental modeling skills assessment 15 November 2018 | Physical Review Physics Education Research, Vol. 14, No. 2
  • What is the level of scientific literacy among geography students studying bachelor's studies in natural sciences? 12 October 2017 | International Research in Geographical and Environmental Education, Vol. 27, No. 4
  • Assessing inquiry skills of upper secondary school students 8 November 2018 | Journal of Physics: Conference Series, Vol. 1076
  • Development of subject-specific pedagogy based on guided inquiry about newton’s law to improve senior high school students’ scientific literacy ability 12 October 2018 | Journal of Physics: Conference Series, Vol. 1097
  • Effects of Modeling Instruction Professional Development on Biology Teachers’ Scientific Reasoning Skills 8 August 2018 | Education Sciences, Vol. 8, No. 3
  • Developing Shared Vision: A Case Study Documenting a STEM General Education Change Process at a Small Liberal Arts School 27 September 2023 | Journal of College Science Teaching, Vol. 47, No. 6
  • Conductas de búsqueda de información en Internet en estudiantes universitarios de Comunicación Social en Ecuador 25 June 2018 | Revista Latina de Sociología, Vol. 8, No. 1
  • Nancy Pelaez ,
  • Trevor R. Anderson ,
  • Joel K. Abraham ,
  • Edward L. Bartlett ,
  • Cara Gormally ,
  • Carol A. Hurney ,
  • Tammy M. Long ,
  • Dina L. Newman ,
  • Karen Sirum , and
  • Michael T. Stevens
  • Jeff Schinske, Monitoring Editor
  • Linking introductory astronomy students’ basic science knowledge, beliefs, attitudes, sources of information, and information literacy 15 June 2018 | Physical Review Physics Education Research, Vol. 14, No. 1
  • US experiences with STEM education reform and implications for Asia 13 Mar 2018 | International Journal of Comparative Education and Development, Vol. 20, No. 1
  • Peer teaching and information retrieval: the role of the NICE Evidence search student champion scheme in enhancing students’ confidence 11 January 2018 | Health Information & Libraries Journal, Vol. 35, No. 1
  • Teaching Evolution: Criticism of Common Justifications and the Proposal of a More Warranted Set 31 August 2017
  • 2018 | Teaching and Teacher Education, Vol. 75
  • Properness test: Development of an inquiry-based learning module to improve science literacy in thermochemistry subject 1 Jan 2018
  • The development of guided inquiry-based learning devices on photosynthesis and respiration matter to train science literacy skills 15 February 2018 | IOP Conference Series: Materials Science and Engineering, Vol. 296
  • Discovery and Broad Relevance May Be Insignificant Components of Course-Based Undergraduate Research Experiences (CUREs) for Non-Biology Majors 1 Jan 2018 | Journal of Microbiology & Biology Education, Vol. 19, No. 2
  • Liz Stanhope ,
  • Laura Ziegler ,
  • Tabassum Haque ,
  • Marcelo Vinces ,
  • Gregory K. Davis ,
  • Andrew Zieffler ,
  • Peter Brodfuehrer ,
  • Marion Preest ,
  • Jason M. Belitsky ,
  • Charles Umbanhowar , and
  • Paul J. Overvoorde
  • A flexible e‐learning resource promoting the critical reading of scientific papers for science undergraduates 16 June 2017 | Biochemistry and Molecular Biology Education, Vol. 45, No. 6
  • DEVELOPMENT OF A SKILLS-BASED INSTRUMENT TO MEASURE SCIENTIFIC REASONING IN MEDICINE ACROSS DIFFERENT LEVELS OF EXPERTISE 25 June 2017 | Journal of Baltic Science Education, Vol. 16, No. 3
  • Anna Jo Auerbach , and
  • Elisabeth E. Schussler
  • Brickman Peggy, Monitoring Editor
  • Cissy J. Ballen ,
  • Jessamina E. Blum ,
  • Sara Brownell ,
  • Sadie Hebert ,
  • James Hewlett ,
  • Joanna R. Klein ,
  • Erik A. McDonald ,
  • Denise L. Monti ,
  • Stephen C. Nold ,
  • Krista E. Slemmons ,
  • Paula A. G. Soneral , and
  • Sehoya Cotner
  • Daron Barnard, Monitoring Editor
  • Measuring scientific reasoning – a review of test instruments 19 July 2017 | Educational Research and Evaluation, Vol. 23, No. 3-4
  • Development and Validation of Scientific Literacy Scale for College Preparedness in STEM with Freshmen from Diverse Institutions 28 December 2015 | International Journal of Science and Mathematics Education, Vol. 15, No. 4
  • Luanna B. Prevost , and
  • Paula P. Lemons
  • Hannah Sevian, Monitoring Editor
  • How to Assess Your CURE: A Practical Guide for Instructors of Course-Based Undergraduate Research Experiences 1 Dec 2016 | Journal of Microbiology & Biology Education, Vol. 17, No. 3
  • Enhancing Scientific Literacy in the Undergraduate Cell Biology Laboratory Classroom 1 Dec 2016 | Journal of Microbiology & Biology Education, Vol. 17, No. 3
  • EFFICACY OF INQUIRY-BASED LEARNING IN MATHEMATICS, PHYSICS AND INFORMATICS IN RELATION TO THE DEVELOPMENT OF STUDENTS´ INQUIRY SKILLS 25 October 2016 | Journal of Baltic Science Education, Vol. 15, No. 5
  • Todd D. Reeves ,
  • Gili Marbach-Ad ,
  • Kristen R. Miller ,
  • Judith Ridgway ,
  • Grant E. Gardner ,
  • Elisabeth E. Schussler , and
  • E. William Wischusen
  • Marilyne Stains, Monitoring Editor:
  • Does Higher Education Improve Student Scientific Reasoning Skills? 11 December 2014 | International Journal of Science and Mathematics Education, Vol. 14, No. 4
  • Teaching for higher levels of thinking: developing quantitative and analytical skills in environmental science courses 25 April 2016 | Ecosphere, Vol. 7, No. 4
  • Thomas Deane ,
  • Kathy Nomme ,
  • Erica Jeffery ,
  • Carol Pollock , and
  • Gülnur Birol
  • Erin Dolan, Monitoring Editor:
  • Kristy L. Kenyon ,
  • Morgan E. Onorato ,
  • Alan J. Gottesman ,
  • Jamila Hoque , and
  • Sally G. Hoskins
  • James Hewlett, Monitoring Editor:
  • Using a Concept Inventory to Assess the Reasoning Component of Citizen-Level Science Literacy: Results from a 17,000-Student Study 1 Mar 2016 | Journal of Microbiology & Biology Education, Vol. 17, No. 1
  • Development of a framework for graph choice and construction 1 Mar 2016 | Advances in Physiology Education, Vol. 40, No. 1
  • David I. Hanauer , and
  • Graham Hatfull
  • Elisa Stone, Monitoring Editor
  • Arctic Climate Connections Curriculum: A Model for Bringing Authentic Data Into the Classroom 14 June 2018 | Journal of Geoscience Education, Vol. 63, No. 3
  • Students' usability evaluation of a Web‐based tutorial program for college biology problem solving 1 June 2015 | Journal of Computer Assisted Learning, Vol. 31, No. 4
  • Brian A. Couch ,
  • William B. Wood , and
  • Jennifer K. Knight
  • Alison Gammie, Monitoring Editor
  • Undergraduate research experiences: Impacts and opportunities 6 Feb 2015 | Science, Vol. 347, No. 6222
  • Using Scientific Visualization to Enhance the Teaching and Learning of Core Concepts 24 April 2015
  • The Development and Implementation of an Instrument to Assess Students’ Data Analysis Skills in Molecular Biology 1 Dec 2014 | Journal of Microbiology & Biology Education, Vol. 15, No. 2
  • Annwesa P. Dasgupta ,
  • Trevor R. Anderson , and
  • Nancy Pelaez
  • Erin Dolan, Monitoring Editor
  • Science Cafés: Engaging Scientists and Community through Health and Science Dialogue 9 April 2014 | Clinical and Translational Science, Vol. 7, No. 3
  • Complementary Approaches to Teaching Nature of Science: Integrating Student Inquiry, Historical Cases, and Contemporary Cases in Classroom Practice 17 April 2014 | Science Education, Vol. 98, No. 3
  • Rebecca M. Price ,
  • Tessa C. Andrews ,
  • Teresa L. McElhinny ,
  • Louise S. Mead ,
  • Anna Thanukos , and
  • Kathryn E. Perez
  • Michèle Shuster, Monitoring Editor
  • Application of the Test of Scientific Literacy Skills in the Assessment of a General Education Natural Science Program 1 Mar 2014 | The Journal of General Education, Vol. 63, No. 1
  • How Students Think about Experimental Design: Novel Conceptions Revealed by in-Class Activities 14 December 2013 | BioScience, Vol. 64, No. 2
  • 2014 | The Journal of General Education, Vol. 63, No. 1
  • A.-M. Hoskinson ,
  • M. D. Caballero , and
  • J. K. Knight
  • Eric Brewe, Monitoring Editor
  • Context Matters: Volunteer Bias, Small Sample Size, and the Value of Comparison Groups in the Assessment of Research-Based Undergraduate Introductory Biology Lab Courses 1 Jan 2013 | Journal of Microbiology & Biology Education, Vol. 14, No. 2



