1. Introduction

Effective teaching starts with well-written, measurable learning objectives (; ). Learning objectives communicate the focus of instruction, including the content, scientific practices, and affective outcomes the learner should gain after an activity, class, or program (). Marine science educators are in a unique position because, more often than not, they are not beholden to specific learning objectives. This reality differs markedly from that of their colleagues teaching formal K-12 learners, whose learning objectives must align with state or national education standards. While it is common for marine educators, especially those working closely with K-12 students and schools, to use standards to guide educational programming, the focus of instruction is often informed by works such as Ocean Literacy: The Essential Principles and Fundamental Concepts of Ocean Science for Learners of All Ages (2020). Ocean literacy, including recent expansion initiatives () and efforts to align Ocean Literacy to performance standards (), defines the concepts and principles for marine science instruction. However, ocean literacy requires more than just content understanding (). Learners must also be able to do ocean science and care about ocean ecosystems. Yet there are no existing policies or resources that help marine educators prioritize the scientific practices and affective outcomes (e.g., science identity, science belonging) of their learners. Therefore, it is critical that marine science educators consider the extent to which their learning objectives focus on aspects other than content acquisition, such as the scientific practices and affective outcomes of their learners.

We are not the first to advocate for ocean literacy to include more than just content understanding. Cudaback () recognized that a major limitation of the Ocean Literacy initiative was its sole focus on content acquisition. While the Ocean Literacy Principles often ask learners to understand and remember content, they rarely ask them to apply, evaluate, or, in other words, do science. These science and engineering practices (SEPs) are a critical component of K-12 science learning. In fact, the Next Generation Science Standards () provide eight essential SEPs that K-12 learners must be able to do. Many K-12 teachers use these eight practices to develop learning objectives, and it is imperative that marine educators evaluate the extent to which their instruction focuses on them, too. Additionally, marine educators strive to change the affective outcomes of their learners (; ). Marine science curricula and educational programs are specifically designed to get learners to care about, appreciate, and become stewards of the ocean (). However, affective outcomes typically fall outside most content standards and Ocean Literacy Principles. In this article, we argue it is essential that marine educators write learning objectives that consider the affective outcomes of their learners.

Unfortunately, educators from all walks of life struggle to write effective learning objectives. Learning objectives must be written around a visible performance (), and educators’ struggles are usually attributed to issues with assessment, leading to unmeasurable and low-cognitive-level learning objectives (). One way to ensure learning objectives are measurable is to use action verbs that describe what students should be able to do with the knowledge or skills they gain. Bloom’s taxonomy of cognitive skills provides a useful framework for writing learning objectives with action verbs at a range of cognitive complexities (). This range of cognitive complexity benefits all students. For example, Zohar and Dori () found that instruction fostering higher-order cognitive skills (i.e., a focus on analysis and interpretation rather than on remembering and understanding) benefited both high- and low-performing students. Taken together, marine science educators must consider the measurability and cognitive level of their learning objectives.

We have outlined the importance of learning objectives in marine science education. Yet, there is no training to support marine educators in writing effective learning objectives. It is likely that marine educators struggle to write effective learning objectives, which has significant implications for their learners and for the assessment of their educational efforts. Therefore, the purpose of this research is to present an evidence-based tool to help marine science educators evaluate their learning objectives. In this article, we outline the Marine Science Learning Objectives Tool (MS-LOT), which builds on the extensive documents that help educators teach toward the Ocean Literacy Principles by including: 1) scientific practices and affective outcomes important to supporting science learning and 2) consideration of learning objective measurability and cognitive level. We demonstrate how the MS-LOT uncovers gaps, inconsistencies, and issues with learning objectives by applying it to our own learning objectives.

2. Methods

We created an evidence-based Marine Science Learning Objectives Tool (MS-LOT) to analyze learning objectives in marine science education and used it to analyze University of Georgia (UGA) Marine Extension and Georgia Sea Grant learning objectives. We start with a description of MS-LOT, defining relevant components and presenting examples as needed. Then, we briefly describe UGA Marine Extension and Georgia Sea Grant’s educational programs and the collection of learning objectives analyzed for this case study.

2.1 Description of MS-LOT

MS-LOT characterized two broad components of learning objectives: focus and assessment (Figure 1). First, it distinguished learning objectives based on their focus; in other words, MS-LOT characterized what each learning objective asks learners to do. Three focal areas common in science education were identified from the literature (): content acquisition, competency development, and affective outcomes. MS-LOT assigned each learning objective to one of the three focal areas. Second, MS-LOT characterized learning objectives based on assessment, specifically the measurability and cognitive level of each learning objective. Below, we describe each component of MS-LOT, providing examples when applicable.

Figure 1 

The Marine Science Learning Objectives Tool (MS-LOT) created to analyze learning objectives in marine science education.
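Although MS-LOT coding was done by hand, readers planning a similar analysis may find it useful to see how a single coded objective could be recorded. The sketch below (in Python) is purely illustrative and not part of the published tool; the field names and the example coding are our own.

```python
# Illustrative sketch only; not part of the published MS-LOT. One way to record
# the MS-LOT coding (focus + assessment) for a single learning objective.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CodedObjective:
    text: str                 # the learning objective as written
    focus: str                # "content", "practice", or "affective"
    alignment: Optional[str]  # e.g., an Ocean Literacy Principle or an NGSS SEP; None if unaligned
    measurable: bool          # does the action verb describe an observable performance?
    bloom_levels: List[str]   # Bloom's levels of the action verbs; empty if no Bloom's verb present

# Example record, using a learning objective discussed later in the Methods
example = CodedObjective(
    text="Learners will be able to construct a hypothesis and defend why it was acceptable",
    focus="practice",
    alignment="Asking Questions and Defining Problems",
    measurable=True,
    bloom_levels=["Apply (3)", "Evaluate (5)"],
)
```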

2.2 Focus

Learning objectives establish the 1) content, 2) practices, and/or 3) affective outcomes for the learners (). MS-LOT considers the frequency with which a set of learning objectives focuses on each of these three outcomes, using established resources to inform our coding when applicable. We describe each below.

Content-focused

Content-focused learning objectives center learning on the acquisition of scientific information and the understanding of scientific concepts. MS-LOT characterized content-focused learning objectives according to the Ocean Literacy Essential Principles and Fundamental Concepts (Appendix C). For instance, the learning objective “Students will be able to identify adaptations of sessile invertebrates” was coded as content-focused and aligned with Ocean Literacy Principle 5. Some content-focused learning objectives were not connected to an Ocean Literacy Principle. In these cases, the learning objective was coded as “content-focused” with “No Ocean Literacy Principle”.

Practice-focused

Practice-focused learning objectives center learning on skill development and engagement in scientific practices. MS-LOT characterized practice-focused learning objectives according to the Next Generation Science Standards’ eight science and engineering practices (SEPs) (Table 1). For instance, the learning objective “Students will be able to construct a hypothesis” was coded as a practice-focused learning objective and aligned with the NGSS SEP “Asking Questions and Defining Problems”.

Table 1

Next Generation Science Standards (NGSS) Science and Engineering Practices used to characterize learning objectives in the MS-LOT.


NGSS SCIENCE AND ENGINEERING PRACTICES (SEPS)

Asking Questions and Defining Problems

Developing and Using Models

Planning and Carrying Out Investigations

Analyzing and Interpreting Data

Using Mathematical and Computational Thinking

Constructing Explanations and Designing Solutions

Obtaining, Evaluating, and Communicating Information

Engaging in Argument from Evidence

Affective-focused

Affective-focused learning objectives center learning on feelings, emotions, attitudes, and values rather than knowledge or skills. Affective-focused objectives are an important goal of informal learning. Specifically, affective objectives have been measured in out-of-school settings, including science self-efficacy (), the role emotion plays during science learning center visits (), aquarium visitors’ perceptions and attitudes about the importance of biodiversity (), and promoting eco-conscious behavioral changes (). For example, the learning objective “Learners will develop a deep appreciation of the ocean” was coded as an affective-focused learning objective.

2.3 Assessment

Well-written learning objectives are critical for assessment as they should contain an observable, measurable verb that establishes the cognitive level (). The MS-LOT, therefore, characterizes learning objectives based on 1) measurability and 2) cognitive level. We describe both components of MS-LOT below and use examples when applicable.

Measurability

The action verb of the learning objective dictates whether it is measurable (). The MS-LOT used two published resources to characterize the measurability of a learning objective. Schoepp () identified a number of action verbs as unmeasurable (e.g., “learn”, “know”, and “explore”). Stanny (), on the other hand, provided a meta-analysis that characterized measurable verbs according to Anderson and Krathwohl’s () revised Bloom’s taxonomy. For example, “Learners will be able to construct a hypothesis” was coded as a measurable learning objective because the action verb “construct” is measurable and observable. In contrast, the learning objective “Learners will be able to learn about plankton” was coded as unmeasurable because the action verb “learn” is not observable. In other words, it is not possible to observe whether someone truly learned something. However, it is possible to observe – either through informal conversations with students or formal assessments – whether someone can explain a concept or predict an outcome.
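A minimal sketch of how this measurability decision could be automated as a first pass is shown below. The verb lists are small illustrative subsets drawn from the examples above and from our results, not Schoepp’s or Stanny’s complete lists; verbs outside both lists still require human judgment.

```python
# Hedged sketch: a first-pass measurability check based on the action verb.
# The verb sets below are illustrative subsets only, not complete published lists.
UNMEASURABLE_VERBS = {"learn", "know", "understand", "explore"}
MEASURABLE_VERBS = {"identify", "describe", "construct", "defend", "explain", "predict"}

def code_measurability(action_verb: str) -> str:
    verb = action_verb.lower()
    if verb in UNMEASURABLE_VERBS:
        return "unmeasurable"
    if verb in MEASURABLE_VERBS:
        return "measurable"
    return "review manually"  # verbs outside both lists need a human coder

print(code_measurability("construct"))  # measurable
print(code_measurability("learn"))      # unmeasurable
```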

Cognitive level

Along with measurability, the action verb also indicates cognitive level. The MS-LOT characterized the frequency of learning objectives across different cognitive levels using Anderson and Krathwohl’s () revised Bloom’s taxonomy action verbs (; Appendix A). It should be noted that this tool could be adapted for other taxonomies of learning, such as Fink’s () Taxonomy of Significant Learning. Occasionally, a learning objective contains multiple Bloom’s action verbs. In these instances, all action verbs were coded for cognitive level. For example, in the learning objective “learners will be able to construct a hypothesis and defend why it was acceptable”, “construct” was coded as Apply (Level 3) and “defend” was coded as Evaluate (Level 5). Additionally, learning objectives that do not include a Bloom’s verb are coded as “No Bloom’s verb present”. For example, in the learning objective “students should be able to know about microplastics”, the verb “know” is not a Bloom’s verb and was coded accordingly.
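The cognitive-level coding can be sketched the same way. The verb-to-level mapping below is an assumed, illustrative subset of the revised Bloom’s taxonomy (the full verb list is in Appendix A); it simply reproduces the two worked examples above.

```python
# Hedged sketch: coding Bloom's cognitive levels from action verbs.
# The verb-to-level mapping is an illustrative subset of the revised Bloom's taxonomy.
from typing import List

BLOOMS_VERBS = {
    "identify": "Remember (1)",
    "describe": "Understand (2)",
    "construct": "Apply (3)",
    "compare": "Analyze (4)",
    "defend": "Evaluate (5)",
    "design": "Create (6)",
}

def code_cognitive_levels(objective: str) -> List[str]:
    # Every Bloom's verb found in the objective is coded; if none are found,
    # the objective is coded as having no Bloom's verb present.
    words = objective.lower().replace(",", " ").split()
    levels = [BLOOMS_VERBS[w] for w in words if w in BLOOMS_VERBS]
    return levels or ["No Bloom's verb present"]

print(code_cognitive_levels(
    "Learners will be able to construct a hypothesis and defend why it was acceptable"))
# ['Apply (3)', 'Evaluate (5)']
print(code_cognitive_levels("Students should be able to know about microplastics"))
# ["No Bloom's verb present"]
```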

2.4 Case study: Description of educational programs and learning objectives

We provide educational programming at our Marine Education Center and Aquarium as part of a larger unit within UGA Marine Extension and Georgia Sea Grant. We used the learning objectives from the programs offered by the Marine Education Center and Aquarium to show how MS-LOT can be used in marine science education. It is, therefore, important to briefly describe our educational programs and the sample of learning objectives used in this case study.

The UGA Marine Education Center and Aquarium has provided “hands-on, feet-in” experiential education programming for 5th through 12th grade students for the past 40 years. On average, UGA marine educators provide educational programming on their campus for 75 school groups per year (ranging from 15 to 150 students per group) and currently offer 29 educational programs to 5th–12th grade visitors that were created by current and previous faculty. Programs are offered at a cost to all schools in our region, though costs are sometimes offset with funding. Educational programs range in length (45 minutes to 6 hours) and location (indoor vs. field-based). Indoor programs are run out of the aquarium, auditorium, and laboratory-based classrooms, while outdoor programs are either land-based (e.g., Salt Marsh Walks, Maritime Forest Hikes) or boat-based (e.g., Barrier Island Exploration, Estuary Trawl). To allow a more fine-grained analysis, we grouped learning objectives by location in this study. A brief description and the location of the educational programs analyzed for this research are provided in Appendix B.

Learning objectives (N = 207) were collected from our 29 educational programs for 5th–12th grade students. For the sake of this analysis, it was important to distinguish between indoor and outdoor programs. We used MS-LOT to reevaluate and characterize our learning objectives, as they had not been revisited since 2016, when they were loosely written to align with state standards and Ocean Literacy Principles.
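To give a sense of how the case-study summaries in the Results can be produced, the sketch below tallies coded objectives by focus and measurability, split by location. The records shown are invented placeholders for illustration, not our actual coded data.

```python
# Hedged sketch: simple tallies over coded objectives. The three records below
# are invented placeholders used only to demonstrate the calculation.
from typing import Dict, List

coded: List[Dict] = [
    {"location": "indoor",  "focus": "content",  "measurable": False},
    {"location": "outdoor", "focus": "practice", "measurable": True},
    {"location": "outdoor", "focus": "content",  "measurable": False},
]

def percent_where(rows: List[Dict], key: str, value) -> float:
    """Percentage of rows whose `key` equals `value`."""
    return 100 * sum(1 for r in rows if r[key] == value) / len(rows)

print(f"Content-focused: {percent_where(coded, 'focus', 'content'):.0f}%")

outdoor = [r for r in coded if r["location"] == "outdoor"]
print(f"Unmeasurable (outdoor): {percent_where(outdoor, 'measurable', False):.0f}%")
```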

3. Results

MS-LOT revealed gaps in instruction and issues with assessment within our learning objectives (Figure 2). We describe the results obtained from each component of MS-LOT below.

Figure 2 

Results from the case study evaluating our own learning objectives.

3.1 Focus

Sixty-one percent of the learning objectives were content-focused, 39% were practice-focused, and 0% were affective-focused. The latter means that not one of the 207 learning objectives analyzed focused on affective outcomes of our learners (e.g., appreciating or caring about the ocean).

When comparing learning objectives from indoor and outdoor programs, we found that learning objectives from outdoor programs were slightly more likely to be content-focused (64%) than learning objectives from indoor programs (58%).

Content-focused

Content-focused learning objectives were characterized using the Ocean Literacy Principles. Approximately three-quarters (78%) of the 207 learning objectives aligned with an Ocean Literacy Principle (Figure 3). Of the learning objectives connected to an Ocean Literacy Principle (N = 161), the vast majority (67%) focused on Essential Principle 5: “The ocean supports a great diversity of life and ecosystems.” Learning objectives aligned less frequently with Essential Principles 1 (21%), 2 (12%), and 6 (29%). The remaining three principles (Essential Principles 3, 4, and 7) represented less than 5% of the learning objectives analyzed, with only a single learning objective aligned with Essential Principle 4.

Figure 3 

Percentage of learning objectives aligned with the seven Ocean Literacy Principles.

Nearly twice as many learning objectives from indoor programs (27%) as from outdoor programs (16%) were not aligned with an Ocean Literacy Principle. Other observable differences were noted for Principle 6 (29% outdoor vs. 15% indoor) and Principle 7 (7% outdoor vs. 1% indoor; Figure 4).

Figure 4 

Percentage of indoor and outdoor learning objectives aligned with the seven Ocean Literacy Principles.

Practice-focused

Practice-focused learning objectives were characterized using the Next Generation Science Standards’ eight science and engineering practices. Less than half of the learning objectives were practice-focused (39%). Of the practice-focused learning objectives (N = 88), 88% were connected to one of four NGSS practices: planning and carrying out investigations (27%); obtaining, evaluating, and communicating information (25%); constructing explanations (24%); and analyzing and interpreting data (13%). The remaining four practices each represented 5% or less of our learning objectives: asking questions (3%), developing and using models (1%), using mathematics and computational thinking (5%), and engaging in argument from evidence (0%).

Similar trends were observed for indoor and outdoor practice-focused learning objectives. In fact, the same three practices were most common for both indoor and outdoor programs: 1) obtaining, evaluating, and communicating information (32% indoor, 16% outdoor), 2) constructing explanations (26% indoor, 21% outdoor), and 3) planning and carrying out investigations (20% indoor, 39% outdoor).

Affective-focused

Interestingly, not a single learning objective focused on affective outcomes of learners. We explore how we use this finding to guide future instruction in our Discussion.

3.2 Assessment

MS-LOT characterized assessment in terms of measurability and cognitive level. We present findings from our learning objectives, noting differences between outdoor and indoor programs.

Measurability

Nearly two-thirds (64.4%) of the learning objectives were unmeasurable as written (Figure 5). Common unmeasurable verbs were understand (n = 81), learn (n = 16), explore (n = 4), and be introduced (n = 4). Additionally, indoor learning objectives were considerably more likely to be unmeasurable (66%) than outdoor learning objectives (46%; Figure 6).

Figure 5 

Percentage of unmeasurable learning objectives.

Figure 6 

Percentage of unmeasurable learning objectives across location.

Cognitive Level

Learning objectives were roughly four times as likely to be lower-order (63%) as higher-order (15%; Figure 7). Additionally, 22% of our learning objectives did not contain a Bloom’s verb. The four most common action verbs were understand (non-Bloom’s verb; n = 81), identify (Bloom’s verb; n = 18), learn (non-Bloom’s verb; n = 16), and describe (Bloom’s verb; n = 14).

Figure 7 

Bloom’s cognitive level of all learning objectives.

Outdoor learning objectives were slightly more likely to be higher-order (18%) than indoor learning objectives (12%; Figure 8). Additionally, indoor learning objectives were nearly twice as likely not to contain a Bloom’s verb (27.4%) as outdoor learning objectives (14.7%).

Figure 8 

Bloom’s cognitive level of indoor and outdoor learning objectives.

4. Discussion

We presented the evidence-based Marine Science Learning Objectives Tool (MS-LOT) for evaluating learning objectives in marine science education. MS-LOT centers learning objectives as a critical, yet often overlooked, aspect of effective marine science teaching. Learning objectives communicate to learners, colleagues, and partners what will be learned, and they also form the basis of instruction and assessment (). MS-LOT provides marine educators with the means to evaluate their learning objectives according to their 1) focus and 2) assessment.

First, MS-LOT characterized the focus of a learning objective as content-focused, practice-focused, or affective-focused. A number of resources in marine science education help educators identify content to guide their instruction (e.g., the Ocean Literacy Essential Principles and Fundamental Concepts for K-12 (2020)). Indeed, excellent resources outline the content required to achieve ocean literacy. However, MS-LOT pushes educators to think past content and consider the practices and affective outcomes that are needed to promote ocean literacy. This is in direct response to prominent critiques of science literacy () and ocean literacy in recent decades ().

We demonstrated how MS-LOT can uncover gaps in marine science instruction, using our learning objectives as an example. Most surprisingly, we found no learning objectives focused on affective outcomes. Affective outcomes are not inconsequential for learners in science; in fact, promoting them is often a core goal of marine education programs (; ). Second, using MS-LOT revealed that our learning objectives showed a strong propensity for content acquisition. In and of itself, focusing instruction on content acquisition is not a concern. Yet, MS-LOT revealed gaps in instruction related to the Ocean Literacy Principles. Even though the vast majority of the learning objectives analyzed (78%) aligned with an Ocean Literacy Principle, instruction tended to concentrate on a single principle at the expense of others. As we revise these learning objectives, it is critical to focus instructional effort on the principles typically excluded (Essential Principles 3, 4, and 7). Finally, more than a third of the learning objectives were practice-focused. This finding was encouraging. We want learners to do science, and many of the learning objectives analyzed reflected this. However, using the MS-LOT revealed certain NGSS SEPs that were neglected or altogether excluded (asking questions, developing and using models, using mathematics and computational thinking, and engaging in argument from evidence). For instance, not one learning objective was written around argumentation, an increasingly important skill in science education. MS-LOT points to gaps in content, practices, and affective outcomes that we will use to improve our instruction.

Second, MS-LOT considered the assessment of learning objectives, specifically their measurability and cognitive level. It is critical that learning objectives are measurable for assessment () and written at various cognitive levels (). Yet, we found that the vast majority of our learning objectives were unmeasurable and written at low cognitive levels. We will use findings from MS-LOT to change unmeasurable verbs (e.g., “learn” and “understand”) to observable and assessable verbs (e.g., “identify” and “apply”). In short, MS-LOT pointed to significant issues for the assessment of our instruction.

Taken together, MS-LOT showed our team that the words used to define learning objectives matter. Words matter for the teachers, students, and parents/guardians who attend marine science programs. It is critical that marine educators 1) consider how the experiences and explorations learners have in their programming overlap with what is happening inside the classroom and 2) communicate this overlap clearly to teachers and parents/guardians. Words matter for assessment. To show evidence that learners are benefiting from learning experiences, marine educators must write learning objectives that are measurable and span various cognitive levels. Finally, words matter for instruction. MS-LOT pointed to an overemphasis on content acquisition within our programs. This realization prompted us to reimagine our content-heavy programs to center science practices, affective outcomes, and exploration as the heart of what learners do.

An unexpected outcome of our analysis was the tension it revealed amongst our team and within marine science education as a field. We started to ask questions like: Don’t we want to expand outside what is expected in a formal classroom? Shouldn’t we emphasize exploration and connection with the environment in addition to aligning with classroom standards and practices? We conceded that many marine education programs are exploratory in nature and thus not readily defined by a specific objective. However, the goal of this paper is not to create boundaries on what is taught in marine science education. Instead, it is to challenge educators to write effective learning objectives that consider what many K-12 learners are doing and learning inside the classroom. Even when marine science learning objectives are broad and exploratory, the MS-LOT can still help educators write them effectively to guide instruction and assessment.

We believe that the MS-LOT will be useful to all marine educators, especially those working with K-12 learners. As a community, marine educators have done an exceptional job identifying the content needed to be ocean literate. However, little training and few resources exist to help marine educators write effective learning objectives. MS-LOT builds on existing resources and provides marine educators with tools to visualize their instruction. It is our hope that marine educators will use this tool to revisit their learning objectives, much as we report doing in our case study.

Additional File

The additional file for this article can be found as follows:

Appendices

Appendices A to C. DOI: https://doi.org/10.5334/cjme.92.s1