18
Nov 13

Science Teaching Practices – Aubree

 

Crawford (2007)
“Learning to Teach Science as Inquiry in the Rough and Tumble of Practice”

The most interesting part of this piece was the many reasons that the five pre-service teachers did or did not incorporate inquiry into their teaching.  Even though the field of science education has been promoting inquiry-based learning for decades, many classrooms do not reflect this change.  This is a much more complicated problem than I had originally expected.  The world outside of the universities can affect even the most well-prepared inquiry teachers.

I liked the definition of inquiry used in this paper.  I’ve read other papers that expand upon the many definitions of inquiry (much like Lampert’s work with practice), but I am particularly fond of this one:

“…students in K-12 science classrooms develop abilities to do scientific inquiry, gain understandings about scientific inquiry, and that teachers facilitate students in acquiring deep understanding of science concepts through inquiry approaches” (p. 614).  I was confused by the next sentences regarding student outcomes.  How can we require standards that enforce students’ “appreciating the diverse ways in which scientists conduct their work”?  This is an assessment nightmare.

Lampert (2009)
“Learning Teaching in, from, and for Practice: What Do We Mean?”

This was a very interesting and informative view of the nuances in the word practice. The part of speech, the singular versus plural form of the word, and the grain size or subject of the sentence become vital in distinguishing the definitions.  The author concludes that a consensus about this term needs to be reached in order to achieve a common language, a prerequisite for collectively solving the problem of “learning the work of teaching” (p. 12).  I’m not sure that I agree. That’s the beauty of the English language, yes?  We can use one word to mean many things.  When speaking about The Practice of Teaching, I don’t think it is appropriate to rely on a definition that is synonymous with rehearsal.  Why can’t all of these definitions be true some of the time?

“When educators use the term best practices, another question immediately follows: best to achieve what goals?” (p. 12)  This is a key question that needs to be revisited when discussing any teacher’s practice or teaching practices.  Differences in practices closely reflect teaching and learning values, like the learning theories we’ve been discussing all semester.  The kinds of activities that happen in research and in leading a classroom are based on an individual’s beliefs about learning.  So, these questions become vital for understanding the context of the learning: what is important to this person?  What goals are they trying to achieve?

Windschitl et al. (2012)
“Proposing a Core Set of Instructional Practices and Tools for Teachers of Science”

This reading introduced me to some new vocabulary: (1) high-leverage instructional practices, (2) ambitious teaching, and (3) teaching practices vs. teaching moves.  I admire the authors’ intent to identify a core set of HLPs, but I think this is a very ambitious task.  My experience in science classrooms is limited to being the student or a teaching assistant, so I am not confident in my ability to critique the Model-Based Inquiry Framework as applied to multiple classrooms that differ in content, in teacher views of learning, and in student background.

Do the NGSS solve the problem of students not being able to identify the “Big Ideas” in science?  Can the core disciplinary ideas and cross-cutting concepts be used in the Model-Based Inquiry Framework to organize teachers’ learning of practices as well as their subsequent teaching?

(Apologies to my group for the late post.  I had it saved to a Word document and forgot to post last Friday.  I have cupcakes to ease the pain…)


18
Nov 13

Science Practice and Teaching – Ryan

I read the Lampert, Windschitl, and Aguiar articles for this week because they seemed to be best suited to my interests and most helpful for actual teaching.

I started with the Lampert article because it was all about defining practice, and it seemed like it would be a good idea to understand the definition of this week’s topic from the beginning.  Of course Lampert showed that there was no consensus on the definition of practice (and that the most commonly used one, that which is opposed to theory, is probably the least useful), but a discussion of the possible definitions was still useful.  I noticed that I read the same three articles as KeriAnn, so I will try to do the same as she did in identifying the learning perspectives of each author.  I’m not sure about taking the leap that KeriAnn did and saying that Lampert follows the situative perspective.  She does a good job of keeping her own perspective out of the way while summarizing what everyone else thinks about the subject.  She may very well be situative, but I don’t think there’s sufficient evidence to say.  On the other hand, there does seem to be evidence that most of the people talking about teaching practice do follow the situative perspective.  In particular, many people seemed concerned by the issue of transfer, and I believe that a belief in the nonexistence of transfer is the driving force behind the theory-practice dichotomy of the first definition.  In her conclusion, Lampert says:

Perhaps the most important questions raised by different uses of the word practice in relation to learning teaching is whether practice is meant to be something that an individual does and learns from other individuals or something created and maintained by a collective and learned by participation in that collective. (p. 12)

These two meanings seem to correspond with the cognitive and situative perspectives, suggesting this whole debate about the definition of practice really boils down to our original debate of cognitive versus situative.

Windschitl et al. (2012) wanted to reform teacher education by starting with a set of practices that seemed useful, a “candidate core,” and then refining that set of practices based on research.  In constructing their candidate core, they underwent a change of framework, although both frameworks are situative.  Initially, Windschitl et al. seemed to follow a cognitive apprenticeship type of framework, in which the novice teachers learned from and were given tools by the master researchers.  However, they discovered that the novices formed a community that reshaped the tools given to them to suit their needs, rather than fully relying on the masters.  I don’t think I know enough about the different forms of situative frameworks to definitively name this, but since there is now more of a focus on communities, I would guess this is a communities of practice framework.  I was a bit confused by their Big Ideas tool.  At first it seems obvious, but according to the authors, a railroad car imploding and a person doing a wall flip are Big Ideas, but natural selection and force and motion aren’t.  I would think it should be the opposite.

Aguiar et al. (2009) say from the very beginning that they are using a framework based on socio-cultural theory, so their perspective is obviously situative.  This makes me wonder if all writing about practices comes from the situative perspective or if that just happened to apply to the articles I picked.  I would guess that the majority, if not all, are situative because practices make more sense in specific settings.

 


17
Nov 13

Science Practices & Teaching – Julianne

Sorry this is late. Migraines and looking at computer screens are not a good combination.

After reading the abstracts of all of the papers I selected Manz (2012), Windschitl, Thompson, Braaten, & Stroupe (2012), and Wu & Krajcik (2006) because their topics seemed to align most closely with my interests—investigations carried out in an outdoor “wild backyard area” environment (Manz, 2012), instructional practices and tools for science teachers (Windschitl, Thompson, Braaten, & Stroupe, 2012), and students’ inscription practices in science education (Wu & Krajcik, 2006).

The two science practices papers (Wu & Krajcik, 2006; Manz, 2012) were more similar than I had anticipated, but it was interesting to see how the different studies were carried out and interpreted. The two veteran teachers incorporating inscription practices in their inquiry-based classrooms (Wu & Krajcik, 2006) had attended workshops on developing project-based approaches to teaching, had collaborated with each other to develop inquiry-based learning projects, had used learning tools to enhance their curriculum, and had worked extensively with university researchers prior to the current study. Similarly, the teacher in Manz’s (2012) study had several decades of teaching experience and had “participated in significant professional development” (p. 1076) to develop instruction around modeling activity and investigation. Both studies, and the science units that were their foci, were conducted over the course of at least one school year (eight months in Wu & Krajcik, 2006; the first year of two in Manz, 2012), and classroom activities were iterative. Both classrooms took the mile-deep, inch-wide approach to science education, where one large topic occupied the school year: water quality (Wu & Krajcik, 2006) and plant reproduction success in a backyard ecosystem (Manz, 2012). In both cases the topics were ones to which the students could make direct, personal connections.

Scaffolding was a key element of the classrooms studied by Wu & Krajcik (2006): “the teachers provided substantial scaffolds to support the inquiry process” (p. 66). The authors define a scaffold as “assistance that allowed students to accomplish tasks they could not do alone” (p. 70) and stated that scaffolding “faded out” (p. 74) over the course of the project as students gained confidence, knowledge, and ability. The scaffolds had allowed the students’ levels of competence to increase over time.
Although the term scaffolding was not used in Manz (2012), I felt there were indications that scaffolding was integral to the studied learning environment. The term support was frequently used when scaffolding seemed to be occurring, e.g., “instructional support” (p. 1076), working with the teacher to develop “supportive classroom discourse structures to support student agency” (p. 1081), and the teacher “supported students to make and consider claims” (p. 1091).

Wu & Krajcik’s paper on student inscriptions was a case study. The authors looked at the development and use of inscriptions (graphs, data tables, maps, scales, models, etc.) by students in the guided inquiry environment the teachers had created. While the “social practical perspective” (p. 64) taken by the authors is clearly a situative perspective, the authors do acknowledge that cognitive development may have played a part in the recorded progression of student inscription practices, as did the “multiple exposures to inscriptional activities” and the “teachers’ ongoing scaffolds” (p. 90). I thought it was particularly interesting that the authors stated they used a “naturalistic approach” (p. 64) for their study. Can a naturalistic approach be another way of saying a realistic approach? Are they acknowledging that there may be several ways to interpret their study? I rather hope so. I really liked this paper and the learning environments being highlighted. The teachers seemed to be quite diligent in creating classrooms where students were able to progress, gain confidence, engage with each other and with the teachers, and learn and expand inscription skills that were immediately applicable to the year-long lesson. I would really like to know if there has been any follow-up work with these students to see if the skills they gained during the water quality units transferred to other classrooms and subjects.

Manz’s (2012) design study also took a situative perspective, acknowledged the importance of students’ prior knowledge, and looked at designing learning environments that could support the joint development of scientific knowledge and practices. While there were several activities in this study that I liked (e.g., getting the kids outside, teaching them how to observe, using books as reference materials and learning tools, and the multi-seasonal nature of the investigations), I was somewhat confused by the concept of a design study. How objective can a design study be? “Because this was a design study, the research team developed conjectures about learning before instruction began, then refined them iteratively in local cycles of planning, enacting lessons, making conjectures about student learning, and redesign” (p. 1076). “As this was a design study, the design itself necessarily unfolded in relation to students’ participation in instruction, our developing understanding of student thinking, and the teacher’s needs and capabilities” (p. 1078). Does a design study run the risk of changing to match the results?

The science teaching practice article by Windschitl, Thompson, Braaten & Stroupe (2012) fit well with the two science practice articles I read. Preparing education students to engage in ambitious teaching seems like it should be the rule rather than the exception for science education as well as other subject areas. Several themes seemed to run through the article: theory of action, theory of support, theory of mediation, theory of design and implementation, developmental progressions, performance progressions, tool sets for scaffolding students’ scientific discourse and activity, and scaffolding for different forms of reasoning in students. Windschitl et al.’s discussion of the use of inscriptions and iteration seemed to fit well with the Wu & Krajcik (2006) article, while their recognition of the importance of prior knowledge, i.e., that a “teacher uses knowledge of students’ intellectual and experiential resources together with knowledge of the target science phenomenon to anticipate ways to respond to the thinking of others” (p. 888), had me making connections to Manz (2012). The most striking aspect of the article, to me, was the recognition that Windschitl et al.’s assumption that they, the researchers, would create the resources which the novices would then implement in the classroom was incorrect. It was delightful to read of their delight in recognizing that the novice teachers acted as a community and used their insight to adapt, modify, and co-design tools to support ambitious teaching. Perhaps this revelation was the impetus for recognizing the concept of empathy as integral to instruction? “The notion of caring about students as human beings was also infused into our work together—students are more than clients or objects of instruction. Indeed our work in classrooms has brought us to the realization that the most rigorous and equitable forms of instruction are unattainable if the teacher does not have a caring relationship with students” (Windschitl et al., 2012, p. 898).


16
Nov 13

Science Practice & Teaching – KeriAnn

As I was completing this week’s readings, I tried to focus on identifying the authors’ learning perspectives. Although I feel somewhat confident with doing this now, I definitely had some trouble this week.

The first article that I read was by Aguiar, Mortimer, and Scott (2009). I selected this article because I am very interested in students’ development and use of questions. As a classroom teacher, my students did not ask very many wonderment questions, which I now realize was probably due to the classroom environment. Luckily, I had the opportunity to work with a very talented teacher through my learning progression research assistantship, which allowed me to experience how students develop wonderment questions and how these questions can be addressed. I found it interesting that this teacher used similar strategies to the ones that Aguiar, Mortimer, and Scott identified. Before completing this reading, I expected the authors to follow a cognitive perspective. I feel that a student develops a wonderment question to make sense of their current understanding of a particular concept. I am not sure if this is in agreement with the authors’ view. Aguiar, Mortimer, and Scott state that, “meaningful learning of science involves the student in making connections between ways of thinking about events” (p. 177). The authors purposely italicized ‘connections,’ which I find to be more confusing. Are these connections that the student makes in his/her head, or are these connections that the student makes by interacting with his/her environment? I think that the meaning behind the word ‘connections’ would provide greater evidence of the authors’ view of student learning. Although this article was about the development of wonderment questions, the major focus was on how the response to wonderment questions can influence future classroom discourse. Because of this, I found that the authors followed a more situative perspective. However, I wonder if it is possible to follow a cognitive perspective when looking at the development of wonderment questions, but follow a situative perspective when looking at how the questions influence classroom discourse.

I found the Windschitl, Thompson, Braaten, and Stroupe (2012) article to be extremely interesting because they changed their framework based on the information gathered through their study. The initial framework that the authors described seemed to be an inclusion of both the cognitive perspective and the situative perspective. Windschitl et al. state that their initial framework “drew upon the sociocultural hypothesis that a tool operates between an individual and the accomplishment of a complex task that might otherwise be out of reach without some form of assistance” (p. 887). I found the inclusion of the individual to be an indicator of the cognitive perspective and the inclusion of the tool to be similar to the work of Vygotsky (1917). However, I had a hard time determining whether or not Windschitl et al. were using the term ‘tool’ in the same way as Vygotsky. After analyzing their work, the authors changed their framework to follow a more situative perspective because they found that communities developed from engaging with the specific tools. This is the first article that addressed a change in framework, but I wonder if this is a common occurrence.

The science education field has always been so fascinating to me because there are so many terms that educators and researchers cannot agree upon. In addition to inquiry and learning progressions, Lampert (2009) suggests that there is controversy over the meaning of practices when referring to the practices that teachers should utilize in a classroom. I think that the most interesting component of Lampert’s article was the questions that she posed to her readers. Many of her questions related to the potential need for social interactions in order for a teacher to learn appropriate teaching practices. For example, Lampert asks, “Would one need to practice ‘on’ real students?” (p. 7). I felt like I was reading between the lines, but I inferred that her argument was for the inclusion of social interactions between pre-service teachers and real students during teacher education programs. Between this and the questions that she posed, I believe that Lampert follows the situative perspective.


15
Nov 13

Week 12: Science Practice & Teaching – Kate

The three articles I chose to read this week were Windschitl et al. (2012), Manz (2012), and Aguiar, Mortimer, and Scott (2009).

Windschitl et al. (2012) proposed a set of core instructional practices and tools for science teachers to use in their classrooms. They emphasized the researchers’ positive intentions in developing an adaptation of the model-based inquiry (MBI) framework to help improve pre-service and novice teachers’ instruction, for example through high-leverage instructional practices (HLPs) modeled through ambitious teaching. This framework focused on “Big Ideas” and three types of discourse. However, the following statement regarding ambitious teaching reads as utopian and unobtainable: “–to get students of all racial, ethnic, class, and gender categories to engage deeply with science” (p. 880). It would be great if we could find a solution to meet the needs of every student, but I worry that if one set of instructions is implemented, the changes of time will quickly make the framework obsolete. Even though the authors state their framework is adaptable and fluid, I am curious how much would actually work. Also, it seems there needs to be another shift in pedagogical practices. They claim teachers are focusing too much on activity rather than questioning and knowledge building: “Large scale observational studies have documented that in American classrooms there is a focus on activity rather than sense making and that questioning in general is among the weakest elements of instruction. Only a small fraction of lessons take into account students’ prior knowledge and teachers seldom press for explanations” (p. 881). Basically, I feel that because of testing, state standards, time constraints, and oversized classrooms, teachers are instructing for the average student, leaving less flexibility and time to help the outliers. It seems to me this plan would produce similar results. What works for one teacher doesn’t always work for another. However, this plan seems to say it will, even as the first set of criteria for HLPs encourages diverse teaching methods.
I did notice that feature three offers plans for organizing science instruction around students’ everyday language, experiences, and knowledge, but I still believe it is not that simple.

Manz (2012) discusses the importance of intertwining practice with knowledge in teaching. Her article emphasizes third grade students gaining conceptual power within the stages of co-constructing modeling practice and ecological knowledge in a yearlong plant reproduction investigation. This activity helped students learn within situated social interactions. Her framework includes model making, making claims in the model system, and understanding the entailments of a model. Creating a model collaboratively can help students build scientific knowledge through investigation, question forming, and meaning-making. One question I had concerned her methodology. What is a design study? She briefly discussed creating her framework prior to implementing the research project. She also claims adaptability once implementation begins, and then, after the research is conducted, retrospective analysis is used to evaluate her plan. I am not exactly sure what this meant.

Aguiar, Mortimer, and Scott (2009) analyzed classroom interactions surrounding wonderment questions, followed by discourse, between the students and teacher. Depending on the discussion, the teacher readjusted their response to help build student knowledge and meaning-making. It was interesting to read about the three types of questions and the four classes of communicative approach. However, I did not find anything on the length of question asking, such as was discussed in the article about Sister Gertrude Hennessy. One interesting part that also reminded me of the Sister Gertrude article was that one class had been together since the first year of compulsory education. I liked reading this article. It was interesting to learn about the different types of communication between students and teacher and how teacher responses can be categorized in the flow of class discourse.


14
Nov 13

Science Practice & Teaching – Cori

This week’s readings were interesting because they covered a great range of material focused on science practices and teaching science practices. Aguiar, Mortimer, and Scott (2010) examined classroom interactions initiated by students’ wonderment questions. I think an important point that relates back to conceptual change theory is that the students are trying to connect new scientific concepts with their own interests, experiences, and knowledge by asking questions. This is a very cognitive concept because the students are assimilating their previous experiences and knowledge into the new concepts presented.

Crawford (2007) examined the knowledge, beliefs, and efforts of some prospective teachers to enact teaching science as inquiry during a one-year high school fieldwork experience. An important finding from this article stresses that teachers have different sets of beliefs that guide their instructional decisions. Their personal beliefs and anecdotes heavily influence their teaching and views of science. This has important implications for teacher training interventions. I would be interested in hearing what the class has to say about teacher training because I am very uninformed about these types of training.

Manz (2012) investigated the co-construction of modeling practice and ecological knowledge by following the development of one disciplinary construct (plant reproduction) through third graders’ yearlong investigation of a wild backyard area. It was interesting how the author pushed for the idea that elementary school students should be supported to participate in “relatively complex scientific practices to develop conceptual understanding, rather than first being taught the concepts in isolation of practice or being initiated only into simple forms of practice such as observation or categorization” (p. 1099). This is interesting because traditional methods appear to give a lecture/instruction on a topic and then do an activity to reinforce the material. However, Manz is saying the opposite: do the complex activity/practice first, followed by inquiry, discourse, and instruction.

Wu and Krajcik (2006) used a case study and found that constructing and interpreting graphs and tables provided students with opportunities to discuss, review, and clarify questions about concepts and the inquiry process. Thus, engaging in inscriptional practices can benefit students by helping them construct understandings about certain concepts and inquiry. Similarly, Windschitl and colleagues (2012) stressed that tools are another way to help with ambitious teaching.

All of these science practice articles have important implications for science practice teaching. Grossman et al. (2009) wanted to develop a framework to describe and analyze the teaching of practice in formal education programs, specifically preparation for relational practices. Three key concepts for understanding the pedagogies of practice in formal education included representations, decomposition, and approximations of practice. Decomposition reminds me of the opposite of a Gestalt approach to learning. Decomposition involves breaking down practice into constituent parts for the purposes of teaching and learning. Here, the whole is not greater than the sum of its parts.

Overall, it seems a major theme in the readings this week was that just teaching theory will not result in a capacity for thoughtful and productive interaction or “adaptive expertise” (Lampert, 2009).  Actual practice, experience, and interactions with teaching and students will do this instead.


11
Nov 13

Learning Progressions – Ryan

I decided to start with the Shavelson (2012) article so that I knew what the complaints about learning progressions are from the beginning.  I also started there because I want to like learning progressions, and the last argument you read is often the most convincing, so I didn’t want to end on a negative note.  That being said, the Shavelson article was really not all that negative.  He seemed to just be giving reasonable, cautious advice for a burgeoning area of research that has the potential to get too far ahead of itself.  I was a bit confused about why he believed both roads he described deserved further investigation.  My interpretation of the roads was that the Curriculum and Instruction road involved researchers determining how they think knowledge progresses and the Cognition and Instruction road involved how students’ knowledge actually progresses.  It would seem to me that only the second one would be useful, although my interpretation could be wrong.

I thought the Gunckel (2012) article was very interesting.  It seems like Discourse is another learning perspective that combines cognitive and situative, and it seems like a promising one.  I like the idea that in order to be part of a discourse community, you have to engage in its practices, but you also have to have the required knowledge.  It would be inappropriate to yell and cheer at a golf match (incorrect practice), but it would also be inappropriate to call a hole-in-one a touchdown (incorrect knowledge).  I was also intrigued by the “theory of the world” inherently imparted by language.  One concern I had about this article was that they want students to be environmentally educated citizens who can make informed decisions, but just because students demonstrate that they are at the upper anchor of the learning progression in school, that does not necessarily mean they will apply those practices and knowledge, engage in that discourse community, when outside of school.  Using that knowledge to answer questions posed to them on an assessment is one thing, but who’s to say they would use that knowledge to make decisions about environmental policy?  If this seems like a strange concern, consider the fact that people who know proper grammar and can use it when writing papers or in professional meetings often do not use it around their friends.  They are able to engage in multiple levels of discourse.  Just because a higher one has been achieved does not mean they can’t use a lower one in certain situations.

Gotwals (2013) acknowledges that “learning does not customarily follow linear, sequential steps of development” and “the construct of the learning progression is an idealized sequence rather than a stepwise path.”  I would agree with this, although Shavelson seems to think that if learning is not actually linear, then linear learning progressions are of little to no use.  I’m not sure yet whether or not I agree with this.  My initial instinct is to disagree, because there are plenty of models out there that are not actually accurate depictions of the world, but are useful nonetheless.


10
Nov 13

Learning Progressions – KeriAnn

Hi group! I’m sorry that this post is so late. I was traveling for a conference this weekend and did not have much time to complete the readings and write the reflection. I hope you had a good weekend!

This week’s readings were of particular interest to me because I have been working on a learning progression research project for the past year and a half. I had previously read all but one of the readings selected for this week, so I chose to read the one that I was not familiar with and the two articles that I read when I first joined the learning progression research team. When reading those two articles, I compared my original notes to the notes that I completed when reading the articles this week. This opened my eyes to all of the things that I either misunderstood or missed the first time that I had read them. This comparison also demonstrated that my understanding of learning progressions has advanced over the last year and a half.

The first learning progression article that I had ever read was the work provided by Gunckel et al. (2012). When I looked at my initial notes, I found that I had only focused on information that provided a description or explanation of learning progressions. Reading through the article with my new lens allowed me to find information that I had missed the first time. Although I acknowledged the inclusion of discourse the first time that I had read this article, I did not realize that this was a way of including a specific perspective of learning. Most, if not all, learning progressions follow the cognitive perspective. Therefore, I find it extremely interesting that Gunckel et al. (2012) found a way to include the situative perspective in their learning progressions. I found the description of primary and secondary discourse to be very interesting, but I felt like there were some missing components in the description. Gunckel et al. (2012) state that students “begin life with the primary discourse of their home communities” (p. 43). Later, they state that, “students’ primary discourses define the lower end of [their] learning progression frameworks” (p. 43). Although this may be true of most students, I wonder if students who come from scientific households may use secondary discourse within their home communities. In addition to this, I also wondered whether or not students could use primary discourse while maintaining scientifically accurate ideas. Lastly, I think it would have been helpful if the authors had included some description or hypothesis of how students move from using primary discourse to using secondary discourse.

Although Gunckel et al. (2012) did not focus on the development of assessment items, I found the Alonzo and Steedle (2009) article to be somewhat similar to their work. While developing assessment items, Alonzo and Steedle (2009) had to attend to students’ use of language to determine what level of understanding they held. However, they did not describe this as part of the situative perspective; instead, they interpreted students’ language as an expression of their individual level of understanding. I found it interesting that the authors grounded the development of their assessment items in students’ language. While interviewing students for our learning progression development, we have run into the same problems as Alonzo and Steedle (2009): sometimes you think a question is worded in a way that all students will understand, but you quickly find out that this is not the case. For example, in our interview protocol we asked students, “Is the Solar System flat?” Responses to this question were not useful, so we added a probe: “Is the Solar System flat like racecars going around a track, or more like fireflies in a jar?” After additional interviews, we found that students were still not providing useful responses. We finally changed our question to, “Are the orbits in the Solar System flat? Do all of the planets orbit around the Sun on the same level?” This shows exactly how iterative the development of a learning progression and all of its components truly is.

When I first read the Shavelson (2009) article last year, I agreed with many of his concerns about the use of learning progressions. Working on a learning progression project over the last year has allowed me to understand those concerns even better. Shavelson (2009) describes a curriculum that he used to address sinking and floating in the middle school grade band. He states that students still held naïve conceptions even though the curriculum followed “scientists’ evolving explanations of sinking and floating” (Shavelson, 2009, p. 2). When I read this the first time, it seemed like evidence that learning progressions may not work. The difference, however, is that the curriculum was developed based on scientists’ evolving explanations, not students’ evolving explanations. This makes me wonder whether students’ level of understanding would have changed if the curriculum had been built around the progression of students’ understanding. Shavelson (2009) later states that one of his greatest concerns is the hypothetical nature of learning progressions. However, all research needs to start somewhere. Learning progressions are currently hypothetical because longitudinal studies have not been completed. Once initial, hypothetical learning progressions have been developed, researchers can apply these frameworks in longitudinal studies, allowing the progressions to move from hypothetical to empirical. I think one of the most important points Shavelson (2009) makes is about whether students “actually grow their knowledge” in a certain way (p. 7). The best way to determine this would be to assess students and map their level of understanding onto a working learning progression multiple times. In many cases, learning progression studies assess students pre-instruction and post-instruction, but do not assess students during instruction. Assessing students during instruction may provide further information about how their knowledge changes, which could then be used to refine the middle levels of a learning progression.


09
Nov 13

Learning Progressions – Julianne

“certain things such as the foundation and then walls must come first to provide structural support for the windows and roof—yet within those constraints there is some flexibility as well and multiple ways to build a house” (National Research Council, 2007: 247)

Not being well versed in what defines learning progressions, I started this week’s reading with TSS Chapter 8 (National Research Council, 2007: 213–250). What struck me in particular about the chapter was the hopeful but cautionary tone of the writing. Few will disagree that the mile-wide, inch-deep approach to science curricula is undesirable; clearly there must be a better way to teach and learn science. Is incorporating learning progressions into curriculum design and assessment the answer? Curriculum designers and educational researchers are cautioned to remain aware of the hypothetical nature of learning progressions and their dependency upon the “teachers’ knowledge and the effectiveness of their instructional practices” (p. 213) as well as students’ knowledge, skills, and understanding of “core ideas across the disciplines of science” (p. 213). Metz’s (2009) critique echoes the unproven nature of the concept when she states that “at this point the research literature is inadequate to design optimal learning progressions…with any degree of confidence. As we have little knowledge of how the students’ knowledge might develop under other conditions, the pathway we construct becomes in part a product of our imagination” (p. 19).

The Metz (2009) review was, I thought, well crafted and quite interesting. In brief, she seemed to be saying that kids need to be given more credit, and that branding their supposed abilities by their age (or stage) underestimates their “reasoning competencies” (p. 9) and is not supported by the cognitive developmental literature. With instructional scaffolding à la Vygotsky’s and Ann Brown’s work, and some constraints based on children’s level of maturation and development, Metz emphasized that what children can learn is a function of what they know and how that relates to the instructional opportunities they are afforded. Much like TSS, she sees potential value in learning progressions but includes caveats as well, such as the enormity of testing the limitations and power of learning progressions. Also, because learning progressions are hypothesized to operate over the many years a child spends in elementary and middle school, it seems imperative that instruction throughout those years follow the same line of thinking: “Any teacher who teaches as they have taught before essentially disrupts the progression. The implementation of a learning progression is inherently an enterprise of Coordination” (Metz, 2009: 16).

Although I read the Gunckel, Mohan, Covitt, & Anderson (2012) article before reading Metz (2009), I kept thinking back to Gunckel et al. while reading the Metz article. Much of what I felt Gunckel et al. (2012) were trying to express echoed Metz’s critique. For example: “we simply do not know what might be possible under more optimal instructional conditions, particularly under conditions of multi-year learning progressions and corresponding instructional support that more fully capitalize on children’s capabilities” (Metz, 2009: 9). Gunckel et al. looked at what was currently happening in classrooms. They wanted to “capture the current reality of schools” (p. 67) through their learning progression framework and assessments, and they were attempting to improve student performance without making “whole scale changes in curricula” (p. 67). In fact, their Process Tool for classroom use took the form of a large poster (36” x 48”), student activity pages, and a prepared PowerPoint presentation for the teacher to use. That’s pretty mainstream. “We recognized the link between instruction and the learning progression framework was not tied to a specific instructional sequence but rather reflected a teacher’s general approach to conveying the importance of principle-based reasoning in a variety of contexts” (Gunckel, Mohan, Covitt, & Anderson, 2012: 67). Perhaps realistically, the authors acknowledged that they knew little about how much professional development would be required to turn teachers into “active users” (p. 71) of the proposed learning progression system. They stated that even conservative changes to instructional style represent “substantial shifts in pedagogy” (p. 71), and that formative assessments would be required to highlight any changes in student performance.

In their conclusions, Gunckel et al. (2012) note that language and the use of language (the Discourse perspective) can shape and represent a student’s reasoning. Further, they claim that focusing on language allows them to develop a framework that covers both the socio-cultural (situative) and cognitive aspects of learning. I really would have liked more explanation of that line of reasoning.

The empirical work of Gotwals and Songer (2013) seemed to be trying to prove too many things at once. I found the article frustrating, primarily because I sensed that the cautionary warnings from the earlier papers (Metz, 2009; National Research Council, 2007) had been dismissed. Secondarily, I wondered if the outcomes they reported for the two students, Tatiyana and Charity, were inadvertently skewed by the researchers’ expectations. Tatiyana had been classified as a “medium to high level student” with a “slightly above average” ability level (p. 15). Charity, on the other hand, was classified as a “lower level student with a lot of potential” with a below-average ability level (Gotwals & Songer, 2013: 15). As the study continued and the students were interviewed, Tatiyana’s responses “when prompted by the interviewer illustrate[d] that she understood more than we would have expected” (Gotwals & Songer, 2013: 16). Charity’s interview responses and think-aloud exercises did not elicit that kind of response. Thinking back to Gunckel et al.’s (2012) contention that language and the use of language can represent a student’s reasoning, I wondered whether perhaps Charity did not have the vocabulary to express herself in terms deemed high-value by the assessors. Or, having been labeled a “lower level student,” she may have come to believe it and saw no point in rising above that expectation. Did both students simply rise to the level they were expected to?

Gotwals & Songer (2013) concluded that it was easier for students to make claims than to reason about those claims. While this may be true, the authors also said that claims were “an answer to a scientific question” (p. 22) posed to the students, that claims were the “component of explanation that was most likely to be answered correctly” (p. 22), and that claims were the least problematic “in terms of linking student responses to our learning progressions for evidence-based explanations” (p. 23). Were the authors stuck in the tradition of treating “right” answers as evidence of knowledge gains? Clearly, being a researcher can be difficult; it is tough to be impartial and not introduce bias or preconceptions into research.

“While we know quite a bit about what young children understand in some content areas and contexts, we also recognize that learning does not always happen in a linear, stepwise fashion, thus making the articulation and empirical backing of one idealized learning path difficult to generate. It is possible that even the same learner might proceed differently through a knowledge development path in different contexts, so we recognize that while it might be possible to identify target upper and lower anchors of a progression, articulation of the intermediate learning points might be better described as ‘the messy middle’” (Gotwals & Songer, 2013: 3).


08
Nov 13

Week 12: Learning Progressions – Kate

Sometimes I think education is desperate for a catch-all theory on how to reach all students. I believe that is what Shavelson (2012) was trying to warn against in his article. In education, the latest hot topic is learning progressions. While I can see the value in researching this subject, I agree with Shavelson that imposing one type of learning trajectory for empirical studies and data collection can be problematic. Still, after reading Gunckel et al. (2012), Gotwals and Songer (2013), and chapter 8 from Taking Science to School: Learning and Teaching Science in Grades K-8 (2007), learning progressions definitely seem worthy of further investigation and research within the field of education.

Gunckel et al. (2012) discuss learning progressions through their Environmental Literacy Project (ELP). This project is unique because it connects scientific issues to real problems, unlike much K-12 curriculum, where disconnected facts and processes are often taught (also addressed in TSS). The authors describe how the ELP is divided into four strands. I was a little confused about these strands because the authors did not define or mention them much again. Did they mean the four research groups, or the levels? I am not sure if I missed something there. These strands (whatever or whoever they are) include three research groups working to develop learning progressions connecting three key aspects of socio-ecological systems: water, carbon, and biodiversity. The literature does not discuss how these were chosen, such as by local, regional, or general-interest concerns, and I was wondering how and why they chose these key aspects. The fourth research group examines students’ decision-making practices in citizenship roles. While I was reading about the goals, parameters, challenges, and limitations of this project, it was also helpful to read about alternative pathways. These acknowledge that students do not all learn the same way or at the same pace. There are many environmental influences that need to be factored into learning…a situative example, perhaps!?

The authors focus on two core challenges in environmental science literacy. The first is defining what progresses in a learning progression, such as students’ Discourses (primary/force-dynamic reasoning and secondary/scientific reasoning), practices, and knowledge. I found the four levels of achievement and the Lower and Upper Anchors interesting, as the authors connected examples to the stages and progression viewpoints.

The second challenge is defining pathways and linking them to instruction. This includes alternative pathways transitioning between the Lower and Upper Anchors (four levels) along two different paths: Structure-First and Principles-First. Another question I had was why “Discourse” is capitalized; I have never seen it written that way. Also, while I understand that the Principles-First pathway may have been the most successful in the study, I believe the knowledge from the Structure-First pathway helps students acquire the language they need to explain the process. I do not see how the two can be separated.

I chose Gotwals and Songer (2013) for my empirical-studies article. Their study examines 6th grade students’ abilities to fuse core ecological ideas with evidence-based explanations. They build their framework around linking core disciplinary ideas with practices and assessing fused knowledge in learning progressions. The common thread I found among these articles is an emphasis on teaching students to become scientifically literate citizens through scaffolded instruction and levels of a learning progression. This study provides information on developing and evaluating resources to assess fused knowledge with the practice of explanation. For example, the article discusses designing assessments that measure learning progressions at multiple points. I feel this is important for helping researchers pinpoint where problems in fused knowledge and learning occur, and how instruction should change for the individual (the TSS chapter makes this point as well). Learning progressions should inform assessment rather than being arbitrary ways of gauging knowledge. I really enjoyed how they take the individual into account, stating that learners gather knowledge and progress in many ways. Scaffolding was also discussed both positively and negatively in developing evidence-based explanations around core ideas. For example, scaffolding played a significant role in helping students provide appropriate and sufficient evidence. However, in the students’ written responses, scaffolding did not influence the difficulty parameter of the items or the think-aloud activity. Sometimes I assume scaffolding should be included in every aspect of education, so it was beneficial to read about when it does not work.

