Testing in Science
Fu, Raizen, and Shavelson (2009) argue that the National Assessment of Educational Progress (NAEP) is leading by example in what science assessment should entail. Because previous NAEP assessments relied on “conceptually disconnected multiple-choice and short-answer items … not ideal for assessing such practices as relating multiple concepts, explaining scientific phenomena, conducting a physical investigation, and manipulating variables in a dynamic simulation,” the 2009 framework introduced new item types.
New item types recommended or implemented include concept maps, item clusters (analyzing the responses to several related multiple-choice questions in conjunction with one another to gain a fuller picture of student understanding), “Predict-Observe-Explain” items, hands-on performance tasks (HOTs), and interactive computer tasks (ICTs). ICTs bring the other item types into computerized testing, solving logistical issues such as grading concept maps and supplying materials for HOTs, but they are “the least psychometrically studied of the item types” (Fu, Raizen, & Shavelson, 2009).
Self Concepts in Science
Numerous analyses of Trends in International Mathematics and Science Study (TIMSS) data have found that students who have a positive self-concept about their science ability score higher on science tests (House, 2008; Mettas, Karmiotis, & Christoforou, 2006; Shen & Tam, 2008). To me, it is unclear whether students develop a more positive self-concept as a result of their higher proficiency, or whether simply believing they are proficient in science leads to higher proficiency; I suspect the former. Whichever the case, it is clear that students’ self-concepts about their ability are linked to their achievement.
Teaching Strategies in Science
Using TIMSS data, House (2008) found that cooperative learning activities and active learning strategies positively affect science achievement. Accordingly, these strategies should be implemented in science classrooms.
Number of Concepts Taught
Bybee (2007) shares that TIMSS data have brought to light that U.S. teachers report addressing significantly more science topics during a school year than teachers in Japan and Germany (about 60-70 topics versus 6-8, respectively). That is roughly ten times as many concepts! I think this may be evidence that science instruction in the United States needs to focus on more depth and less breadth.
Takeaways for Science Teachers
The following points can be taken away from the NAEP and TIMSS:
- Add to your variety of assessment methods. Have you used “Predict-Observe-Explain” or concept maps with your students?
- Improve student confidence in science by providing specific positive feedback.
- Implement cooperative learning activities.
- Focus on depth of understanding. Analyze the standards carefully and cut any content that does not align. You may be surprised at the time you are able to free up!
Fu, A., Raizen, S., & Shavelson, R. (2009). The Nation’s Report Card: A Vision of Large-Scale Science Assessment. Science, 326(5960), 1637-1638. Retrieved from Education Research Complete database.
House, J. (2008). Effects of Classroom Instructional Strategies and Self-beliefs on Science Achievement of Elementary-school Students in Japan: Results from the TIMSS 2003 Assessment. Education, 129(2), 259-266. Retrieved from Education Research Complete database.
Mettas, A., Karmiotis, I., & Christoforou, P. (2006). Relationship Between Students’ Self-Beliefs and Attitudes on Science Achievements in Cyprus: Findings from the Third International Mathematics and Science Study (TIMSS). Eurasia Journal of Mathematics, Science & Technology Education, 2(1), 41-52. Retrieved from Education Research Complete database.
Shen, C., & Tam, H. (2008). The paradoxical relationship between student achievement and self-perception: a cross-national analysis based on three waves of TIMSS data. Educational Research & Evaluation, 14(1), 87-100. doi:10.1080/13803610801896653.