Abstract

Collecting data on instructional practices is an important step in planning and enacting meaningful initiatives to improve undergraduate science instruction. Self-report survey instruments are among the most common tools used for collecting such data. This paper presents an instrument- and item-level analysis of available self-report instruments for surveying postsecondary instructional practices. We qualitatively analyzed the instruments to document their features and systematically sorted their items into distinct categories based on their content. The paper provides a detailed description and evaluation of the instruments, identifies gaps in the literature, and offers suggestions for appropriate instrument selection, use, and development based on these findings. The 12 instruments we analyzed use a variety of measurement and development approaches. There are two primary instrument types: those intended for all postsecondary instructors and those intended for instructors in a specific STEM discipline. The instruments intended for all instructors often address teaching alongside other aspects of faculty work. The number of teaching practice items and the response scales varied widely across instruments. Most teaching practice items referred to the format of in-class instruction (54%), such as group work or problem solving. Another substantial share of items referred to assessment practices (35%), frequently focusing on the specific types of summative assessment items used. The recent interest in describing teaching practices has led to the development of a diverse set of self-report instruments. Many instruments, however, lack an audit trail of their development, including the rationale for response scales; whole-instrument and construct reliability values; and face, construct, and content validity measures. Future researchers should consider building on these existing instruments to address some of their current weaknesses. In addition, important aspects of instruction are not currently described in any of the available instruments, including laboratory-based instruction, hybrid and online instructional environments, and teaching with elements of universal design.

Highlights

  • Collecting data on instructional practices is an important step in planning and enacting meaningful initiatives to improve undergraduate science instruction

  • RQ1: What is the nature of the instruments that elicit self-reports of postsecondary teaching practices? Almost all of the instruments were developed out of a growing interest in improving undergraduate instruction at a local and/or national scale

  • RQ2: What teaching practices do the instruments elicit? Across the instruments in our sample, the majority devoted the largest number of their items to instructional format (BEFS, Henderson & Dancy Physics Faculty Survey (HDPFS), OCES) or to a combination of instructional format and assessment (FSSE, Postsecondary Instructional Practices Survey (PIPS), Statistics Teaching Inventory (STI), STEP, SUCCEED, Teaching Practices Inventory (TPI))


Introduction

Collecting data on instructional practices is an important step in planning and enacting meaningful initiatives to improve undergraduate science instruction. Self-report survey instruments are among the most common tools used for collecting data on instructional practices, yet while such surveys are acknowledged to be useful for measuring teaching practices, there has been little systematic work characterizing the available instruments. This paper presents an instrument- and item-level analysis of available self-report instruments for surveying postsecondary instructional practices. An earlier report, the product of a 3-day workshop to develop shared language and tools by examining current systematic efforts to improve undergraduate STEM education, provides an overview of available instruments; however, it does not examine the design and development of the surveys, nor does it analyze the content and structure of survey items. We report the estimated time to complete each instrument in its entirety (which may include items other than those related to teaching practice), since the consistency and number of response scales may add to the administrative burden of an instrument.

