Illuminating Teacher Learning of Technology-Enhanced Formative Assessment

A talk given to the American Association of Physics Teachers Summer 2006 National Meeting in Syracuse, NY: Contributed Talk DH05, Tuesday, July 25.

(If you click on the title links, the relevant “slide” should open in a new browser window. Subsequent slides should open in the same window, so if you resize the window to something about 1024 x 768 and drag it to the side of the narrative in this window, you should somewhat recreate the effect of the talk. Except for the part about me being rattled by technical problems and talking way too fast. Hopefully, your display is working better than my projector did during the talk. The slides are HTML, not PDF or images, so fonts and layout and such may vary.)


Title & Authors

Good evening.

This is the kind of talk where I tell you about a current research project, because it’s good for professionals to know what their colleagues are up to. It’s also the kind of talk where I slide in my own pedagogical opinions, because, well, I want to change the world.

If you hear anything brilliant in my talk, credit probably belongs to my colleagues in the UMass Physics Education Research Group. They’ve been thinking about this for longer than I have.

Classroom Response Systems

How many of you know what a “classroom response system” is? Also known as an “audience response system”, “voting machine”, “polling system”, or “clicker system”? [Probe audience and adjust talk as appropriate.]

Briefly, for those of you who don’t: A classroom response system provides a supplemental, technology-mediated channel of communication between instructor and students. [1]

It is a combination of hardware and software that allows an instructor to:

* Present a question to students in class;
* Have them submit answers;
* Immediately aggregate the responses; and
* Share the results with the whole class, usually as a histogram.
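
To make the four-step loop above concrete for the programmers in the room: the aggregation step is conceptually tiny. Here is a minimal, hypothetical sketch in Python; the function name and details are my own illustration, not the API of any actual response system.

```python
# Hypothetical illustration only, not the API of any real response system:
# tally one submitted answer per student and format the class histogram.
from collections import Counter

def tally(responses, choices="ABCDE"):
    """Aggregate one answer per student into a text histogram."""
    counts = Counter(responses)
    total = len(responses) or 1  # avoid dividing by zero on an empty class
    lines = []
    for choice in choices:
        n = counts.get(choice, 0)
        bar = "#" * round(20 * n / total)  # bars scaled to 20 characters
        lines.append(f"{choice}: {bar:<20} {n}")
    return "\n".join(lines)

# Example: thirty students vote on a five-choice question.
print(tally(["B"] * 14 + ["C"] * 9 + ["A"] * 4 + ["D"] * 3))
```

Real systems collect the votes over dedicated hardware and draw a graphical chart, but the data flow is the same: collect, count, display.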

[Pause.] Are classroom response systems effective?

[Pause.] Please raise your hand if you think they’re educationally effective… Who thinks they’re not so hot?… Who’s still sitting on the fence?…

That was a trick question.

Assessing Instructional Technology

This session is about “assessing the educational effectiveness of technology”, but technology doesn’t have educational effectiveness. At least not by itself.

What is the culinary effectiveness of a wok?

[Pause.] If you know how to use one and you’re trying to make a nice stir-fry, it’s quite effective. But if you have no clue what you’re doing in a kitchen, or you’re trying to make a quiche, it’s pretty darn useless.

Any evaluation of instructional technology must ask four questions:

* For what purpose is the technology being applied?
* How is it being applied?
* How well is the user doing it?
* How well does the technology enable or enhance the attempt?

Only the fourth is about the technology itself.

[Pause.] That perspective motivates the project I’ll be describing.

Uses for Classroom Response Systems

Classroom response systems can be tremendously powerful instructional tools, but they’re not a silver bullet. We’ve seen them used completely ineffectually, horribly abused, or used well for ends we don’t see much merit in.

People use classroom response systems for many different goals, including:

* Taking attendance
* Administering quizzes
* Provoking engagement [2]
* Checking for progress
* Promoting knowledge diffusion [3]

All of these typically involve sprinkling response system questions throughout “normal” instruction. The approach we at UMPERG have developed is, I think, considerably more radical. [4]

The Question Cycle

The core idea is that we structure class time around an interactive question cycle [5], iterated three times per hour, more or less. The question cycle serves as the primary vehicle, the primary engine, for instruction. It’s not an add-in; it’s the meat of the class.

We use the question cycle to:

* Reveal and explore students’ thinking,
* Introduce new ideas,
* Refine and extend students’ understanding, and
* Develop students’ analysis and problem solving skills.

A classroom response system is not strictly necessary for this, but it sure helps.

Classroom Dynamic

There’s a lot more to our approach than the question cycle, but covering it would require a few more ten-minute talks.

What do we call it?

We’ve called the approach by various names at various times, depending on what aspect we’re focusing on.

* Question Driven Instruction (QDI)
* Technology-Enhanced Formative Assessment (TEFA)
* Assessing to Learn (A2L)
* Agile Teaching (AT)

I’ll use the first one, “Question Driven Instruction” or “QDI”, tonight.

So back to the project. We want to make a really strong case for the efficacy of a classroom response system coupled with the QDI pedagogical approach. That means doing a large-population study of student learning impacts, a so-called “scaling study”, rigorous enough for the What Works Clearinghouse.

But we can’t evaluate learning impacts fairly without a cadre of teachers who are trying to do QDI, and doing it competently.

Scaling Study Preliminaries

So, we need two things first.

* We need a professional development program that can efficiently and reliably move teachers to a state of QDI competence.
* And we need measures of implementation fidelity that tell us when a teacher is, in fact, doing a reasonable job at it.

If we don’t have the first, we won’t have any QDI to study. If we don’t have the second, we won’t know whether negative results — that is, poor learning impacts — indicate that QDI doesn’t work, or that it just isn’t happening.

So the project we proposed to the National Science Foundation, and that we’re currently working on, is a preliminary study that sets up a scaling study.

TLT Project Goals

Our project is called Teacher Learning of Technology-Enhanced Formative Assessment. We are studying teacher learning, not student learning. There will be no measuring of student learning gains.

The project has three general goals:

1. To better understand how teachers get from novice to expert in the use of a classroom response system and QDI.
2. To refine our methodology for teaching QDI to teachers, and “package” it so that others can do the professional development.
3. To prepare the measures, instrumentation, design, and general know-how for a “scaling study” on student learning impacts.

TLT Design: Professional Development

Our plan is to work with the entire science department at a high school, so that teachers can learn collaboratively and support each other, and so that students get a consistent learning experience from class to class.

During the first treatment year, we’ll conduct an intensive two-semester professional development course. The course will focus on the skills that go into successful QDI, including:

* Using classroom response system technology,
* Designing effective questions,
* Navigating the question cycle,
* Moderating classroom discourse, and
* Integrating QDI with curriculum goals and external constraints.

[Pause.] Change doesn’t happen overnight. For all three treatment years, we’ll facilitate a collaborative action research program for the teachers. This will:

* Support the ongoing evolution of their teaching practice, and
* Provide a forum where the teachers can set the agenda and come to terms with QDI.

TLT Design: Data Acquisition

We’ll collect longitudinal data on teacher change over the three treatment years, plus baseline measurements. We’ll gather data from classroom observations, and from interviews and surveys, in order to track changes in:

* What teachers do in the classroom;
* How they approach lesson planning;
* How they perceive knowledge, learning, and teaching;
* What aspects of teaching occupy their attention most;
* What difficulties they wrestle with;
* What supports and assistance they find helpful;
* How they perceive their own QDI skills; and
* How their students perceive the classroom environment.

We’re developing most of the research instruments ourselves, since existing instruments don’t really address the variables we want to track. We are, however, incorporating pieces of established instruments to aid comparison with other research.

TLT Timeline

We’ll be working with two schools, staggered by one year so that we’re taking baseline data in school 2 while teaching the PD course in school 1.

Our first school is fairly small, with eleven participating teachers. This includes most of the high school science faculty, as well as some of the math and junior high science teachers. We’re planning on a larger cohort for school 2, with over 20 participants.

Where are we now? We’re just wrapping up baseline data collection for school 1. Next week, we’ll kick off the PD course with a three-day workshop. We don’t have any results yet, but come back next year for some fascinating preliminary findings!

Links & Credits

If you want to learn more about the project, come chat with me and see our poster during tomorrow morning’s poster session.

Or, visit our web page about the project. (It’s a little sketchy right now, but I’ll be augmenting it soon.)

If you fell asleep in the middle of this talk and want to see what you missed, I’ll post the narrative and slides to my personal weblog, hopefully tomorrow.

And by the way, our graduate student Colin Fredericks is giving a talk at ten tonight, right here, about a different but also interesting project. So please stick around a little longer.

Thanks for your time. Any questions?


1 Ian D. Beatty (2004). Transforming Student Learning with Classroom Communication Systems. Educause Center for Applied Research (ECAR) Research Bulletin ERB0403.

2 Robert J. Dufresne, William J. Gerace, William J. Leonard, Jose P. Mestre, Laura Wenk (1996). Classtalk: A Classroom Communication System for Active Learning. Journal of Computing in Higher Education 7, 3-47.

3 Eric Mazur (1997). Peer Instruction: A User’s Manual (Prentice Hall, Upper Saddle River, NJ).

4 Ian D. Beatty, William J. Leonard, William J. Gerace, Robert J. Dufresne (2006). Question Driven Instruction: Teaching science (well) with an audience response system. In David A. Banks (Ed.), Audience Response Systems in Higher Education: Applications and Cases (Idea Group Inc., Hershey, PA).

5 Robert J. Dufresne, William J. Gerace, Jose P. Mestre, William J. Leonard (2000). ASK-IT/A2L: Assessing Student Knowledge with Instructional Technology. UMass Scientific Reasoning Research Institute technical report Dufresne-2000ask.

