“Peer-review” of Marzano’s IWB Study Report, Part I

Two of the more well-known brand names in education recently combined forces. Robert J. Marzano (wildly popular consultant/author/speaker) produced a report of a study he conducted of Promethean ActivClassroom (wildly popular interactive whiteboard (IWB) technology).

The report has received lots of publicity; I have seen multiple references to it on Twitter and elsewhere. You can get a copy from Promethean, but only by first providing them with lots of contact information here. I wasn’t willing to make that exchange, but the day after discussing the report on Twitter, Sonny Magana, the Director of Education Strategy at Promethean, Inc., was kind enough to e-mail me a copy of the report.

Marzano’s work has not yet been formally reviewed by any “peers” (at least as far as I can tell).  While I am very critical of the way peer-review is typically conceived and carried out in academia, there is real value in the process.   Therefore, I’m using this space to do just that.  This first post is a bit of an introduction.  In subsequent posts, I’ll address methodological and analytical issues.

In this first post, I’ll try to do two things simultaneously: address a key criticism and establish some semblance of credibility as a reviewer. The report states that it was prepared by the Marzano Research Laboratory for Promethean, Ltd. That undoubtedly means that Promethean funded a study of their own product(s). Such an arrangement, which certainly gives the appearance of a lack of objectivity, is not unprecedented and not even unusual.

I should know; I’ve done plenty of evaluation research as a “third-party, independent” evaluator funded by vendors. For the better part of ten years, I was part of a research team that conducted evaluation research funded by private vendors such as Lightspan [since purchased by Plato Learning], Scholastic, eChalk, Jostens [sic], etc., to study their own products/programs. Based on my experiences, I can state confidently that those sorts of arrangements should be viewed with skepticism and examined critically. I stand by much of the work I did and would defend it against any critique. However, there were certainly instances where the vendor/funding source “influenced” the contents of the final report. More often, the final report was written in a way that would be most palatable to the client.

[NOTE: Of the privately funded evaluation research I was a part of, I’ve only been able to find one report that is publicly available. This report of a large-scale evaluation of Scholastic’s READ 180 (funded by Scholastic) happens to be one by which I swear. There are a number of reasons why this study is credible, but the most important factor is that the main stakeholder was the Council of the Great City Schools and not Scholastic.]

Ultimately, without being present at the initial negotiations between the parties and without being privy to conversations between the researcher(s) and the client, it is hard to know how “objective” or “honest” a research report is when the study is of a product/program and the study is funded by the vendor of said product/program. The best we can do is to (peer-)review these sorts of reports against the standards of educational research. Onward then…

PART II: Research Design
