Moving at the Speed of Creativity by Wesley Fryer

Oversimplifying TxTIP year 1 results

Barbara Cargill serves on the Texas State Board of Education, representing District 8. Among other things, our SBOE is responsible for textbook adoption, standardized test (TAKS) standards, and curriculum standards (TEKS). These are extremely important and influential responsibilities. In her article published January 19, 2006, entitled “A First Year Perspective: Serving on the Texas State Board of Education,” Barbara noted the following:

It is also clear to me that we must first make sure that academic achievement is the top priority for the advancement of technology. The Texas Technology Immersion Pilot (laptop use, often referred to as Texas TIP) is underway in our state.

The first year results in these middle schools are not pointing to increased academic achievement (Evaluation Report as of 11-17-05). This study continues. My efforts will support a sure but steady approach, one in which thorough research is done that shows how students directly benefit from tried and true technology programs and tools.

I am not writing this post to criticize Barbara or the work she is doing for Texans on the SBOE. But I do want to point out how extended, complicated research documents can become distilled into a single sentence: in this case, her statement that “the first year results in these middle schools are not pointing to increased academic achievement.” There are multiple payoffs to technology immersion that are documented in the referenced report. There is so much more to student achievement than what is measured on a standardized assessment!!! High on this list are 21st century literacy skills.

The year 1 eTxTIP report is a 66-page PDF file. As I have noted in the draft of my dissertation lit review, the bottom line of TxTIP (from a state policymaker’s perspective) does not appear to be transforming teaching and learning or helping students and teachers refine 21st century literacy skills on a daily basis. The overarching goals and objectives of TxTIP are to improve student test scores on state measures of achievement. Here is a quotation from my current lit review draft:

This predominant focus on “test score results” over other more nebulous measures of student performance and achievement is clearly articulated in the year 1 eTxTIP report. eTxTIP researchers report:

The ultimate goal of technology immersion is increasing middle school students’ achievement in core academic subjects (English language arts, mathematics, science, and social studies) as measured by the Texas Assessment of Knowledge and Skills (TAKS) (Shapley, Sheehan, Sturges, Caranikas-Walker, & Huntsberger, 2005, p. 59).

Relative measures of technology immersion on school campuses, as well as perceived levels of student self-efficacy and technology proficiency, are also being studied and reported by research agencies like TCER that are evaluating one-to-one projects. These measures and foci, however, are overshadowed by a dominant emphasis on “traditional literacy skills” measured by the high stakes testing required by educational accountability laws.

Barbara is correct in her analysis: the year 1 TxTIP results are NOT positive with regard to the measured impact of technology immersion on student test scores. But how could they be? On some TxTIP campuses, students did not even receive their laptops until April of 2005, after the TAKS tests had already been taken!!! She could, however, have gone further in highlighting these negative year 1 results. Here is another quote from my current lit review draft:

In comparing immersion campuses to control campuses, first year results of the TxTIP project indicate no positive effects of immersion on student attendance rates (significant or otherwise) and no positive effects of immersion on measured student test scores. Since 6th grade students in Texas are not formally assessed on writing skills, this year one analysis included only test results for reading and mathematics. The comparison actually showed that students on control campuses outscored students on “immersed” campuses on reading tests. Researchers reported:

…Students at control schools had higher TAKS passing rates for the reading assessment and made slightly greater gains in reading than students in treatment schools (Shapley et al., 2005, p. 60).

Student performance on the state-mandated mathematics test also showed poorer results for students on “immersed” campuses:

Moreover, students’ scores on the TAKS mathematics assessment are notably lower than for reading. In 2005, only 57% of treatment students and 66% of control students met state passing standards in mathematics (Shapley et al., 2005, p. 60).

While various “mediating variables” have been offered by TCER to explain these differences in student achievement results measured through traditional test scores, the fact remains that year 1 results of the TxTIP project are exactly the reverse of those predicted by the research hypothesis and the goals of the project and study. The impact that a variety of differentiating factors could have had on these results remains to be analyzed by TCER or other researchers.

It is not realistic or reasonable, particularly given the implementation timeline differences on TxTIP campuses in 2004-2005, to expect a positive impact on test scores YET. This caveat should be identified when anyone (researcher or politician) discusses these results. Unfortunately, many people (policymakers included) may not make this distinction, and may also have a myopic focus on the single issue of student test scores.

There are multiple ways to measure “success” in a one-to-one technology learning initiative, and this is the focus of my dissertation.



Comments

One response to “Oversimplifying TxTIP year 1 results”

  1. Conn McQuinn

    Good heavens. Anybody with any knowledge of educational reform and research would know that you can’t expect significant measurable results in one year.

    First off, the first impact of any major change is a *decrease* in student performance due to the disruption caused by the change. The studies done by Apple Classrooms of Tomorrow indicated that it takes 3-5 years before you can expect to see the full impact of this level of implementation. So actually, the fact that the immersion students remained the same as or slightly lower than the control students is, in itself, a positive result.

    Second, the TAKS results for a 6th grader do not reflect what a student learned in 6th grade. It measures what the student learned in grades K-5 and seven months in grade 6. Making big changes in the 6th grade curriculum is too little, too late.

    And that’s all beside the point of putting all your eggs in one narrow assessment basket.