Research: Laptops Do Not Increase Academic Achievement in Reading and Writing
“There were no statistically significant effects of immersion on the TAKS reading and writing.”
By Donna Garner –
With the Texas Legislature almost ready to begin its 81st Regular Session in January 2009, I am sure the technology lobbyists are out in full force. For years, they have been trying to pressure Legislators to pass legislation that would force taxpayers to fund laptops for every student in the Texas public schools. The question is: Do laptops on every student’s desk raise academic achievement?
In January 2008 a report entitled Evaluation of the Texas Technology Immersion Pilot: Outcomes for the Third Year (2006-07) was released (http://www.tcer.org/research/etxtip/documents/y3_etxtip_quan.pdf). Based upon four years of solid research, here is the answer to the academic achievement question: “There were no statistically significant effects of immersion on the TAKS reading and writing.” [TAKS — the Texas Assessment of Knowledge and Skills — was the battery of tests used to measure academic achievement for TIP.] It seems that laptops on every desk did not raise student academic achievement in the most important foundational skills a student will ever learn — reading and writing.
With the downturn in the economy across our nation, it is more important than ever to make sure that our tax dollars are well spent. I hope that Texas Legislators will read the following information about the Texas Technology Immersion Pilot (TIP) and make responsible decisions based upon this scientific research.
BACKGROUND ON THE TEXAS TECHNOLOGY IMMERSION PILOT (TIP)
The Technology Immersion Pilot (http://www.tea.state.tx.us/technology/tip/) was created by the Texas Legislature in 2003. Senate Bill 396 called for the Texas Education Agency (TEA) to establish a pilot project to “immerse” schools in wireless laptops. The mandate came without any funding, but through a competitive grant process, the TEA used more than $20 million in federal monies to fund the TIP project. Concurrently, a federal research study has been evaluating whether student achievement improves over time through this immersion in laptops. The Texas Center for Educational Research is a non-profit research organization in Austin that has been working with the TEA for four years (2004-2008) to produce research-based results.
*Since January 2008, two more reports (July 2008 and December 2008) have been produced that emphasize other aspects of laptop immersion; but neither focuses on the lack of academic achievement on TAKS reading and writing. (Please see links posted at the bottom of this article.)
My concern is that the capstone report (December 2008 — Progress Report on the Long-Range Plan for Technology, 2006-2020) produced for the 81st Legislature really seems to “dance around” the most important issue, which is academic achievement. Instead, the report puts out information on issues of secondary importance (e.g., whether students and teachers like laptops, whether the immersion has been deep enough, whether students’ computer skills have improved, whether discipline problems have decreased, whether teachers have received enough technology training, etc.). These may be interesting to study in and of themselves but do not really get to the heart of the matter, which is whether laptops indeed improve students’ reading and writing skills appreciably — enough to justify the huge expenditure to provide individual laptops for all students in Texas. Legislators may be prone to read only the December 2008 TIP report and disregard the January 2008 TIP report that holds the real “meat” of the issue.
Evaluation of the Texas Technology Immersion Pilot: Outcomes for the Third Year (2006-07), January 2008
Following are excerpts taken from this report:
Methodology, p. i: “The overarching purpose of the study is to scientifically investigate the effectiveness of technology immersion in increasing middle school [Grades 6 – 8] students’ achievement in core academic subjects as measured by the Texas Assessment of Knowledge and Skills (TAKS).”
Setting, p. ii: “The research includes 42 grades 6 to 8 middle schools drawn from rural, suburban, and urban locations in Texas…The study focused on three student cohorts in the third year. Cohort 1 included eighth graders (2,586 treatment, 2,863 control) who completed their third project year, Cohort 2 included seventh graders (2,644 treatment, 2,882 control) who finished their second project year, and Cohort 3 included sixth graders (2,597 treatment, 2,840 control) who concluded their first project year. Students in the cohorts were predominantly minority (65%) and economically disadvantaged (67%). In the third year, a total of 1,253 teachers participated in the study, including 591 in immersion schools and 662 in control schools.”
Excerpt from p. iv: “After controlling for student and school poverty, there were no statistically significant effects of immersion on the TAKS reading growth rates for either Cohort 1 (eighth graders) or Cohort 2 (seventh graders)…For Cohort 3 sixth graders, after controls for students’ prior achievement, demographic characteristics, and school poverty, there was no statistically significant effect of immersion on students’ 2007 TAKS reading scores.”
Excerpt from p. vii: “Writing. For both Cohorts 1 and 2, after controlling for pretest writing scores (4th grade writing), demographic characteristics, and school poverty, there was no statistically significant effect of immersion on students’ TAKS writing scores as seventh graders. The immersion effect was negative across both cohorts.” If I am interpreting this data correctly, it means that the immersed students’ TAKS writing scores came out lower than the scores of comparable control students. In other words, students who used laptops extensively wrote worse, not better, than students who did not, though not by a statistically significant margin. This is a very dismal finding because proponents of laptops have sold the public on the idea that students will improve their writing if they only have their own laptops.
Excerpt from page 16: “…Cohorts 1, 2, and 3 students…complete TAKS reading and mathematics assessments annually, so all student cohorts have pretest and posttest measures…Cohort 2 students completed the TAKS writing assessment in 2004 (4th grade) and 2007 (7th grade).”
Excerpt from Page 68-69: “After controlling for student and school levels of poverty, results show there was no statistically significant effect of immersion on Cohort 1 students’ growth rate for TAKS reading.”
Excerpt from Page 69: “TAKS reading outcomes for Cohort 2, similarly, showed no statistically significant effect of immersion on seventh graders’ reading achievement.”
Excerpt from Page 71: Cohort 3, Sixth Grade — “TAKS reading outcomes for sixth graders reported in Table 6.6 show that after controlling for students’ prior reading achievement, demographic characteristics, and the level of school poverty, there were no statistically significant differences in the 2007 TAKS reading T scores for students in immersion and control schools.”
Excerpts from Page 76: “Cohort 2 students completed the TAKS writing assessment as fourth graders in 2004 and again as seventh graders in 2007…Results for Cohort 2 students show that after controlling for students’ pretest writing scores, student demographic characteristics (gender, ethnicity, economic status), and campus poverty level, there was no statistically significant difference in the 2007 TAKS writing T scores for students in immersion and control schools.”
Excerpt from page 76: “In the third project year, we examined the effects of immersion on Cohort 1 students (eighth graders who attended schools for three years), Cohort 2 students (seventh graders who attended schools for two years), and Cohort 3 students (sixth graders who attended one school year). Key findings are the following:
• TAKS reading. After controlling for student and school poverty, there were no statistically significant effects of immersion on the TAKS reading growth rates for either Cohort 1 or Cohort 2 students.
Excerpts from page 77: “For Cohort 3, after controls for students’ prior achievement, demographic characteristics, and school poverty, there was no statistically significant effect of immersion on students’ 2007 TAKS reading T scores.”
Excerpts from page 77: “TAKS writing. After controlling for Cohort 2 seventh graders’ pretest writing scores (fourth grade), demographic characteristics, and campus poverty, there was no statistically significant difference in the TAKS writing scores for immersion and control students. The immersion effect was negative but not by a significant margin.”
Excerpts from page 92: “Students at immersion schools, compared to control, reported mounting technical problems over time when they used computers at school. Cohorts 1 and 2 immersion students reported increasing technical problems using computers across years compared to control students, with the growth in problems statistically significant for Cohort 1 (eighth graders). Cohort 3 students at immersion schools (sixth graders), who inherited laptops that had been used by students during two previous school years, also reported significantly more technical problems than control-group students.”
Excerpts from page 93-94: “Increasing middle school students’ academic achievement in core subjects as measured by state assessments is the ultimate goal of technology immersion…Analyses for Cohort 1 included about 1,380 immersion and 1,600 control students, Cohort 2 included about 1,550 immersion and 1,725 control students, and Cohort 3 included about 1,780 immersion and 1,990 control students.”
Excerpts from page 94: “Technology immersion had no statistically significant effect on students’ TAKS reading achievement. After controlling for student and school poverty, there were no statistically significant effects of immersion on the TAKS reading growth rates for either Cohort 1 (eighth graders) or Cohort 2 (seventh graders).”
Excerpts from page 96: “Similarly, there was no statistically significant immersion effect on the 2006 writing scores for Cohort 1 students who completed writing assessments during the second project year. The immersion effect, similar to Cohort 2, was negative…”
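For readers who are not statisticians, the phrase “no statistically significant effect” that recurs in the excerpts above simply means the gap between the laptop and control groups was too small to distinguish from ordinary chance variation. The toy calculation below is my own illustration with invented scores — it is not data from the report, and the researchers used far more sophisticated growth models than this simple two-group comparison — but it shows the underlying idea:

```python
import statistics

# Invented scores for two small groups, on the TAKS T-score scale
# (mean 50, standard deviation 10). These are NOT the report's data.
control = [48.0, 52.0, 50.0, 49.0, 51.0, 50.0, 47.0, 53.0]
laptop = [49.0, 51.0, 50.0, 50.0, 52.0, 48.0, 51.0, 50.0]

def welch_t(a, b):
    """Welch's t statistic: the group-mean gap in standard-error units."""
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

t = welch_t(laptop, control)
print(round(t, 2))    # 0.15
print(abs(t) < 1.96)  # True: the gap is within sampling noise,
                      # i.e., "no statistically significant effect"
```

A difference only counts as “significant” at the conventional 5% level when the t statistic exceeds roughly 1.96; the report found that the laptop-control gaps in reading and writing never cleared that bar.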
In summation, laptops on every student’s desk have not raised academic achievement in reading and writing, probably because students are losing valuable class time to activities on their computers that may be entertaining but do not produce solid academic achievement. Legislators need to heed the TIP research and provide state dollars to equip our public schools with adequate computer opportunities for all students, without jeopardizing the teacher-student relationship that develops from direct, systematic instruction.
TECHNOLOGY IMMERSION PILOT (TIP) LINKS
List of Immersed Schools:
List of Control Schools:
*A Report to the 81st Legislature – December 2008 — Progress Report on the Long-Range Plan for Technology, 2006-2020 (page 9 – Technology Immersion Pilot):
*Third-Year (2006–07) Traits of Higher Technology Immersion Schools and Teachers, July 2008 — http://www.tcer.org/research/etxtip/documents/y3_etxtip_qual.pdf
Published December 20, 2008
The Texas Rural Technology (R-TECH) Pilot Program
Second Interim Evaluation Report — February 2010
[Our Texas political leaders are pushing digitized textbooks for our public school students. I say, “Please look at the research.”
Are these politicians looking at the research studies to see whether students are increasing their academic achievement through technology-based learning?
The Texas Technology Immersion Pilot — TTIP (2004-2008) — never could prove that students’ academic achievement had been increased by the $20 million taxpayers spent on student laptops. (Please see my article posted at the bottom.)
Now we have yet another evaluation from a second technology project called the Texas Rural Technology (R-TECH) Pilot Program. It looks as if we are still waiting for the huge expenditures of taxpayers’ dollars on technology — $7.8 million — to produce higher achievement scores in our Texas students.
Either the technology is not as effective as vendors portray it to be, or the software and/or online programs that students are using with their technology are not rich enough in content to elevate students’ test scores.
I also have to wonder how hard students are really working on their “taxpayer-funded technology-based supplemental education programs,” or whether they are just wasting a great deal of instructional time playing games on their techie toys at taxpayers’ expense.
As I read the following, it seemed to me as if the R-TECH researchers went out of their way to “make excuses” for the lack of increased academic achievement. I bet if the students’ TAKS test scores had increased, none of these supposed “excuses” would have been mentioned in the report.
A final and more inclusive R-TECH report is to be produced in the fall of 2010. I guess we will need to wait for that final report before drawing any definite conclusions. However, it bothers me tremendously that not only are R-TECH students not improving their TAKS scores, but they are even experiencing lowered score results. — Donna Garner]
Introduction to R-TECH
In 2007, the Texas Legislature (80th Texas Legislature, Regular Session, 2007) authorized the creation of the Texas Rural Technology (R-Tech) Pilot program, which provides $8 million in funding to support rural districts in implementing technology-based supplemental education programs…
R-Tech grants were awarded in two periods, or cycles. The Texas Education Agency (TEA) awarded approximately $6.3 million in funding to 64 districts in Cycle 1 grant awards, and $1.5 million in funding to 19 districts in Cycle 2 grant awards…
Grantee districts receive $200 per student served by R-Tech in state funding for each year of the grant and are required to provide matching funds of $100 per student per grant year.
(Excerpts from pp. iv–vii)
Research Question 4: What is the Effect of R-Tech on Student Outcomes?
The sections that follow present results from analyses of R-Tech on students’ TAKS outcomes. However, test results are a limited indicator of R-Tech program effects because most standardized tests lack the sensitivity needed to measure incremental increases in student achievement produced by supplemental programs such as R-Tech. Given this limitation, readers are asked to consider this report’s findings as preliminary. The evaluation’s final report (fall 2010) will include a broader range of student outcome data, including graduation and attendance rates, advanced course completions, and indicators of college readiness, that were not available at the time of this report’s writing.
The effect of access time. Students who spent more time using R-Tech resources did not experience improved testing outcomes relative to students who spent less time with resources. However, results should be interpreted with caution because researchers were not able to control for unobserved student differences that may have affected outcomes. For example, students who spent more time using R-Tech resources may have been at greater academic risk, requiring more remediation time than students who used R-Tech for briefer periods. If this was the case, then the lack of effect for time spent accessing R-Tech may reflect the characteristics of the students identified for more intensive support rather than the effects of the support itself.
Program type. The small number of districts offering one-to-one tutoring with online instructional support, technology immersion programs, and iPods loaded with instructional content prevented their inclusion in the statistical analysis of program type; therefore, analyses were limited to students participating in self-paced programs and dual credit courses. Students participating in self-paced programs experienced reduced TAKS scores in reading/ELA relative to R-Tech students who participated in other program types; however, self-paced programs had no effect on TAKS outcomes in mathematics, science, and social studies. Again, results should be interpreted with caution because it was not possible to control for the student characteristics that may have caused students to be identified for self-paced programs. If students identified for self-paced programs had more serious academic deficiencies than students identified for other types of R-Tech programs, then results may have been produced by unobserved student characteristics rather than program participation.
Supplemental vs. non-supplemental instruction. Students who received R-Tech services as supplemental instruction offered outside of the regular school day experienced reduced TAKS testing outcomes in social studies relative to students who participated in R-Tech as part of the regular school day (i.e., non-supplemental programs). The effects of supplemental programs on students’ reading/ELA, science, and mathematics were persistently negative, but not by statistically significant levels. These findings suggest that R-Tech services implemented as part of regular instruction may improve students’ TAKS outcomes; however, the characteristics of students identified for supplemental services may have affected outcomes. That is, students identified for supplemental services may have struggled academically, while students participated in non-supplemental services irrespective of academic need, which may indicate that testing outcomes reflect the effects of students’ academic characteristics rather than program participation.
“Legislators: Digitized (Computerized) Textbooks Will Not Work”
by Donna Garner
March 17, 2009
During school years 2004 through 2008, the Texas Education Agency conducted a pilot program, funded with federal monies, to see whether immersing students in laptop computers would significantly improve student learning. This pilot program (the Texas Technology Immersion Pilot — TTIP) cost taxpayers more than $20 million.
Legislators need to pay careful attention to the TTIP conclusions before jumping haphazardly into authoring and passing bills that require school children’s hardcover textbooks to be ditched in favor of computerized versions.
What were the results of the TTIP? What did the taxpayers get for their $20 million? For all that money spent, did student learning improve appreciably?
I read through the lengthy report (Jan. 2009) and have come up with the following succinctly worded conclusions. To verify my statements, please read the entire TTIP report:
- Even though students were heavily immersed with expensive laptops ($1,100 – $1,600 per student) and teachers were trained extensively in immersion strategies for over four years, little, if any, gain in student academic achievement was attained.
- Instead, laptops broke, costing large sums to repair.
- Student self-directed learning did not show positive gains.
- During all but the fourth year, laptop students attended school less regularly.
- Cohorts 2 and 3 did not improve their reading achievement.
- Control group students did better in writing than did the laptop immersed students.
- Cohorts 2 and 3 improved their math scores a bit, but Cohort 1 did not sustain the positive effect into the high-school years.
- Cohort 2 did not improve their 2008 science or social studies scores.
- The researchers concluded, “Given the financial and logistical challenges of implementing and sustaining the Technology Immersion model, statewide implementation may not be possible.”
I stand by my long-held beliefs that nothing will ever take the place of a live teacher working directly and systematically with a live student.
I do believe that certain knowledge-based, academic material can be learned effectively online; and I am the writer/consultant for such an online offering — MyStudyHall.com — that teaches students grammar and other valuable English / Language Arts / Reading (ELAR) skills.
Our excellent online tutorial provides teachers with supplementary units to help students hone their English and communication skills, but we at MyStudyHall.com do not support the granting of course credit for ELAR courses done through distance learning.
So far as I know, none of the TTIP schools utilized our online tutorial; but they should have. Perhaps if they had encouraged their students to “soak” in our in-depth content instead of skipping merrily through the usual TAKS-Prep Lite materials, the immersed laptop students would have improved their scores.
The brain research says that children must connect concepts through close cognitive progression if they are ever going to move those concepts into long-term memory. MyStudyHall.com provides just that type of instruction by carefully building grammar units that grow in depth and complexity as students master each new unit.
We at MyStudyHall provide students with a pleasing and accessible way to learn important knowledge-based, academic skills; but it is the teacher who must hold students accountable to use these skills correctly and consistently in their speaking and writing. No computer can do that.
I stand by my contention that it is not a wise use of taxpayers’ dollars to provide laptops for all classroom students; and I do not believe laptops should ever replace the direct, systematic instruction of a teacher working face-to-face with a classroom of students. The Texas Technology Immersion Pilot study confirms my beliefs.
Extrapolating out the results of the TTIP’s findings should also serve to dissuade Legislators who want to do away with hardcover textbooks by replacing them with digitized (computerized) versions.
If Legislators genuinely want to increase public school students’ academic skills, then these policymakers should pay close attention to the results of the $20 million TTIP study. Computerized textbooks delivered on student laptops are not the answer to the educational gaps among our public school students.
To read the entire TTIP Jan. 2009 report:
Here are excerpts taken from this report:
Evaluation of the Texas Technology Immersion Pilot: Final Outcomes for a Four-Year Study (2004–05 to 2007–08) — January 2009
Texas Education Agency
Texas Center for Educational Research
P.O. Box 679002
Austin, Texas 78767-9002
512.467.3632 512.467.3658 (fax)
The Technology Immersion Pilot (TIP), created by the Texas Legislature in 2003, was based on the assumption that the use of technology in Texas public schools could be achieved more effectively by “immersing” schools in technology rather than by introducing technology resources, such as hardware, software, digital content, and educator training, in a cyclical fashion over time. The Texas Education Agency (TEA) invested more than $20 million in federal Title II, Part D monies to fund Technology Immersion projects at high-need middle schools through a competitive grant process. Concurrently, a research study partially funded by a federal Evaluating State Educational Technology Programs grant has investigated whether student achievement improved over time through exposure to Technology Immersion. The Texas Center for Educational Research (TCER) was TEA’s partner for a four-year evaluation of the implementation and effectiveness of the Technology Immersion model (p. i).
…Through an expert review process, the TEA selected three lead vendors to provide Technology immersion packages (Dell Computer Inc., Apple Computer Inc., and Region 1 Education Service Center [ESC]). Of the 21 Technology Immersion schools studied in the evaluation, 5 middle schools selected the Apple package, 15 selected the Dell package, and 1 school selected the Region 1 ESC package (with Dell computers) (p. i).
…The fourth-year evaluation provides final conclusions about the effects of Technology Immersion on schools, teachers, and students. This report combines information gathered during the fourth project year (2007-08) with data from the first-through-third implementation years (2004-05 through 2006-07). The study’s quasi-experimental research design has allowed inferences about the causal effects of Technology Immersion through comparisons between 21 treatment schools and 21 control schools (p. ii).
Setting and Participants
The 42 participating schools included Grades 6 to 8 middle schools drawn from rural, suburban, and urban locations across Texas. Middle schools were typically small (about 400 students, on average); however, enrollments varied widely (from 83 to 1,447 students). About two-thirds of schools were located in small or very small Texas districts (less than 3,000 students), and about a third were in very large districts (10,000 or more students). Students in the study were mostly economically disadvantaged (67%) and they were racially and ethnically diverse (roughly 58% Hispanic, 7% African American, and 36% White) (p.ii).
The study focused on three student cohorts in the fourth year. Cohort 2 included eighth graders (2,578 treatment and 2,858 control students) who finished their third immersion year; Cohort 3 included seventh graders (2,547 treatment and 2,845 control students) who concluded their second year. We also examined achievement data for Cohort 1 students (2,469 treatment students and 2,748 control group students) who had attended Technology Immersion and control schools from sixth through eighth grade and then attended traditional high schools in the fourth year (high schools that typically did not provide individual laptops for students) (p. ii).
…As laptops aged over four years, students at Technology Immersion schools, compared to control, reported more technical problems when they used computers at school. In the fourth year, students in Technology Immersion schools reported technical problems with computers at more than twice the rates reported by control students. Eighth graders (Cohort 2) and seventh graders (Cohort 3) who often inherited second-hand laptops and had used those laptops across school years reported significantly more technical problems than control-group students. Although various technical problems occurred rarely (a few times a year) or just sometimes (once or twice a month), problems with deteriorating laptops substantially increased the workloads of technical-support staff, who often were already overburdened with technical demands (p. iv).
…Technical Problems. Given the increased availability of technology in immersion schools and classrooms, we reasoned that students might encounter more technical problems. Thus, we asked students to indicate on a 5-point scale about how often various Technical Problems happened when they tried to use a computer at school. Across Cohorts 2 and 3, growth rates showed that immersion students reported more technical problems using computers compared to control students. Figure 5.2 shows that Cohort 2 immersion students initially reported fewer technical problems than control students, but by the end of eighth grade, both economically advantaged and disadvantaged immersion students reported more technical troubles (p. 50).
Across four evaluation years, there was no evidence linking Technology Immersion with student self-directed learning or their general satisfaction with schoolwork. Findings from three student cohorts across four evaluation years showed there was no statistically significant effect of Technology Immersion on student Self-Directed Learning, as measured by the Style of Learning Inventory. As both immersion and control students progressed from lower to higher grade levels, their responses to statements measuring self-direction (e.g., goal setting, self-efficacy beliefs, and intrinsic effort) revealed significantly negative growth trends. Similarly, there was no significant difference in the levels of satisfaction with schoolwork expressed by treatment and control students. Across all middle schools, students became less satisfied with the meaningfulness and relevance of their schoolwork as they advanced to higher grade levels (pp. iv – v).
Although prior research suggested that the individualized learning opportunities allowed through one-to-one technology would positively affect students’ self-regulated learning, our results, consistent with previous years, revealed no significant immersion effects on Cohort 2 students’ growth in self-direction. As both immersion and control students progressed through eighth grade, their responses to statements revealed significantly negative growth trends…Overall findings indicated that students did not consider themselves to be strongly self-directed learners (pp. 51-52).
…For the first-through-third evaluation years, students at Technology Immersion schools had significantly lower school attendance rates than control students—however, in the fourth year, attendance-rate differences between treatment and control students were smaller and statistically nonsignificant. Unexpectedly, students at Technology Immersion schools attended school less regularly than control students across the first three years (p.v).
…Technology Immersion had no statistically significant effect on TAKS reading achievement for Cohort 2 (eighth graders) or Cohort 3 (seventh graders)—however, for Cohort 1 (ninth graders), there was a marginally significant and positive sustaining effect of Technology Immersion on students’ TAKS reading scores. After controlling for student and school poverty, there were no statistically significant effects of immersion on the TAKS reading growth rates for either Cohort 2 or Cohort 3 (p. vi).
…Technology Immersion had a statistically significant effect on TAKS mathematics achievement for Cohort 2 (eighth graders) and Cohort 3 (seventh graders). For Cohort 1 (ninth graders), the sustaining effect of immersion on TAKS mathematics scores was positive but not by a statistically significant margin (p.vi).
Conclusions about the effects of Technology Immersion on TAKS social studies and science scores remain in doubt. However, outcomes for TAKS writing, which involved the administration of the TAKS assessment in traditional paper-and-pencil format, have consistently favored control students although not by statistically significant margins…Available results have revealed no statistically significant differences between treatment and control groups for TAKS social studies, science, or writing scores. Treatment-control group differences for science and social studies have varied from year to year, whereas outcomes for TAKS writing have consistently favored students at control schools. Across evaluation years, seventh graders in immersion schools, on average, have had lower TAKS writing scores…(p. vii).
…Findings from four years suggest that Technology Immersion can be implemented and is sustainable if districts and schools are committed to the model—however, other approaches to technology use may be appropriate for some districts and schools. Over four years, it became evident that Technology Immersion involved more than just buying laptops for students. Technology Immersion is a comprehensive model for transforming the school culture, and the nature of teaching and learning, and expanding the educational boundaries of the school. This study shows that fundamental school change is difficult and requires a long-term commitment at all levels of the school system (board members, superintendent, principals, teachers, students, and parents). Given the financial and logistical challenges of implementing and sustaining the Technology Immersion model, statewide implementation may not be possible. However, those districts and schools that are committed to Technology Immersion should have state support for their innovative school-reform efforts. At the same time, other districts and schools should receive support for alternative technology initiatives that have research-based evidence of effectiveness (p. xi).
…Through an expert review process, the TEA selected three lead vendors as providers of Technology Immersion packages (Dell Computer Inc., Apple Computer Inc., and Region 1 Education Service Center [ESC]). Package costs, which ranged from about $1,100 to $1,600 per student, varied according to the numbers of students and teachers, the type of laptop computer, and the vendor provider. Of the 21 immersion sites studied in the second through fourth years, 5 middle schools selected the Apple package, 15 selected the Dell package, and 1 school selected the Region 1 ESC package (Dell computer) (p. 17).
Conclusions – Student Achievement (p. 71)
In the fourth and final project year, we examined the effects of Technology Immersion on Cohort 2 students (eighth graders who attended middle schools for three years), Cohort 3 students (seventh graders who attended middle schools for two years), and Cohort 1 students (ninth graders who attended middle schools for three years and then enrolled in mainly traditional high schools). Key findings are the following.
• TAKS reading. After controlling for student and school poverty, there were no statistically significant effects of immersion on the TAKS reading growth rates for either Cohort 2 students or Cohort 3 students. The immersion effects were positive but very small. Across both student cohorts, positive mean growth trajectories showed that economically disadvantaged students and students in schools with above average levels of poverty grew in reading achievement at faster rates than their more affluent peers. For Cohort 1, postimmersion and control ninth graders attending high schools, there was a positive enduring effect of Technology Immersion on treatment students’ TAKS reading growth rate that approached statistical significance (p < .06).
• TAKS mathematics. After controlling for student and school poverty, Technology Immersion had a statistically significant effect on the TAKS mathematics growth rates for both Cohort 2 and Cohort 3 students. The TAKS mathematics scores of immersion students increased across years, whereas scores for control students decreased. For Cohort 1, post-immersion and control ninth graders attending high schools, there was a positive but statistically nonsignificant sustaining effect of Technology Immersion on TAKS mathematics achievement.
• TAKS science. After controlling for prior science achievement, demographic characteristics, and school poverty, there was no statistically significant effect of immersion on Cohort 2, eighth graders’ 2008 TAKS science scores. The estimated immersion effect was positive but very small.
• TAKS social studies. After controlling for Cohort 2, eighth graders’ reading achievement (seventh grade), demographic characteristics, and school poverty, there was no statistically significant effect of immersion on 2008 TAKS social studies scores. The estimated immersion effect was virtually zero (.006 T-score point).
• TAKS writing. After controlling for Cohort 3 seventh graders’ pretest writing scores (fourth grade), demographic characteristics, and campus poverty, there was no statistically significant difference in the TAKS writing scores for immersion and control students. The estimated immersion effect was negative but very small.
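A note on the phrase “after controlling for…” that recurs throughout these findings: it means the analysts compared immersion and control students statistically as if both groups had started from the same baseline. The toy regression below is my own invented illustration — the numbers are made up and the report used multilevel growth models, not this simple ordinary-least-squares sketch — but it shows why the adjustment matters: a raw comparison can show a large gap that disappears once pretest scores are taken into account.

```python
# A toy "controlling for pretest" regression. All names and numbers are
# invented for illustration; they are not the TIP report's data.

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)]
           for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Suppose laptop students happened to start with higher pretest scores.
pre_control = [40.0, 45.0, 50.0, 55.0, 60.0]
pre_laptop = [50.0, 55.0, 60.0, 65.0, 70.0]

def posttest(p):
    # The same rule for both groups: the laptop has NO true effect.
    return 10.0 + 0.8 * p

# Columns: intercept, laptop indicator (0/1), pretest score.
X = ([[1.0, 0.0, p] for p in pre_control] +
     [[1.0, 1.0, p] for p in pre_laptop])
y = [posttest(p) for p in pre_control + pre_laptop]

intercept, b_laptop, b_pretest = ols(X, y)

raw_gap = sum(y[5:]) / 5 - sum(y[:5]) / 5
print(round(raw_gap, 1))        # 8.0: a naive comparison looks like a gain
print(abs(round(b_laptop, 6)))  # 0.0: the "effect" vanishes once we
                                # control for where students started
```

The same logic runs in the other direction, too: a raw gap of zero can hide a real effect. Either way, the adjusted coefficient, not the raw group difference, is what the report's repeated "after controlling for" language refers to.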