Ubiquitous technology integration in Canadian public schools: Year one study

Canadian Journal of Learning and Technology

Volume 32(1) Winter / hiver 2006

Jennifer Sclater

Fiore Sicoly

Philip C. Abrami

C. Anne Wade


Jennifer Sclater was an ICT Consultant at the Centre for the Study of Learning and Performance, Concordia University, Montreal, Quebec, and is currently Coordinator, Education (youth programs) with the World Anti-Doping Agency, Montreal, Quebec.

Fiore Sicoly, PhD, is a Professor at the Institute of University Partnerships & Advanced Studies, Georgian College, Barrie, Ontario and is a Collaborator with the Centre for the Study of Learning and Performance, Concordia University, Montreal, Quebec.

Philip C. Abrami, PhD, is a Professor, Director and Research Chair at the Centre for the Study of Learning and Performance, Concordia University, Montreal, Quebec, Canada. Correspondence concerning this article can be sent to abrami@education.concordia.ca.

Anne Wade is Manager and Information Specialist at the Centre for the Study of Learning and Performance/Education, Concordia University, Montreal, Quebec.


Abstract: The current investigation was an exploration of the first year of a multi-year project designed to provide every Grade 3 to Grade 11 student throughout an English school board in Quebec with a laptop computer. Data were collected from 403 elementary and 270 secondary students from the experimental school board and from 330 students in the control school board. In addition, questionnaire data were collected from 60 elementary school teachers and 51 secondary school teachers. Finally, interviews were conducted with 72 students and 20 teachers. Potentially the most interesting finding was the difference in achievement scores between the experimental and control boards. Secondary students from the experimental board had higher scores on the CAT-3 reading test and reported using computer technology six times more frequently in their English classes, suggesting a possible treatment effect. In contrast, math scores were higher at the control board, although neither board reported high levels of computer use in math classes. Nevertheless, these findings must be interpreted with some caution until the threat to validity posed by selection bias is more clearly overcome.

Résumé: L’investigation en cours était un examen de la première année d’un projet pluriannuel qui visait à offrir un ordinateur portable à tous les élèves de la 3e à la 11e année d’une commission scolaire anglophone du Québec. Les données ont été recueillies auprès de 403 élèves de l’élémentaire et de 270 élèves du secondaire de la commission scolaire expérimentale ainsi qu’auprès de 330 élèves de la commission scolaire témoin. En plus, on a recueilli des données tirées de questionnaires remis à 60 enseignants de l’élémentaire et à 51 enseignants du secondaire. Enfin, des entrevues ont été menées avec 72 élèves et 20 enseignants. La conclusion la plus intéressante s’avère peut-être la différence entre les résultats de réussite entre les commissions scolaires expérimentale et témoin. Les étudiants du secondaire de la commission scolaire expérimentale ont obtenu de meilleurs résultats pour les tests de lecture CAT-3 et ils ont indiqué utiliser six fois plus souvent la technologie informatique dans leur cours d’anglais, laissant croire que le traitement a peut-être eu des répercussions. Par ailleurs, les résultats en mathématiques étaient supérieurs à la commission scolaire témoin alors qu’aucune commission n’a indiqué des taux d’utilisation élevés de l’ordinateur. Néanmoins, ces conclusions doivent être interprétées avec discernement jusqu’à ce qu’on gère mieux les menaces à l’égard de la validité du biais de sélection.

Ubiquitous Technology Integration in Canadian Public Schools: Year One Study

Enthusiasm for, as well as apprehension regarding, the use of technology for learning appears widespread as we herald the arrival of the Information Age. To some, computer technology can be used as a powerful and flexible tool for learning (Harasim, Hiltz, Teles & Turoff, 1995; Lou, Abrami & d’Apollonia, 2001; Scardamalia & Bereiter, 1996). Indeed, there is sufficient optimism about the potential of technology to have a positive impact on learning that governments have established task forces and dedicated substantial research funds to identifying and promoting ways to deliver or enhance instruction with the use of technology. At the same time, there is considerable scepticism (Healy, 1998; Russell, 1999) about the use of technology to improve learning, and even a belief that it may pose serious threats to education. For example, some fear that an imbalance between computer skills and essential literacy skills may be created; that technology dependency and isolation may be fostered rather than independent and interdependent learners; and that the joy of and motivation for learning may be eroded, replaced by frustration with failed equipment. Many teachers hold beliefs concerning the usefulness of information and communication technologies (ICT) that parallel their attitudes towards any change to teaching and learning, be it through government-mandated reform or societal pressure. “If the computer can accomplish the task better than other materials or experiences, we will use it. If it doesn’t clearly do the job better, we will save the money and use methods that have already proven their worth” (Healy, 1998, p. 218).

Technology integration and student achievement

What has the research evidence revealed about the impact of technology integration, broadly defined, on student learning? There are now numerous narrative as well as quantitative reviews exploring the primary research on the impact of computer use on student achievement. The summaries vary; some suggest positive impacts on student learning while others are more equivocal, suggesting the evidence does not yet justify concluding that technology impacts positively and pervasively on student learning.

There are numerous examples of promising results. Kulik and Kulik (1989) cited several reviews, dating as far back as the 1980s, that found positive effects of computer-based instruction on student performance, with gains ranging from 0.22 to 0.57 standard deviations. Schacter (1999) cited several studies that reported higher achievement, motivation, and engagement for students in a technology-enriched environment. In their meta-analysis, Waxman, Lin, and Michko (2003) found positive, albeit small, effects of teaching with technology on student outcomes. Sivin-Kachala and Bialo (2000) included studies that reported gains in the areas of language arts and reading, mathematics, science and medicine, social studies, foreign and second language acquisition, and programming languages such as LOGO. Kulik (2003) found that most studies examining the impact of the word processor on student writing showed improved writing skills, as well as positive impacts of teaching programs in math and in the natural and social sciences. Goldberg, Russell and Cook (2003) conducted a meta-analysis of the effect of computer technology on student writing from 1992 to 2002. Their results suggested that students who used computers when learning to write produced written work 0.4 standard deviations better than students who did not use computers.
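The gains cited above are expressed as standardized mean differences (effect sizes): the difference between treatment and control group means divided by the pooled standard deviation. As a worked illustration of how such a figure is computed (the numbers below are entirely hypothetical and are not drawn from any study cited here), a minimal sketch:

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    pooled_var = ((n_treat - 1) * sd_treat ** 2 + (n_ctrl - 1) * sd_ctrl ** 2) \
                 / (n_treat + n_ctrl - 2)
    return (mean_treat - mean_ctrl) / math.sqrt(pooled_var)

# Hypothetical groups: treatment mean 54, control mean 50, both with SD 10.
d = cohens_d(54.0, 50.0, 10.0, 10.0, 100, 100)
print(round(d, 2))  # 0.4, i.e., the treatment group scores 0.4 SD above control
```

An effect size of 0.4 therefore means the average treated student outscored the average control student by four-tenths of a standard deviation, regardless of the test's raw scale.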

Other reviews of the literature are less enthusiastic. Coley, Cradler and Engel (2000) concluded that drill-and-practice forms of computer-assisted instruction can be effective in producing achievement gains in students. In contrast, studies of more pedagogically complex uses of technology generally have been less conclusive, offering only promising and inviting educational vignettes (Coley et al., 2000). Fuchs and Woessmann (2004) initially found positive effects of home computer use on mathematics achievement. After adjusting for family background and school characteristics, however, they found that “the mere availability of computers at home is negatively related to student performance in math and reading, and the availability of computers at school is unrelated to student performance” (p. 17). Ungerleider and Burns (2002), reviewing mostly Canadian research, found little methodologically rigorous evidence that computer technology promotes achievement, motivation, or metacognitive learning, or that it improves instruction in content areas in elementary and secondary schools. They also emphasized that student academic achievement does not improve simply as a result of having access to computers in the classroom without concurrent changes to instruction. We share their concern that methodologically sound studies must be undertaken with proper experimental and statistical controls. We also believe it is important to conduct longitudinal investigations of pervasive and ubiquitous attempts at technology integration.

One-to-one computer implementations

The majority of research on technology integration in schools has been conducted where students have limited access to technology, either learning in dedicated computer labs for select periods during the week or in classrooms where technology is available at a ratio of several students per computer. More recently, interest has shifted to more widespread technology use, in particular settings in which each student is provided with a computer for use throughout the day.

The aim of the Maine Learning Technology Initiative (MLTI) was to provide every seventh and eighth grade student in the state with a laptop computer. Although the phase one summary report and mid-year evaluations of the MLTI (Gravelle, 2003; Lane, 2003; Sargent, 2003; Silvernail et al., 2003; Silvernail & Lane, 2004) present promising findings about the initiative’s impact on student learning and achievement, the qualitative and quantitative data obtained from surveys, case studies, interviews, focus groups and observations were collected only from students, teachers, superintendents and principals who had laptops. No data were collected from control participants without laptops. Davies (2004) evaluated the impact of the MLTI on one class. She found positive changes in the way students learn (more risk-taking), in what they learn, in the context for teaching and learning, and in student willingness to engage in collaborative learning.

The Henrico County Public Schools deployed laptop computers to all students from Grades 6 to 12 in the district. Similar to the MLTI, Henrico’s Teaching and Learning Initiative strives to close the digital divide among students and integrate technology across the curriculum. Davis, Garas, Hopstock, Kellum and Stephenson (2005) surveyed 29,022 students, teachers, school administrators, and parents on their experiences and opinions of the initiative. The report simply outlines perceived benefits and limitations regarding the use of laptops. Data on student learning outcomes were not collected.

The Peace River Wireless Writing Project involves the deployment of laptop computers to students in five Grade 6 and 7 classes. The goal of this project is to increase student performance in writing expression (Jeroski, 2003). Early results reveal an increase in writing scores on the B.C. Performance Standards compared to the previous year (Jeroski, 2004).

Rockman et al. (2000) conducted a three-year evaluation of Microsoft’s Anytime Anywhere Learning Program, to examine the impact of the laptop program on teaching and learning, and on the ways in which laptops might be supporting constructivist pedagogy. The research covered 13 schools from eight different sites. They used both internal (within the school) and external control groups (another school). Researchers used student and teacher survey data, collected logs of computer use, gathered writing samples, interviewed school administrators, and analyzed scores from state and nationally normed assessments. Rockman et al. found positive changes in student writing, student collaboration, and an increase in student confidence towards computing. Results from standardized achievement measures were inconclusive.


The findings of these laptop initiatives are promising but not definitive. There is a lack of rigorous, methodologically sound research on one-to-one laptop programs, especially large-scale, board-wide projects, and on the impact these programs have on student learning. The current investigation was an exploration of the first year of a multi-year project designed to provide every student from Grade 3 (cycle 2) to Grade 11 (secondary 5) throughout an English school board in Quebec with a laptop computer. The primary focus of this study was to explore changes in student learning and teaching as a result of the laptop integration. The objectives of the current research were consistent with the goals established by the experimental school board: to explore the nature and extent to which technology supports or impedes student learning, motivation, attitudes, self-concept, and self-regulation.


Method

The technology initiative involves the deployment of laptop computers to every student in the school board from elementary cycle 2 (Grades 3 and 4) to secondary 5 (final year of high school) over a three-year period. Three elementary schools volunteered and were subsequently selected to serve as lead schools for the initiative. Cycle 3 students at these schools received laptops in May of 2003. The first wave of large-scale deployment, which took place in October of 2003, involved the distribution of laptops to cycle 3 students at the board’s twenty elementary schools and secondary 5 students at the three high schools. A second school board in Quebec volunteered to serve as a control. The control school board used technology in a way typical of other public school deployments: on average there were several students per computer and, in general, technology supplemented teaching and learning in both the quality and the quantity of time spent.

Data collection occurred between January and April 2004. The study conformed to Canada’s Tri-Council Policy on the ethical treatment of research participants.

Student participants

There were both elementary and secondary school participants.

Elementary level

The groups at the elementary level comprised cycle 3 (Grades 5 and 6) students from the twenty elementary schools. Although most cycle 3 students in the board received laptops in October 2003, students in the three lead schools received their laptops in May 2003. This meant that at the time of the first round of data collection, cycle 3 level two students (Grade 6) in the lead schools had had their laptops for seven months (n=103), whereas the other students had had them for only four months (n=300).

Secondary level

The deployment of laptops at the high school level included Grade 9 and Grade 11 students. Nonetheless, data collection at the secondary level focused only on Grade 11 students. Grade 11 students were selected because, in the previous year, they had completed uniform provincial exams in mathematics, physical science, and history of Quebec and Canada. It was also possible to obtain these provincial exam results for Grade 11 students in the control school board. Data were collected from all three high schools in the experimental board (n=270) and four high schools from the control board (n=330).

Teacher participants

Data were collected from teachers at the elementary (n= 60) and secondary levels (n= 51). At the elementary level, data were collected from cycle 3 teachers. At the secondary level, data were collected from all secondary 5 teachers at the experimental and control schools.


Procedure

A team of two to three data collectors was sent to each school. The data collectors gave the students instructions for completing the response sheets and provided a context for each questionnaire. The Technology Implementation Questionnaire (TIQ) was sent to the schools. Teachers were given instructions for completing the questionnaire, including the procedure for returning it to the school board.

Interviews were conducted at the three lead schools, three elementary schools that were similar in size and setting to the lead schools, at three high schools in the experimental board, and four high schools in the control board. School administrators were asked to select teachers and students who best represented the school population. Teachers were interviewed individually while student interviews were conducted as focus groups of five to seven students at each school. Each interview and focus group session was approximately 20 to 30 minutes in length.

Interview data and responses to open-ended TIQ questions were coded using predetermined and emergent codes. Digital text files of the open-ended responses were compiled. HyperResearch was used for coding audio files (interview data) and text files.


Measures

Multiple data collection instruments were used in this research.

Technology Implementation Questionnaire (TIQ)

A modified version of the TIQ (Wozney, Venkatesh, & Abrami, 2006) was distributed to teachers. The TIQ examines technology integration in the classroom, including both degree and quality of use, focusing on both advantages and drawbacks of integration. The TIQ was used to help describe and understand differences across teachers in planning and implementing technology for learning.

Student Questionnaire

The Student Questionnaire was partially based on the NANS/SIAA research on school success (Janosz, 2003). This questionnaire was modified to further define socio-economic status for individual students, and to match students for the purpose of comparing achievement scores. Several questions explored the impact of technology on student attitudes and motivation to learn. It was also used to explore student perceived advantages and disadvantages of technology use.

The Student Questionnaire also drew questions from the PedTech2 computer questionnaire for students (Lowerison, Sclater, Schmid, & Abrami, in press). These questions were used to collect data concerning student use of computers both inside and outside of the school environment. These data were used to establish whether any difference in student achievement was associated with computer use in general or with computer use in the classroom specifically, and, if the latter, what qualities of use can affect student learning.

Canadian Achievement Test, 3rd Edition (CAT-3)

Only the Basic Battery was used for collecting achievement data. The test consisted of a series of multiple-choice questions in reading/language (comprehension, vocabulary and language) and mathematics. A test consultant from the Canadian Testing Centre conducted an item-by-item analysis of the Basic Battery to confirm the test’s compatibility with the curriculum of the Quebec Education Program.

Self-Description Questionnaire

The SDQ (Marsh, 1999) was used to collect self-concept data from students. Elementary students completed level one while the secondary students completed level two of the SDQ. This questionnaire was used to determine whether the use of technology had an effect on student self-concept. The SDQ included questions specific to academics as well as more general self-esteem (e.g., body image, ability to make and keep friends).

Surveys of Self-Regulation

The Academic Self-regulated Learning questionnaire (Ryan & Connell, 1989) and the Academic Motivation Scale (Vallerand et al., 1992) were used to collect data on student self-regulation. The Academic Self-regulated Learning questionnaire (SRL-A) was distributed to the elementary students, while the Academic Motivation Scale (AMS), a variation of the SRL-A for older students, was distributed to secondary students. These questionnaires identify student motivation for learning and for completing school-related tasks.


Interviews

Student and teacher interview data were used to supplement the quantitative research findings. For instance, the interviews served to help describe and understand differences across teachers in planning and implementing technology for learning.

A total of 72 students, representing twelve schools, were interviewed in focus groups of five to eight. Thirty-seven students at the cycle 3 level (19 from the three lead schools and 18 from the remaining schools) and 35 students at the secondary 5 level (17 from the three experimental high schools and 18 from three control high schools) volunteered to be interviewed. The analysis of the interview data focused on students’ appreciation of computer technology integration for learning, their perceptions of the impact technology has on their learning, and the problems encountered with the integration of technology.

Teacher interviews were conducted to verify the extent to which computer technology is integrated into teaching and how its use affects teaching and learning. The analysis of the interview data focused on teacher perceptions about introducing computer technology into their teaching, their actual use of technology for teaching, their perceptions of the impact technology has on student learning and achievement, and the adequacy of professional development and technical support. Two teachers per school were selected by the school administration to take part in a one-on-one, face-to-face interview. Eleven elementary teachers and nine secondary teachers participated in the interviews.

Training materials

Training materials used by the experimental school board were reviewed to gain a clearer understanding of the support and professional development offered to the teachers. As well, data collected by the school board through focus group sessions with teachers were examined. The experimental board offered teachers a three-day training session during the summer and consultants from the school board met with small groups of teachers (usually three consultants for every 20 teachers) to discuss strategies for using the computers in the classroom and for sharing future directions, successes and frustrations.

Provincial Exam Scores

Scores on secondary 4 provincial exams were used in conjunction with the high school CAT-3 scores. All secondary 4 students in the province of Quebec wrote common end-of-year exams in mathematics, physical science, and history of Quebec and Canada. An item analysis was conducted to establish the correspondence between level 19/20 of the CAT-3 math test of the Basic Battery and the secondary 4 provincial math exam.

Indice milieu socio-économique

The Indice milieu socio-économique is a formula used by the Quebec Ministry of Education (MEQ, 2002) (now the Ministry of Education, Leisure and Sport) to determine each school’s socio-economic status. Schools are given ratings based on statistics on mothers’ level of education and whether at least one parent in the household holds a full-time job. These data are based on the geographic area where the schools draw their students, not necessarily on the specific families within the school.

Data Analysis Process

Reliability of the data collection instruments was assessed using Cronbach’s alpha. A Cronbach’s alpha value of .70 is generally considered acceptable, although higher values (.80 or above) are preferred. Reliability data could not be calculated for the Canadian Achievement Test results because these data were scored externally.
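For readers unfamiliar with the statistic, Cronbach's alpha for a scale of k items is k/(k-1) multiplied by (1 minus the ratio of the sum of the item variances to the variance of the total scores). A minimal sketch of the computation follows; the responses are invented for illustration and are not data from this study:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    items: one list per questionnaire item, each holding every student's response.
    """
    k = len(items)       # number of items in the scale
    n = len(items[0])    # number of students

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each student's total score across the k items
    totals = [sum(item[s] for item in items) for s in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Invented responses: three items, five students, on a 1-5 scale.
alpha = cronbach_alpha([[4, 3, 5, 2, 4],
                        [4, 2, 5, 3, 4],
                        [5, 3, 4, 2, 5]])
print(round(alpha, 2))  # 0.89
```

Higher alpha indicates that the items in a sub-scale vary together, which is why the composite sub-scale scores reported below are treated as internally consistent measures.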

The Student Questionnaire included 52 items. In some cases, related items were combined to form composite scores (e.g., eight questionnaire items that asked students to rate how easy it was to perform a variety of tasks using a computer were combined into a single score). For item-clusters derived from the Student Questionnaire, reliabilities ranged from .50 to .76 for elementary students, with an average value of .68, and from .46 to .89 for secondary students, with an average value of .73.

The SDQ for elementary students contains 72 items that can be used to create eight sub-scales. Cronbach’s alpha associated with these sub-scales ranged from .81 to .94 with an average value of .86. The SDQ2 for secondary students contains 102 items that can be used to create 11 sub-scales, although only eight were used in this research. Cronbach’s alpha for these SDQ2 sub-scales averaged .86 with a range from .82 to .91.

The SRL-A for elementary students contains 32 items that are used to produce four sub-scores. Cronbach’s alpha associated with these sub-scores ranged from .75 to .86 with an average of .82. The AMS for secondary students contains 28 items that are used to generate seven sub-scores. Cronbach’s alpha for these sub-scores ranged from .77 to .86 with an average of .82.

Missing Data

Several of the instruments used in this research (e.g., SDQ, AMS, SRL-A) combined clusters of related questionnaire items into sub-scales. For example, the 72 items on the SDQ at the elementary level were combined into eight sub-scales. It sometimes occurred that students completed all of the items in a particular sub-scale except for one. In this situation, the SDQ instructions for analysis recommend that missing item scores be replaced by the mean score for that particular item. This strategy allows students to be retained in the analysis instead of being removed because a sub-score could not be created due to the missing information for a single item. This procedure does not replace missing data indiscriminately: missing data were replaced only for a small percentage of students and only when those students had provided responses for at least 90% of the questionnaire. The same approach for replacing missing data was used when creating sub-scale scores for the remaining instruments in the research. It should be noted that missing data were replaced only when clusters of items were used to create composite or sub-scale scores. When analyzing data for individual items, students with missing data were simply excluded from the analysis for that particular item.
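The item-mean replacement rule can be sketched as follows. The helper name `impute_item_means` and the data are hypothetical, and the sketch only illustrates the rule described above (replace a missing item with that item's mean, but only for students who answered at least 90% of the items); the study itself followed the SDQ's own analysis instructions:

```python
def impute_item_means(responses, min_complete=0.90):
    """responses: one list per student; None marks a missing item.

    Replaces a missing item with the mean of the observed responses to that
    item, but only for students who answered at least min_complete of the
    items; students below the threshold are dropped from the analysis."""
    n_items = len(responses[0])

    # Mean of the observed (non-missing) responses for each item
    item_means = []
    for j in range(n_items):
        observed = [r[j] for r in responses if r[j] is not None]
        item_means.append(sum(observed) / len(observed))

    retained = []
    for r in responses:
        if sum(v is not None for v in r) / n_items >= min_complete:
            retained.append([item_means[j] if v is None else v
                             for j, v in enumerate(r)])
    return retained

# Hypothetical 10-item sub-scale: the second student skipped one item
# (90% complete, so retained); the third skipped three items (dropped).
data = [[3] * 10,
        [4] * 9 + [None],
        [5] * 7 + [None] * 3]
clean = impute_item_means(data)
```

This keeps nearly complete respondents in the sub-scale analysis while excluding students whose records are too sparse for a composite score to be meaningful.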


Results

We report the findings for several measures and for both elementary and secondary school students and teachers.

Technology Implementation Questionnaire Results

The TIQ was completed and returned by 60 elementary teachers and 51 secondary teachers. Of the elementary teachers, 14 were from lead schools while the remainder (46) were from schools that implemented the technology initiative at a later date. Of the secondary teachers, 25 were from the experimental board while 26 were from control schools. Given the relatively small number of teacher respondents, some caution is warranted when interpreting the TIQ results presented in this section of the report. As part of a separate study, data for a province-wide sample of elementary (N=448) and secondary teachers (N=276) were collected during the 2002-03 school year. TIQ results for the province-wide sample are presented for comparison.

The proportion of teachers who reported student access to computer technology as good, very good, or excellent was higher among elementary (about 90%) and secondary teachers (71%) in the experimental board than among secondary teachers in control schools (20%) or teachers in the province-wide sample (51%). The percentage of teachers who responded that access to computer resource personnel was good, very good or excellent was highest in elementary lead schools (71%) and lowest in control secondary schools (23%) and schools in the province-wide sample (40%).

Elementary teachers (72% to 86%) were approximately twice as likely as secondary teachers (48% and 35% for experimental and control respectively) to report that they frequently integrate computer technologies into their learning activities. In the province-wide sample, the rate of integration was approximately 29%. Teachers were also asked to report their proficiency levels in relation to computer technologies. Average or higher levels of proficiency were most prevalent among elementary teachers located in lead schools (93%) and least prevalent among experimental secondary teachers (68%) and teachers in the province-wide sample (62% for elementary). Teachers in the province-wide sample (36%) and secondary teachers in experimental schools (36%) were least likely to report adaptation or creative application of technology in the curriculum, while elementary teachers in lead schools were most likely to fall into this category (62%). Elementary teachers in lead schools expressed more positive views regarding the implementation of technology compared to teachers in schools in the Year 1 implementation group. For example, teachers in lead schools were less likely to indicate that computer technology results in students neglecting important traditional learning resources, makes classroom management more difficult, or demands that too much time be spent on technical problems. On the other hand, elementary teachers in lead schools were more likely to indicate that computer technology promotes student collaboration, promotes the development of students’ interpersonal skills, and motivates students to get more involved in learning activities.

Although elementary teachers were generally more positive, the majority of respondents from the lead schools (79%) also acknowledged that the use of computer technology in the classroom requires extra time to plan learning activities. The vast majority of experimental teachers further agreed that there must be adequate teacher training in the uses of technology for learning (approximately 90%) and that technical staff must regularly maintain computers (approximately 93%).

In general, the views expressed by secondary teachers in experimental schools were more negative than the views of secondary teachers in control schools. For example, secondary teachers in experimental schools were significantly more likely to indicate that computer technology: a) requires extra time to plan learning activities; b) is too costly in terms of resources, time and effort; c) requires software-skills training that is too time consuming; and d) will increase the amount of stress and anxiety that students experience. On the other hand, experimental secondary teachers were less likely to indicate that computer technology: a) increases academic achievement (e.g., grades); b) is an effective tool for students of all abilities; and c) helps accommodate students’ personal learning styles.

A majority of teachers (71% to 93%) use computers to prepare handouts, tests/quizzes, and homework assignments for students. In a related questionnaire item, they also report using word processors with the same frequency. Teachers frequently report using the Internet to search for information for a lesson (58% to 90%). Relatively high percentages of teachers also report the frequent use of computer technology to create lesson plans (28% to 65%) and of e-mail to communicate with other teachers (20% to 44%). There appears to be heavy use of digital video and cameras by elementary teachers in lead schools (64%). Secondary teachers (about 70%) are far more likely than elementary teachers to use computers to keep track of student grades and marks.

Elementary Student Results

Results were analyzed for Grade 5 and 6 students separately but were remarkably consistent across the two grades, and so were aggregated. Female students (90%) were significantly more likely than male students (70%) to report that they enjoy what they learn in school. Likewise, female students (82%) were more likely to indicate that school is fun compared to male students (60%). More than 90% of boys and girls indicated that it is important to succeed in English and mathematics. Compared to males (75%), a higher percentage of girls (91%) reported that their writing skills in English are strong or very strong. Results by gender on the CAT-3 confirmed the superior performance of female students. Female students achieved a significantly higher level of performance in reading, language, spelling, computation, total reading score, and overall test battery score. Male students displayed higher performance in mathematics than did female students, although this difference was not statistically significant.

In the area of computer use, few gender differences appeared among elementary students. Almost all students reported having a computer at home that they could use (89% and 91% for boys and girls respectively). Access to the Internet at home was less widespread but again no gender differences emerged (74% and 77% for boys and girls respectively). Approximately 79% of students reported using a computer at home a few times per week or almost every day. In contrast, 91% of students reported using a computer at school. Boys and girls were equally likely (84%) to report that they enjoyed using a computer to complete their schoolwork.

The number of students who report using a computer for one hour or more per week is as follows for each class: English (65%), French (42%), social studies (36%), science (30%), and mathematics (14%). When students were asked how often they used a computer to do homework outside class for various subject areas, English was identified most often (35%) and mathematics was lowest (8%). When looking for information on a particular topic, more than 70% of students look first to the Internet rather than consulting books, CD ROMs, or another person. Boys and girls (77%) are equally likely to report that using computers to complete their schoolwork helps them learn.

Results were compared for students following an Individual Education Plan (IEP) with non-IEP students. On the SRL-A, there are no statistically significant differences between the responses of IEP and non-IEP students. IEP students provided significantly lower ratings on the following SDQ sub-scales: physical abilities in sports and games; relationship with parents; ability, enjoyment, and interest in reading; ability, enjoyment, and interest in mathematics; general ability, enjoyment, and interest in school; and general satisfaction with self. In addition, IEP students demonstrate significantly lower levels of performance on every CAT-3 subtest. Nonetheless, IEP students report that it is equally easy to use computers for performing a variety of tasks. IEP students are slightly (but not significantly) less likely to use a computer during classes but significantly more likely than other students to use a computer for homework outside classes. IEP students are as likely as other students to report using a computer for schoolwork at home. Finally, IEP students are equally likely to report that using computers to do homework helps them learn.

At the time of data collection, Grade 6 students in lead schools had used laptop computers for a longer period than students in the remaining elementary schools, albeit only several months longer. Analyses were conducted to compare these two groups of students (identified as lead and year 1). This comparison was intended to examine any differential impact on students in the lead schools who had access to the laptop computers for a longer period of time than other Grade 6 students.

Between group comparisons on several measures (e.g., attitudes, self-regulation and CAT-3) favoured students in year 1 schools even though students in lead schools had used the laptops for a longer period. This result is contrary to what would be expected. However, there were large and significant differences in achievement pretest scores that favour the students in the year 1 schools and there were a greater number of special needs students in the lead schools. This suggests that the populations of students in lead and year 1 schools may not be sufficiently equivalent prior to the integration of technology. Furthermore, we were not able to adequately adjust for these pretest differences using analysis of covariance.

Secondary Student Results

Overall, secondary students are far less likely than elementary students to indicate that they like school and enjoy what they learn at school. While experimental students are significantly less likely than control students to have a computer at home, there is no significant difference in terms of how often a computer is used for schoolwork at home. As expected, experimental students are far more likely than control students to use a computer at school in all of their classes. For instance, the majority of experimental students use a computer in French (76%) and English (68%) classes, compared to only 5% of control students. In contrast, usage of computers in math classes was dramatically lower in both experimental (18% for males and 4% for females) and control secondary schools (1% for both males and females). Experimental students also report significantly higher usage of computers outside class to do their homework (p < .05).

Although experimental students were far more likely to use a computer at school, did this promote more positive attitudes? Experimental students demonstrated a significantly lower degree of interest in computers (53% vs. 67%, p < .05) and less enjoyment when using computers to complete their schoolwork (58% vs. 64%, p < .05). Finally, experimental students (38%) were considerably less likely than control students (62%) to indicate that using computers for schoolwork helped them learn.

Results for the SDQ2 and AMS show that there were no significant differences between students in experimental and control high schools on the following dimensions: physical abilities in sports and games; physical appearance or attractiveness; honesty-trustworthiness; relationship with parents; ability, enjoyment, or interest in reading and mathematics; general ability, enjoyment, or interest in school; and general satisfaction with self. Responses to the AMS provided by experimental and control students were very similar except on the Amotivation sub-scale, where experimental students reported significantly higher (p < .05) scores (e.g., “I can’t see why I go to school and frankly I could care less”). Amotivation was also significantly higher among male students.

CAT-3 results showed that experimental secondary students achieved somewhat higher levels of performance in reading (51st percentile vs. 48th percentile, p > .05). In contrast, control students obtained significantly higher scores in mathematics (50th percentile vs. 45th percentile, p < .05). In order for such comparisons to be valid, we must have confidence that the experimental and control students were not significantly different before the laptop initiative was implemented in experimental schools. If there are pre-existing differences between the two groups, then it becomes much more difficult to evaluate the impact of the laptop initiative on experimental students. The issue of non-equivalent groups is complex, and the problem does not have a simple solution. One strategy we used was the analysis of subsets of the data. Another strategy with merit is covariance analysis, which is used to statistically reduce or remove the influence of pre-existing differences.

In this research, several potential covariates were examined. Student Questionnaire items asked participants to provide information about the education and employment status of their parents. Past research has demonstrated that such socio-demographic variables are often correlated with students’ academic performance. For this reason, it was reasonable to adjust for the education level and employment status of parents when analyzing data for experimental and control students. Nonetheless, the use of these variables in the covariance analysis had no substantial effect on results and, for this reason, they are not reported.

MEQ scores in mathematics, science, and history were also potential covariates. These test scores were collected during the 2003 school year, before the laptop initiative was implemented in experimental secondary schools. MEQ scores indicated that experimental students in the research sample scored significantly lower in mathematics and science than did control students before the technology initiative was implemented. It was also observed that MEQ scores were significantly correlated with reading, language, and mathematics achievement scores on the CAT-3 (Pearson’s r values ranged from .31 to .53, p < .001). It was therefore reasonable to use MEQ scores as covariates in an effort to remove some of the influence of these pre-existing differences in performance. The covariance analysis increased the difference between experimental and control students in reading and decreased the difference in mathematics. Nonetheless, the overall pattern remained the same and, for this reason, detailed covariance results are not reported.
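The logic of the covariance adjustment described above can be sketched numerically: remove the part of posttest variance predicted by the pretest covariate, then compare group means on the adjusted scores. The sketch below is a minimal toy illustration with invented data, not the study's actual ANCOVA models or records; all names and numbers are hypothetical.

```python
def adjusted_means(groups):
    """Classic ANCOVA adjusted means using the pooled within-group slope.

    groups: dict mapping group name -> list of (covariate, outcome) pairs,
    e.g., (MEQ pretest score, CAT-3 posttest score).
    """
    # Pooled within-group regression slope of outcome on covariate.
    num = den = 0.0
    for pairs in groups.values():
        cx = sum(x for x, _ in pairs) / len(pairs)
        cy = sum(y for _, y in pairs) / len(pairs)
        num += sum((x - cx) * (y - cy) for x, y in pairs)
        den += sum((x - cx) ** 2 for x, _ in pairs)
    slope = num / den
    # Grand mean of the covariate across all students.
    all_x = [x for pairs in groups.values() for x, _ in pairs]
    grand_x = sum(all_x) / len(all_x)
    # Adjusted mean: group outcome mean shifted to the grand covariate mean.
    return {name: sum(y for _, y in pairs) / len(pairs)
                  - slope * (sum(x for x, _ in pairs) / len(pairs) - grand_x)
            for name, pairs in groups.items()}

# Invented (MEQ, CAT-3) pairs for illustration only.
groups = {
    "experimental": [(60, 50), (70, 55), (80, 62)],
    "control":      [(70, 52), (80, 60), (90, 66)],
}
adj = adjusted_means(groups)
```

In this toy example, the experimental group's lower pretest mean raises its adjusted posttest mean relative to its raw mean, mirroring in miniature how adjustment could widen a reading difference while shrinking a mathematics difference.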

A second strategy, namely matching, was also used to achieve equivalence between control and experimental students at the pre-test stage so that possible differences on the post-test could be more accurately evaluated. Each experimental student was matched to a control student of the same gender who had the same MEQ pre-test score, creating two groups of students with equal MEQ scores at the pre-test stage. Matching was done using MEQ scores for mathematics, science, and history as the pre-test: with each test separately, and then with the three tests combined using a simple sum. In approximately 85% of cases, exact matches were found. When experimental and control test scores could not be matched exactly, test scores within one mark of each other were used (and, in a few rare cases, within two marks).
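The matching procedure can be sketched as a greedy one-to-one pairing: each experimental student is paired with an unused control student of the same gender whose MEQ pretest score is identical, falling back to the closest score within one mark when no exact match exists. This is an illustrative reconstruction under those stated assumptions, with invented records, not the study's actual matching code or data.

```python
def match_students(experimental, control, tolerance=1):
    """Greedy one-to-one matching on gender and MEQ pretest score."""
    unused = list(control)
    pairs = []
    for exp in experimental:
        # Same-gender controls whose MEQ score is within the tolerance.
        candidates = [c for c in unused
                      if c["gender"] == exp["gender"]
                      and abs(c["meq"] - exp["meq"]) <= tolerance]
        if not candidates:
            continue  # no acceptable match; this student is dropped
        # Prefer an exact score match, else the closest within tolerance.
        best = min(candidates, key=lambda c: abs(c["meq"] - exp["meq"]))
        unused.remove(best)
        pairs.append((exp, best))
    return pairs

# Invented records for illustration only.
experimental = [{"id": 1, "gender": "F", "meq": 78},
                {"id": 2, "gender": "M", "meq": 64}]
control = [{"id": 10, "gender": "F", "meq": 78},
           {"id": 11, "gender": "M", "meq": 65},
           {"id": 12, "gender": "M", "meq": 90}]
pairs = match_students(experimental, control)
# Student 1 matches exactly; student 2 matches within one mark.
```

Running the procedure once per MEQ test (and once on the summed scores) would yield the separate matched samples the analysis compares.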

Results differed somewhat depending on which MEQ test was used for matching. For example, consider the CAT-3 reading results. Experimental students had some advantage on CAT-3 reading scores when mathematics or science MEQ scores were used for matching, but this trend reversed when history was used. When all three tests were combined, the experimental advantage in CAT-3 reading emerged but was not statistically significant (p = .10).

When considering the CAT-3 scores for math, experimental students were significantly lower when matching is based on history scores. When science or math MEQ scores are used for matching, math CAT-3 scores for experimental students are quite similar to those for control students. When the sum of all three MEQ tests is used for matching, CAT-3 math scores are significantly higher for control students.

Teacher Interviews

Almost all of the 20 teachers interviewed described their role as that of a facilitator or guide who listens to students and who, at times, has to take on the role of a parent. They described their roles as opening doors, pointing students in new directions, helping them understand and share knowledge, and helping students develop critical thinking skills and work autonomously. Their goal is to stimulate students to want to learn so that they can work to their full potential. Three of the elementary school teachers stated that the introduction of computer technology changed their role. One teacher reported that the laptops changed the style of learning in her classroom from teacher-led to student-centred. Another stated that the laptops “jostle her leadership role” and that she felt she was not always ahead of the students. None of the secondary school teachers at the experimental schools felt their roles had changed with the advent of the laptops. Three teachers emphasized that the need to teach to the test, in subjects with provincial exams, imposes a major constraint on the integration of computer technology in their teaching.

The majority of the teachers from both levels felt that they had a good predisposition toward the use of computer technology in teaching. The interview data showed that experimental elementary and secondary teachers were using computer technology in fifteen and eleven different types of activities, respectively. Control teachers, on the other hand, were limited to three types of activities: searching the Internet for information, using email, and doing projects.

The single most frequently reported activity for which technology was used was searching the Internet for information (14 teachers). Control teachers made much less use of computer technology in their teaching. The second most frequently reported activity among experimental teachers related to literacy and included writing (ten teachers) and grammar and spelling (two teachers). Eight teachers reported using computer technology for projects and for creating PowerPoint presentations for use in class. They also reported sharing files and information over the server (seven teachers) and using email and chat (six teachers). This was followed by creative activities such as creating digital videos (five teachers), mathematics (four teachers), and reviewing (three teachers). Two elementary teachers stated that they use technology as an incentive for students to complete their work.

Almost all experimental teachers reported that students were enthusiastic when they received the laptop computers. Initial negative responses from students were reported by three teachers and involved only one or two students in their classes, who changed their minds after using the computers. The novelty, however, wore off with time. This was especially true for secondary 5 students, who were reported to become frustrated with the technical problems associated with using their laptop computers. This frustration was further compounded by the restrictions the schools imposed on downloading games and music from the Internet. The loss of enthusiasm did not appear to be as noticeable with the cycle 3 students, although two elementary teachers mentioned that, with time, students brought their computers home less often than when they first received them. One teacher suggested that his/her students became more systematic about when and how they used their laptop computers.

Teachers (13) from both levels reported that the integration of computer technology in learning increased student motivation. Several teachers observed an initial increase in the amount of writing students were producing. One elementary teacher commented that she now has to place a limit on the amount of written text produced, “If I asked them to write a story before, they would write a little bit and stop. Now, they write pages and pages.” Another teacher added, “I think of one kid in particular who wouldn’t want to write and now he writes … He said he didn’t like writing with a pencil.”

On the other hand, the secondary 5 teachers were more reserved in their assessment. One teacher suggested that there was an increase in motivation but not to the extent that she had expected. Others qualified their statements such as one teacher who stated “For a couple of students who have been very unmotivated to get things done, it’s been great because they’ve got something accomplished. Maybe the first thing I’ve got in from them when it’s writing.”

Student Interviews

The students from the six experimental elementary schools were enthusiastic about having laptop computers in the classroom. To the question: Would you go back to not using your laptop computer in school if you had the choice? all students answered a resounding “no.” One student answered, “No, we’re born with that technology.” Students from three elementary schools said that, given the choice, they choose to use the computer to do their work because it’s faster. One student said that he enjoyed using his laptop for learning because it makes difficult work easier.

All of the experimental students at the secondary 5 level said that they enjoyed using computer technology for learning. Students in one school stated that they would not want to go back to having to use computer rooms and labs to work. “It’s good because you work on only one computer,” said one student. Another stated, “I could live without it, but it’s easier with.” Nevertheless, the secondary 5 students’ responses were not as enthusiastic as those of the cycle 3 elementary students, and they tended to be more reserved in their assessments.

The most important reason for having computer technology for learning, as reported by the students of all experimental schools, was having rapid access to up-to-date information using the Internet and WorldBook for assignments, projects or presentations. The students believed that the information they found on the Internet was more reliable than what they found in books in the library because it was continuously updated. As one secondary 5 student stated “The greatest benefit, I’d say, is probably being able to do research right here in class. We don’t have to go to the library where things are outdated.”

The second most important benefit for using the laptops was for writing. Several students reported that it is easier to type than to write by hand and that “It improves on sloppy writing so teachers can read your work.” One elementary level student stated, “I like my laptop because I used to hate writing stories, but now I’m better at it.”

Experimental secondary school students (six schools) believed that they learned more because of the laptop computers. In general, they associated this with the fact that the computers (and Internet access) are fast and that they can find more information on specific topics. Several students also described their learning experience similarly to one student who said “It’s a different kind of experience because with the laptops you’re learning how to type, you’re learning math, you’re learning how to research the internet, you’re learning grammar, and you’re learning different things like that. You’re also learning how to put things together and check how technology works.” When asked if the teacher asked more of them, one student responded, “Teachers ask more but it is so easy to get the information that you don’t notice it.”


The university researchers and their school partners worked diligently to ensure the highest methodological standards of research. This partnership meant there was special support and credibility offered to those who conducted the data collection, and overall this enhanced the ecological validity of the research. Nevertheless, there were several problems that undermined this field-based inquiry.

At the secondary level, the researchers compared the experimental board with a control board, using statistical control and matching in order to approximate the equivalence produced by random assignment of participants. This was only partially successful. Had a larger sample of sites been available, closer matching of groups might have been possible. In addition, the researchers attempted to use statistical control to remove the effects of extraneous influences and reduce the possibility that threats to internal validity operated as alternative explanations to a technology treatment effect. This too was only partially successful. The supply of control variables, especially ones that correlated with the outcome measures, was limited. For example, geographic differences (e.g., urban versus rural), mother-tongue differences, the context of data collection (e.g., seriousness of student completion of instruments, time available), prior achievement differences, and a host of other factors were not completely controlled for experimentally or statistically.

At the elementary level, no control data were available. Nor did the timing of the research allow for the use of comprehensive pretesting prior to implementation of the laptop initiative. In future, more time and the use of the current data as baseline indices should make strong quasi-experimental designs possible (e.g., non-equivalent pretest-posttest control group design). Finally, the research process may have been weakened by differential selection of respondents that could be attributed to the nature of the consent forms and related procedures.

It also bears noting that none of the between-board comparisons, and certainly none of the within-board comparisons, examine technology use versus non-use. It is not a case of whether computers are used but of degree of use and length of exposure. This may have resulted in diminished treatment effects.

As a consequence of these problems, we have been careful to temper our conclusions. In addition, we attempted to triangulate the findings, wherever possible, to ensure that a conclusion seemed justifiable from several sources of evidence.

Technology Use

Almost all students, elementary and secondary, reported having a computer at home that they could use for schoolwork. Access to the Internet at home was less widespread but still reported by the majority of students (about 75%). More than 35% of students used a computer at home to complete schoolwork three or more hours per week. On this measure, there was no difference between elementary and secondary students, no difference between experimental and control students, and no difference between males and females. In contrast, approximately 45% of elementary students and 65% of secondary students who had received laptops reported using a computer for schoolwork at school three or more hours per week. This figure was only 20% for secondary students in the control board. This clearly shows that increased access to computers leads to a dramatic increase in the usage of computers for completing schoolwork. There is little evidence of a gender gap in the use of computers, at school or at home. Elementary IEP students have lower achievement scores but report that it is equally easy to use computers for performing a variety of tasks. IEP students are slightly (but not significantly) less likely to use a computer during classes but significantly more likely than other students to use a computer for homework outside classes.

The integration of computers is not equally apparent in all subject areas. The number of students who report using a computer is highest for English, lowest for mathematics, and moderate for social studies and science. As expected, experimental students are far more likely than control students to use a computer at school in all of their classes. For instance, the majority of experimental students (about 70%) at the secondary level use a computer in French and English class compared to only 5% for control students. The usage of computers in mathematics classes was dramatically lower in both experimental (18% for males and 4% for females) and control secondary schools (1% for both males and females).

Attitudes and expectations

The motivational dispositions of teachers should support expectations of success with technology (e.g., I can use technology with my students to help them learn), beliefs that the use of technology has value for learning (e.g., my students will learn more and at a higher level), and that the costs of technology use are manageable (e.g., the time to integrate technology into my classroom is reasonable). Similarly, students should increase their beliefs about technology as a tool for learning, value the tool, and see limited costs associated with its use as a pedagogical tool. Motivation to learn is a key building block of successful educational reform because it addresses the core concern of providing energy for action and directing it towards a meaningful goal.

In this regard, the perceptions of teachers and students were quite interesting. In general, students held positive attitudes towards the use of technology for learning. Elementary students had more positive views about school in general and also about the use of technology to enhance learning.

There was also some variability among teacher attitudes. As would be expected, teachers in the experimental board reported much higher access to computer technology and resources than did secondary teachers in the control board. In general, elementary teachers in the experimental board, and especially those in lead schools, held more positive attitudes than secondary teachers. Attitudinal differences found between the lead and year one elementary schools are interesting. Most of the significant TIQ findings show more positive results for the schools that received laptop computers earlier. It may be that over time, initial teacher concerns about the integration of technology will be reduced with experience. It could also be that lead schools were better prepared for the deployment than were schools that received the laptops at a later stage. Although elementary teachers were generally more positive, even they acknowledged that the use of computer technology in the classroom requires adequate teacher training, technical support, and extra time to plan learning activities.

While high school teachers in the control board report higher levels of technology integration than at the experimental schools, control students have overwhelmingly indicated that this is not the case. This would be consistent with speculations voiced by administrators from the experimental board that students would be the driving force behind the success of the project. It is also consistent with the paradigm shift from the traditional teacher-centred classrooms that preceded the reform of the Quebec Education Program to the student-centred environment that technology is supposed to facilitate.

Secondary teachers in the experimental board expressed the most negative views about technology. The implementation of the laptop initiative may have increased apprehension about change among teachers at the experimental schools. Principals at these high schools have indicated that their teachers simply are not eager to use laptops. Overall, high school teachers seem especially resistant to change when it is imposed by outside forces, as was the case with the laptop initiative. School board officials now realize that if new initiatives come from within the school, teachers will be more likely to adopt them. In support of this hypothesis, teachers from control schools who use technology voluntarily report much higher ratings of integration and satisfaction. This suggests that strategies for change using technology may need to vary depending on the level of teaching and the context in which computers are being deployed.

Academic outcomes

In terms of key academic indicators, potentially the most interesting finding is the difference in CAT-3 scores between the experimental and control boards. Secondary students at the experimental schools had higher scores on the CAT-3 reading test and indicated making six times more frequent use of computer technology in their English classes, suggesting a possible treatment effect. Others have found that the use of technology has a positive impact on student writing, especially at the secondary level (Goldberg, Russell, & Cook, 2003; Jeroski, 2003; Lowther, Ross, & Morrison, 2003; Rockman, et al., 2000). In contrast, mathematics scores were higher at the control schools, where neither board indicated high levels of computer use for math. Nevertheless, these findings must be interpreted with some caution until the threat to validity posed by selection bias is more clearly overcome. Otherwise, the measures of core academic competencies, student self-concept, and student self-regulation serve as useful and important baseline measures against which future progress can be measured.

Meaningful student-centred learning

The first phase of technology integration is to ensure that teachers and students are comfortable with the uses of the tool and its features. It is evident that this familiarity is well under way, as the laptops are deployed and used regularly during the school day. Subsequent phases of the laptop initiative are intended to focus on deeper, more challenging pedagogical applications, which rely increasingly on student-centred learning and the cross-curricular competencies described in the Quebec Education Program.

Several decades ago, Benjamin Bloom argued for the importance of time-on-task as one key ingredient of learning success. It remains a key indicator of school success that teachers and students must be meaningfully engaged and, while technology can assist through its power and engagement, it is important that classroom management be in place so that the tool does not become a distraction.

This may require new strategies for classroom management. Or it may merely indicate that teachers are better able to detect off-task behaviour in a computer environment. Regardless, an exploration of how and whether off-task behaviour using computers leads to meaningful incidental learning (i.e., what are the students doing when they are off task?) should be considered.

School Success plans and goals are means by which these phases may be concretized. They can exist at multiple levels (e.g., board, school, teachers, and student) and they can be both short-term and long-term (e.g., more student collaboration, increased CAT-3 test results, etc.).

In addition, both future training and support for technology should focus on advanced uses, once familiarity has been accomplished. Ideas include: a) ensuring that teachers and students are aware of and using curricular and cross-curricular tools for learning with technology (e.g., electronic portfolios to support self-regulation); b) science and math applications which provide guided support for complex learning, and multimedia tools for developing literacy skills, etc.; c) examining the pedagogical uses of digital learning aspects found in learning object repositories and learning design tools to assemble them; d) developing collaborative learning skills among students which focus on positive interdependence and individual accountability; e) facilitating teacher and student expertise in information literacy including access and retrieval strategies for the internet and the abilities to make judgments about the quality, veracity, and completeness of information; f) encouraging communities of practice among teachers and administrators by creating distributed learning networks both within and between schools linking educators together and opening the doors of their classrooms to one another; and g) focusing assessment and evaluation of the technology integration on both the products of the initiative (i.e., what has been accomplished) as well as the process of the initiative (i.e., how it has been accomplished), documenting the success stories and best practices so they can be replicated.

Otherwise, the promise of technology must be tempered with the reality of the costs to purchase and maintain it, beyond even the time and costs associated with learning how to use it wisely. On the former point, there is evidence in this initial report of the difficulties associated with maintenance. Technical problems are a reality that can compromise the dream, one that future engineering developments may yet overcome. In the interim, technical support is a sine qua non of a computer-based learning environment. The hardware and software are still quirky and problematic, especially for diverse and neophyte users.

Finally, we have argued that it is of essential importance to conduct rigorous and longitudinal research on the impacts of ubiquitous technology use on teaching and learning—the products of technology integration. We need unambiguous answers to whether technology is effective, on which outcomes, with which students and teachers, at what school levels, and for which subject areas. But we also need to answer questions about what technology works and how—the process of technology integration. Future research should better document the nature of teaching and learning with technology, determine aspects of effective professional development and describe the cross-curricular, collaborative and student-centred learning activities which lead to the promotion of academic competencies, enhanced self-concept and better student self-regulation.


Acknowledgements

The authors would like to thank the following people for their time, patience, and continuous support: our data collectors and cleaners (Yin Liang, Bing Xiao Jiang, Lili Tang, Bounmy Thammavong, Julie Kwan, Seo Hung Lim, Gabriella Frankel, Sun Bo, Marina Adou, Mary Ann Chacko, Helen Stathopolous, and Larysa Lysenko); Aline Grenier; Aida Hadzeomerovec; David Galati (Canadian Testing Centre); the members of the research committee from the experimental and control school boards; and Yuri Daschko and Susan Mongrain (Industry Canada). Finally, we wish to extend our sincere appreciation to the teachers, principals, students, and parents at the experimental and control school boards for believing that this project was an important one, for providing us with class time, and for giving up their free time for data collection.


References

Coley, R. J., Cradler, J., & Engel, P. K. (2000). Computers and the classroom: The status of technology in U.S. schools. Princeton, NJ: Policy Information Center, Educational Testing Service.

Davies, A. (2004). Finding proof of learning in a one-to-one computing classroom. Report submitted to Maine Learning Technology Initiative. Connections Publishing.

Davis, D., Garas, N., Hopstock, P., Kellum, A., & Stephenson, T. (2005, February). Henrico County Public Schools iBook survey report. Arlington, VA: Development Associates.

Fuchs, T., & Woessmann, L. (2004, November). Computers and student learning: Bivariate and multivariate evidence on the availability and use of computers at home and at school (CESifo Working Paper No. 1321).

Goldberg, A., Russell, M., & Cook, A. (2003, February). The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002. The Journal of Technology, Learning and Assessment, 2(1).

Gravelle, P. B. (2003, April). Early evidence from the field – the Maine Learning Technology Initiative: Impact on the digital divide. Bangor, ME: Center for Education Policy, University of Southern Maine.

Harasim, L., Hiltz, S. R., Teles, L., & Turoff, M. (1995). Learning networks: A field guide to teaching and learning on-line. Cambridge, MA: MIT Press.

Healy, J. M. (1998). Failure to connect: How computers affect children’s minds—for better and worse. New York: Simon & Schuster.

Janosz, M. (2003). Questionnaire sur l’environnement socioéducatif. Montréal, QC: Université de Montréal.

Jeroski, S. (2003, July). Wireless writing project: Research report phase II. Vancouver, BC. Retrieved December 26, 2005 from http://www.prn.bc.ca/FSJ_WWP_Report03.pdf

Jeroski, S. (2004, October). Implementation of the Wireless Writing Program: Phase 3. 2003-2004. Vancouver, BC. Retrieved December 26, 2005 from http://www.prn.bc.ca/WWP_Report04.pdf

Kulik, J. A. (2003, May). Effects of using instructional technology in elementary and secondary schools: What controlled evaluation studies say. Final report. Arlington, VA: SRI International. Retrieved December 26, 2005 from http://www.sri.com/policy/csted/reports/sandt/it/Kulik_ITinK-12_Main_Report.pdf

Kulik, J. A., & Kulik, C.-L. C. (Eds.). (1989). Instructional systems [Special issue]. International Journal of Educational Research: Meta-Analysis in Education, 13(3), 277–289.

Lane, D. M. (2003). The Maine Learning Technology Initiative: Impact on students and learning. Paper presented at the Annual Meeting of the New England Educational Research Organization, Portsmouth, New Hampshire.

Lou, Y., Abrami, P. C., & d’Apollonia, S. (2001, Fall). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71(3), 449–521.

Lowerison, G., Sclater, J., Schmid, R., & Abrami, P. C. (in press). Student perceived effectiveness of computer technology use in postsecondary classrooms. Computers in Education.

Lowther, D. L., Ross, S. M., & Morrison, G. M. (2003). When each one has one: The influences on teaching strategies and student achievement of using laptops in the classroom. Educational Technology Research & Development, 51(3), 23–45.

Marsh, H. W. (1999). Self Description Questionnaire II. Macarthur: University of Western Sydney.

MEQ. (2002). Indice de milieu socio-économique par école, 2001-2002. Québec, QC: The Ministry.

Rockman et al. (2000). A more complex picture: Laptop use and impact in the context of changing home and school access. The third in a series of research studies on Microsoft’s Anytime Anywhere Learning Program. Retrieved December 26, 2005 from http://rockman.com/projects/laptop/laptop3exec.htm

Russell, T. L. (1999). The no significant difference phenomenon. Raleigh, NC: North Carolina State University Press.

Ryan, R. M., & Connell, J. P. (1989). Perceived locus of causality and internalization: Examining reasons for acting in two domains. Journal of Personality and Social Psychology, 57, 749–761.

Sargent, K. I. (2003). The Maine Learning Technology Initiative: What is the impact on teacher beliefs and instructional practices? Paper presented at the Annual Meeting of New England Educational Research Organization, Portsmouth, New Hampshire.

Scardamalia, M., & Bereiter, C. (1996). Computer support for knowledge-building communities. In T. Koschmann (Ed.), CSCL: Theory and practice of an emerging paradigm. Mahwah, NJ: Erlbaum.

Schacter, J. (1999). The impact of education technology on student achievement: What the most current research has to say. Milken Exchange on Education Technology.

Silvernail, D. L., Harris, W. J., Lane, D. M., Fairman, J., Gravelle, P., Smith, L., Sargent, K., & McIntire, W. (2003, March). Maine Learning Technology Initiative: Teacher, student, and school perspectives. Mid-year evaluation report. Bangor, ME: Maine Education Policy Research Institute.

Silvernail, D. L., & Lane, D. M. (2004, February). The impact of Maine’s one-to-one laptop program on middle school teachers and students: Phase one summary evidence (Research report no. 1). Bangor, ME: Maine Education Policy Research Institute, University of Southern Maine.

Sivin-Kachala, J., & Bialo, E. R. (2000). 2000 Research report on the effectiveness of technology in schools (7th ed.). Software & Information Industry Association.

Ungerleider, C., & Burns, T. (2002). Information and communication technologies in elementary and secondary education: A state of the art review. Prepared for the 2002 Pan-Canadian Education Research Agenda Symposium “Information Technology and Learning”, Montreal, QC.

Vallerand, R. J., Pelletier, L. G., Blais, M. R., Brière, N. M., Sénécal, C. B., & Vallières, E. F. (1992–1993). Academic Motivation Scale. Educational and Psychological Measurement, 52 & 53.

Waxman, H. C., Lin, M-F, & Michko, G. M. (2003, December). A meta-analysis of the effectiveness of teaching and learning with technology on student outcomes. Learning Point Associates.

Wozney, L., Venkatesh, V., & Abrami, P. C. (2006). Implementing computer technologies: Teachers’ perceptions and practices. Journal of Technology and Teacher Education, 14(1), 173–207.

End Notes

This research was supported by a grant from the Multimedia Learning Group, Industry Canada. The authors are solely responsible for the content of this article. A complete version of the year one report can be found at: http://doe.concordia.ca/cslp/Downloads/PDF/ETSB_final_report_0628.pdf

ISSN: 1499-6685

Copyright (c) 2006 Jennifer Sclater, Fiore Sicoly, Philip Abrami, C. Anne Wade

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.