HOW DO PERSONAL RESPONSE SYSTEMS PROMOTE ACTIVE LEARNING IN SCIENCE EDUCATION?

Sehnaz Baltaci-Goktalay
Uludag University, Turkey
sehnazb@hotmail.com

ABSTRACT

This study examines the effect of using a personal response system (PRS) on students’ academic performance and their attitudes towards science. Three research questions were addressed in the study: (1) Is there any difference in the academic achievement of 5th grade students when PRSs are used in science and technology courses? (2) Does PRS use affect 5th graders’ attitudes towards science? (3) What are 5th graders’ attitudes towards PRS use in the classroom? The results show that there is no difference in achievement between pre-test and post-test scores in either group. On the other hand, there is a positive difference in attitudes towards science in favor of the experimental group. In addition, boys were found to be more positive towards the PRS than girls based on the PRS attitude scale. The qualitative component involved a focus group discussion with six students and an interview with the class teacher. Students were also observed while they were using the PRS in the classroom. Participants provided positive feedback regarding the use of the PRS and requested increased use because they felt that the PRS supported and improved their classroom learning, made the course more fun, and increased course participation. They also enjoyed the peer discussions that instructors facilitated around PRS use. The teacher was also positive about using the PRS in his classroom.

Keywords: personal response systems, clickers, TEFA, active learning, science education


Correspondence to: Sehnaz Baltaci-Goktalay, Assistant Professor in the Department of Computer and Instructional Technologies Teaching, Faculty of Education, Uludag University, Bursa, Turkey. E-mail: sehnazb@hotmail.com


Journal of Learning and Teaching in Digital Age 2017. © 2017. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

INTRODUCTION

In the last decade, the Turkish Ministry of National Education (MONE) has increased their focus on technology integration in schools to establish more student-centered classrooms and to meet the demand for 21st century skills. As in other developed countries, Turkish secondary students have access to the Internet, tablet computers and interactive smart boards through the national FATIH project. Both teachers and students are encouraged to use interactive technologies to enrich the teaching/learning process. Teaching methods that promote interaction and discussion are known to benefit learning. Personal Response Systems (PRSs) represent some of the powerful interactive technologies available in the classroom that can be used to promote active learning.

PRSs (Chan & Knight, 2010), also known as Student Response Systems (SRSs) (Anthis, 2011), Classroom Response Systems (CRSs) (Graeff et al., 2011) or clickers (Hunsu, Adesope and Bayly, 2016), are electronic voting systems used in class for collecting student responses to a given question. PRSs allow teachers to display multiple-choice questions on a screen or interactive smart board to which students can instantly respond by selecting from a list of letters or numbers on their wireless keypads. PRSs are usually composed of three main parts:

1. the receiver attached to the teacher’s computer

2. the clickers

3. the accompanying software program loaded on the teacher’s computer.

Responses to the questions are displayed on the screen so that both teachers and students can see the percentage of the class that chose each response. Questions can be multiple-choice, true/false, matching, ranking, or short-answer items. PRSs can be used in a variety of ways, from checking conceptual understanding to starting discussion. Various researchers have sought to identify the benefits of PRSs and students’ enjoyment of them. PRS-enhanced classrooms have three main positive effects in education (Kay and LeSage, 2009): increased student involvement, improved learning, and faster assessment. From a student involvement perspective, PRSs encourage students to attend class more (attendance), focus more (attention) and participate more in class (anonymity). From a learning perspective, students interact more with peers to discuss ideas (interaction) and actively examine misconceptions to build knowledge (discussion); additionally, instruction can be modified (contingent teaching), learning performance increases as a result of PRS use (learning performance), learning improves, and misconceptions can be resolved (quality of learning). From an assessment perspective, students and teachers receive instant and regular feedback regarding understanding (feedback), assessment occurs during the class, thus improving the quality of teaching (formative), and students compare their responses to those of others (comparison).
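To make the aggregation step concrete, the short sketch below (in Python, not taken from any particular PRS product) illustrates how the receiver software might tally incoming clicker responses into the percentages projected for the class; the function name and option labels are illustrative assumptions, not part of the study.

```python
from collections import Counter

def response_percentages(responses, options=("A", "B", "C", "D")):
    """Illustrative only: tally clicker responses and return the share of the
    class (in percent) that chose each option, as a PRS display might show."""
    counts = Counter(responses)
    total = len(responses) or 1  # avoid division by zero before anyone answers
    return {opt: round(100 * counts.get(opt, 0) / total, 1) for opt in options}

# Example: ten students answer a multiple-choice question
print(response_percentages(["A", "C", "B", "C", "C", "D", "A", "C", "B", "C"]))
# -> {'A': 20.0, 'B': 20.0, 'C': 50.0, 'D': 10.0}
```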

Although PRS-enhanced classes have many benefits, they also pose challenges related to the technology (broken remotes), the teacher (less experienced teachers cannot adjust to student feedback, teachers cover less course content when PRSs are used, and creating PRS questions is time consuming) and the student (it is difficult to adapt to a new way of learning, discussion can lead to confusion, more effort is needed, and students feel bad when they receive negative feedback). Research on the effect of PRSs has grown substantially since the 2000s, but this growth has not been evenly distributed across educational settings in developing countries. Kay and LeSage (2009) noted the lack of research outside of higher education settings and recommended that more research be performed to determine the impact of PRSs. This study was conducted to examine whether PRSs promote active learning in science courses in secondary education settings in Turkey.

PRSs are becoming very popular in education, and research has targeted their affective benefits, such as greater student engagement, increased student interest and heightened discussion and interactivity (Trees and Jackson, 2007; Fies and Marshall, 2006; Penuel, Boscardin, Masyn and Crawford, 2007). However, Gauci et al. (2009) argue that better learning outcomes result from changes in pedagogical focus from passive to active learning and not from the use of a specific technology. There are now numerous studies showing that deep and lasting learning is achieved when students actively engage with the concepts they are learning and construct their own understanding (Cooperstein and Kocevar-Weidinger, 2004; Draper and Brown, 2004; Hinde and Hunt, 2006). Discussion, discourse, questioning and explaining are some of the activities that support active learning. In the current study, Technology-Enhanced Formative Assessment (TEFA) (Beatty and Gerace, 2009) was the pedagogical method used for teaching science with PRSs. TEFA has four principles:

· Question-driven instruction (Motivate and focus students with questioning)

· Dialogical discourse (Develop students’ understanding with dialogical discourse)

· Formative assessment (Inform and adjust teaching and learning decisions based on formative assessment)

· Meta-level communication (Help students develop higher-level skills and cooperate in the learning process)

The benefits of active learning in education are widely recognized. Guthrie and Carlin (2004) state that 21st century students are primarily active learners and lecture courses may not be appropriate for their engagement. PRSs represent one of the powerful interactive technologies available in the classroom that can be used to promote active learning. Keyser (2000) discusses the importance of identifying objectives first and then appropriately incorporating active learning into lectures. By identifying the active learning objectives first, PRS questions can be integrated into the lecture without drastic changes.

Fitch (2004) provides a comprehensive literature review of both learner-centered and active learning studies and concludes that “there is convincing evidence that interactivity is a critical part of any technology based learning.” In the current study, the theory behind the learner-centered, active learning environment is TEFA. Dufresne, Gerace, Leonard, Mestre, and Wenk (1996) developed a pedagogy for the use of PRSs in formative evaluation, and Beatty and Gerace (2009) elaborated that pedagogy into TEFA. In TEFA pedagogy, PRSs play an important supporting role. When the teacher presents questions, students discuss the questions with peers or think individually and report their answers using PRSs. After the histogram of responses is displayed, class-wide discussion occurs as the teacher summarizes and explains the correct answer. Teachers must master several skills to use TEFA. Feldman and Capobianco (2008) worked with secondary physics teachers using PRSs for formative assessment and found that teachers need to learn skills in four areas to implement formative assessment with PRSs: using PRS hardware and software, creating formative assessment items, leading productive class discussion and integrating TEFA into their larger curricula. Two teachers who were experienced in using PRSs in science classes were included in the current study.

Science class is a core class throughout compulsory education in Turkey; thus, engaging students in science classes can be a challenge. In traditional lecture classes, students passively take notes and struggle to keep their attention focused until the end of the class. Students often feel that they are the only ones who do not understand the material and are afraid of asking questions. In addition, students regularly comment that class lectures are boring and that they have little opportunity for interaction. Students say that it is not the teachers’ fault; rather, it is the material being presented. To make classes more learner-centered, interactive and fun, and to keep students engaged with the material being presented, PRSs were used in some of the classes. Most available studies have compared the use of PRSs only to traditional lecture methods; in the current study, the use of PRSs is compared with another active learning method, class-wide discussion.

RESEARCH METHODOLOGY

Participants and Procedure

The participants included 61 secondary school students (32 female and 29 male) in two sections of a science class. The students were between 12 and 13 years old. Drawing on the work of Dufresne, Gerace, Leonard, Mestre, and Wenk (1996), class-wide discussion was integrated into one of the sections, while PRSs were used in the other section (Table 1).

Table 1. The sequence of activities for peer-to-peer and class-wide discussion

PRS Section (Experimental Group)

1. Question posed

2. Students given time for individual thinking

3. Students provide individual responses

4. Poll of responses histogram is presented and students receive feedback

5. Peer-to-peer discussion is instructed

6. The same question is re-tested

7. Students provide individual responses

8. Poll of responses histogram is presented and students receive feedback

9. Teacher summarizes and explains the “correct” response

Class-wide Discussion Section (Control Group)

1. Question posed

2. Small groups discuss the question

3. Students provide individual responses

4. Students receive feedback

5. Teacher facilitates a class-wide discussion so that students can explain their answers and listen to others’ explanations

6. Teacher summarizes and explains the “correct” response

The course design was different for the two sections. One section used PRSs, and the other section used class discussion. While the students in the control group responded to questions verbally, the experimental group used PRSs. The teachers prepared multiple-choice questions according to TEFA principles. Questions were presented to both groups during the class for 2 hours a week over a period of 14 weeks.

The independent variable was PRS usage during classroom activities. In the PRS group, each student was provided a wireless clicker connected to the teacher’s computer. The students responded to the questions individually using the PRS. The teacher then facilitated class discussion based on the histogram of responses and gave additional explanations when needed. In the class discussion group, the students were presented the same questions at certain points during the lesson. After the teacher asked some of the students for their answers, the teacher facilitated a class-wide discussion and gave explanations, as in the PRS group.

Data Collection

The dependent variable was student achievement. Perceptions of students using PRSs were also collected to determine students’ satisfaction with PRSs. Two paper-based exams were administered to measure student achievement, one in the first and one in the last week of the class. Each exam comprised 25 multiple-choice questions. Two subject matter experts who had taught science reviewed the exams to ensure validity. The course objectives and exam questions were submitted to the subject matter experts to determine whether the questions appropriately matched the course objectives and were clearly understood. The questions were revised as needed.

The students’ demographics, perceptions, and acceptance of PRSs were collected through a 56-item survey. The survey comprised three sections. The first section included eight demographic questions, the second section included 24 perception questions, and the last section included a 24-item UTAUT scale (Venkatesh, Morris, Davis, and Davis, 2003), with each item scored on a five-point Likert scale ranging from 1 (“strongly disagree”) to 5 (“strongly agree”), with 3 being “neutral.” All participants were given the opportunity to indicate on their survey whether they would be interested in participating in a follow-up focus group discussion to share their feelings regarding PRS use in their class.

RESULTS OF RESEARCH

Students’ achievement was measured by a pre-test and a post-test. The pre-test and post-test were prepared to reflect the objectives of the classes during the experiment and were revised by two subject matter experts. The mean and standard deviation for each group are shown in Table 2.

Table 2. PRS (Experimental) and Class Discussion (Control) Group Pre-test and Post-test Results

Test         Group           N     M        SD
Pre-test     Experimental    34    87.16    8.54
Pre-test     Control         27    75.66    9.92
Post-test    Experimental    34    81.30    10.20
Post-test    Control         27    80.65    7.30

For the pre-test, the mean for the PRS group was 77.16 (SD=8.54), while it was 75.66 (SD=9.92) for the group using class discussion. For the post-test, the mean for the PRS group was 81.30 (SD=10.20), and it was 80.65 (SD=7.30) for the class discussion group. To compare the groups’ means, an independent-samples t-test was conducted; it revealed that the PRS group scored significantly higher than the class discussion group on the pre-test (t(59)=2.386, p<.05, r=.35). However, there was no significant effect of PRS use on the post-test (t(59)=-1.432, p>.05, r=.20). An analysis of variance between the post-test scores of the PRS and non-PRS groups also indicated no significant difference, F(1,59)=.623, p=.432.
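As an illustration of the kind of analysis reported above (not the study’s actual computation, since the raw scores are not published), an independent-samples t-test and a one-way ANOVA on two simulated groups could be run as follows; the group sizes, means and standard deviations are taken from Table 2, and the scores themselves are randomly generated for the example.

```python
import numpy as np
from scipy import stats

# Simulated scores for illustration only; they will not reproduce the
# published t and F values because the study's raw data are not reported.
rng = np.random.default_rng(0)
prs_scores = rng.normal(loc=81.30, scale=10.20, size=34)        # experimental group, n=34
discussion_scores = rng.normal(loc=80.65, scale=7.30, size=27)  # control group, n=27

# Independent-samples t-test comparing the two group means
t, p = stats.ttest_ind(prs_scores, discussion_scores)
df = len(prs_scores) + len(discussion_scores) - 2
print(f"t({df}) = {t:.3f}, p = {p:.3f}")

# With only two groups, a one-way ANOVA is equivalent (F equals t squared)
F, p_anova = stats.f_oneway(prs_scores, discussion_scores)
print(f"F(1, {df}) = {F:.3f}, p = {p_anova:.3f}")
```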

Students’ perceptions and acceptance of PRS were also gathered through a survey. The UTAUT scale included items regarding performance expectancy (4 items), effort expectancy (4 items), social influence (4 items), perceived playfulness (5 items), anxiety (4 items), and behavioral intention to use the system (3 items). The survey was modified and translated into Turkish with the permission of the authors. The Cronbach’s alpha coefficient of the modified scale was measured as 0.85, compared to 0.90 in the original scale. The data were analyzed descriptively, and the mean scores for some items are presented in the tables. Table 3 summarizes the mean scores of the students’ perceptions with respect to Performance Expectancy. As observed in the table, the students tended to believe that PRSs are a useful and productive tool, and they tended to be positive in terms of their perception that PRSs increase their chances of getting a better grade.
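For readers unfamiliar with the reliability statistic reported above, the following sketch shows one standard way to compute Cronbach’s alpha from a respondents-by-items matrix of Likert scores; the toy data are invented for illustration and are unrelated to the study’s survey responses.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) array of Likert scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Toy example: five respondents answering four items on a 1-5 scale
sample = [[4, 5, 4, 5],
          [3, 3, 4, 3],
          [5, 5, 5, 4],
          [2, 3, 2, 3],
          [4, 4, 5, 4]]
print(round(cronbach_alpha(sample), 2))
```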

Table 3. Descriptive Statistics for Performance Expectancy (n = 34)

Questionnaire Items                                                           Mean     Std. Deviation
PE1: I find the PRS useful in my studies.                                     3.9836   1.17604
PE2: Using a PRS enables me to accomplish tasks more quickly.                 4.0984   1.04411
PE3: Using a PRS increases my productivity.                                   4.1803   1.02483
PE4: If I use a PRS, I will increase my chances of getting a better grade.    4.2131   1.55265

Table 4 provides a descriptive analysis of the students’ perceptions regarding Effort Expectancy. It appears that the students tended to agree that the PRS is understandable, that it is easy to acquire skill in using the PRS, and that the PRS is easy to learn. Moreover, they tended to strongly agree that PRSs are easy to use.

Table 4. Descriptive Statistics for Effort Expectancy (n=34)

Questionnaire Items                                               Mean     Std. Deviation
EE1: My interaction with the PRS is clear and understandable.     4.4262   1.02403
EE2: It is easy for me to become skillful at using a PRS.         3.8361   1.24070
EE3: I find the PRS easy to use.                                  4.1967   1.29238
EE4: Learning to operate a PRS is easy for me.                    4.3279   1.24795

Table 5 suggests that the students are influenced by others who think they should use PRSs and that they tend to agree that the school administration and teachers support the use of PRSs. Moreover, adequate support from the administration is available to the students.

Table 5. Descriptive Statistics for Social Influence (n=34)

Questionnaire Items                                                               Mean     Std. Deviation
SI1: People who influence my behavior think that I should use PRSs.              4.2459   1.14972
SI2: People who are important to me think that I should use PRSs.                4.1475   1.15233
SI3: The administration of this school has been supportive in the use of PRSs.   4.2295   1.16037
SI4: In general, our teacher has supported the use of PRSs.                      4.1803   1.11816

The descriptive statistics in Table 6 also suggest that the students surveyed tended to believe that PRSs are a good idea and that they like using them.

Table 6. Descriptive Statistics for Perceived Playfulness (n=34)

Questionnaire Items                          Mean     Std. Deviation
PP1: Using a PRS is a good idea.             4.0164   1.32277
PP2: Using a PRS is a bad idea.              2.2131   1.49571
PP3: PRSs make classes more interesting.     4.1475   1.23607
PP4: Working with a PRS is fun.              4.3115   1.05737
PP5: I like working with PRSs.               4.0328   1.59790

Not surprisingly, given 21st century students’ exposure to technology, the students did not report a high level of anxiety when using PRSs, as indicated by the results in Table 7. Additionally, a high level of Behavioral Intention to use PRSs is indicated by the results in Table 8.

Table 7. Descriptive Statistics for Anxiety (n=34)

Questionnaire Items                                                                                            Mean     Std. Deviation
ANX1: I feel apprehensive about using PRSs.                                                                    2.6423   1.02589
ANX2: It scares me to think that I could lose a lot of information using PRSs by clicking the wrong button.   3.2345   1.23009
ANX3: I hesitate to use the system for fear of making mistakes I cannot correct.                               1.9462   1.59723
ANX4: PRSs are somewhat intimidating to me.                                                                    2.1915   1.60051

Table 8. Descriptive Statistics for Behavioral Intention (n=34)

Questionnaire Items                              Mean     Std. Deviation
BI1: I intend to use PRSs in the future.         4.3279   1.12133
BI2: I predict I will use PRSs in the future.    4.3770   1.06714
BI3: I plan to use PRSs in the future.           4.1803   1.24510

The qualitative component involved a focus group discussion with a random sample of surveyed students. Students using PRSs were also observed for 14 weeks in their science class. The students provided positive feedback regarding the use of PRSs and requested an increase in use because they felt the use of PRSs supported and improved their classroom learning. They also enjoyed the peer discussions that instructors facilitated with regard to the use of PRSs. All students in the focus group using PRSs reported that PRS use resulted in more active involvement in learning compared with traditional lecture-based classes. One student commented, “When it is just a lecture or a class discussion, you just sit there and listen to others without being involved. In the PRS class, you have to think about the questions and answer each of them and discuss with your peers. It is more fun, though. I like it better.” Students also reported that they were more engaged because of the histogram feedback. When they saw that others also gave wrong answers, it made them feel better and increased their confidence. One student commented, “Even if you are wrong, you discuss it with your peers and try to understand the right way of solving the problem. It is not easy to do this when in lecture classes.”

DISCUSSION

This study reported student perceptions of PRS use in a science class in a secondary education setting. Despite the lack of statistically significant results, the perception survey data show that students perceive value in the use of PRSs and would recommend their use in future classes. As in previous research (Karaman, 2011; Liu, Gettig, and Fjortoft, 2010; Boyle and Nicol, 2003), the students’ achievement in the PRS group did not improve more than that in the class discussion group, but several other benefits were reported.

Previous research (Hunsu, Adesope and Bayly, 2016; Caldwell, 2007; Beatty, Leonard, Gerace, and Dufresne, 2006) indicated that students perceive PRSs to be very fun and useful. In the current study, students reported that the PRS made the learning environment more enjoyable. The students’ feedback was very positive and supports previous findings (Fitch, 2004; Beekes, 2006) that PRSs can make classes more interactive and enjoyable for students. Incorporating PRS questions into the lecture helped maintain student focus and increased the students’ participation in the class. The students appeared to be more attentive in the PRS group than in the non-PRS group. Other researchers (Hunsu, Adesope and Bayly, 2016) also report that the use of PRSs has positive effects on attendance because of the sense of anonymity.

It is commonly argued that encouraging discussion is of greater benefit than passively using PRSs in classes (Mazur, 1997). The current study supported this argument. It is clear that peer discussions supported students’ positive attitudes toward the utilization of PRSs. Overall, the use of PRSs was found to increase classroom interaction (student-student and student-teacher) and engagement in class activities, to increase attendance, to provide real-time feedback to teachers about students’ misunderstandings and to promote a more positive, active and fun learning environment.

In summary, this study utilized both quantitative and qualitative approaches to assess the effectiveness of PRS use as an active learning technique in enhancing students’ learning. While further research into the effectiveness of PRSs in science classes is needed, the author’s initial assessment is positive, and the classroom experience added support to the previous research in secondary education.

REFERENCES

Anthis, K. (2011). Is it the clicker, or is it the question? Untangling the effects of student response system use. Teaching of Psychology, 38(3), 189-193.

Beatty, I. D., Leonard, W. J., Gerace, W. J., & Dufresne, R. J. (2006). Question driven instruction: Teaching science (well) with an audience response system. In D. A. Banks (Ed.), Audience response systems in higher education: Applications and cases (pp. 96-115). Hershey, PA: Information Science Publishing.

Beatty, I. D., & Gerace, W. J. (2009). Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology. Journal of Science Education and Technology, 18(2), 146-162.

Beekes, W. (2006). The ‘Millionaire’ method for encouraging participation. Active Learning in Higher Education, 7(1), 25-36.

Boyle, J.T., & Nicol, D.J. (2003). Using classroom communication systems to support interaction and discussion in large class settings. Association for Learning Technology Journal, 11(3), 43-57.

Caldwell, J.E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.

Chan, E., & Knight, L. (2010). Clicking with your audience. Communications in Information Literacy, 4(2), 192-201.

Cooperstein, S.E. & Kocevar-Weidinger, E. (2004). Beyond active learning: a constructivist approach to learning. Reference Services Review, 32(2), 141-148.

Draper, W., & Brown, I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81–94.

Dufresne, R. J., Gerace, W. J., Leonard, W. J., Mestre, J. P., & Wenk, L. (1996). Classtalk: A classroom communication system for active learning. Journal of Computing in Higher Education, 7, 3-47.

Feldman, A., & Capobianco, B. M. (2008). Teacher learning of technology enhanced formative assessment. Journal of Science Education and Technology, 17(1), 82-99.

Fies, C., & Marshall, J. (2006). Classroom Response Systems: A Review of the Literature. Journal of Science Education and Technology, 15(1), 101-109.

Fitch, J.L. (2004). Student feedback in the college classroom: a technology solution. Education Technology Research and Development, 52 (1), 71-81.

Gauci, S., Dantas, A., Williams, D., & Kemm, R. (2009). Promoting student-centered active learning in lectures with a personal response system. Advances in Physiology Education, 33, 60-71.

Graeff, E. C., Vail, M., Maldonado, A., Lund, M., Galante, S., & Tataronis, G. (2011). Click it: Assessment of classroom response systems in physician assistant education. Journal of Allied Health, 40(1), e1-e5.

Guthrie, R. W., & Carlin, A. (2004). Waking the dead: Using interactive technology to engage passive listeners in the classroom. In Proceedings of the Tenth Americas Conference on Information Systems, New York, NY, USA.

Hinde, K., & Hunt, A. (2006). Using the personal response system to enhance student learning: Some evidence from teaching economics. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 140–154). Hershey, PA: Information Science Publishing.

Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers and Education, 94, 102-119.

Karaman, S. (2011). Effects of audience response systems on student achievement and long-term retention. Social Behavior and Personality, 39(10), 1431-1440.

Kay, R., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers and Education, 53, 819-827.

Keyser, M. W. (2000). Active learning and cooperative learning: Understanding the difference and using both styles effectively. Research Strategies, 17, 35-44.

Liu, F. C., Gettig, J. P., & Fjortoft, N. (2010). Impact of a student response system on short- and long-term learning in a drug literature evaluation course. American Journal of Pharmaceutical Education, 74(1), 1-5.

Mazur, E. (1997). Peer Instruction: A User’s Manual. New Jersey: Prentice Hall.

Penuel, W.R., Boscardin, C.K., Masyn, K., & Crawford, V.M. (2007). Teaching with student response systems in elementary and secondary education settings: A survey study. Educational Technology Research and Development, 55, 315–346.

Trees, A.R. & Jackson, M.H. (2007). The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems. Learning, Media and Technology, 32(1), 21–40.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.
