
ENGAGING STUDENTS WITH QUESTIONS: ATTITUDES TOWARDS USING STUDENT RESPONSE SYSTEMS IN HIGHER EDUCATION

Meryem Ayşegül Kozak-Çakır
(ORCID ID: 0000-0001-6659-6888)
Uludağ University, Turkey
akozak@uludag.edu.tr

Received 11 April 2019, Revised 25 June 2019, Accepted 27 June 2019

ABSTRACT

The purpose of this review paper is to shed light on the impact of student response systems on students' academic engagement in large lectures at the higher education level. Forty highly relevant studies were obtained through a meticulous literature search, mainly using "student response system" and "student academic engagement" as keywords. Studies were reviewed based on publication year, context, main focus, participants, and results. The review shows that research on student response systems and student engagement has followed a decreasing trend in recent years, has mainly been conducted with a quantitative approach, and has predominantly involved participants from the first year of college. The results showed that student response systems in large lectures help students increase their academic engagement with the coursework and improve their academic achievement, motivation, and satisfaction with the course. Additionally, the use of student response systems increased students' metacognitive awareness of the course content and encouraged in-class discussions among students. Further research with a student academic engagement focus on the use of student response systems is needed in both national and international contexts.

Keywords: Student engagement, lecture, large classes, student response systems


Corresponding Author: Meryem Ayşegül Kozak-Çakır, Uludag University, İnegöl School of Business Administration, Management Information Systems Department, Room Number: 333, İnegöl, Bursa – TURKEY, Email: akozak@uludag.edu.tr


Journal of Learning and Teaching in Digital Age 2019. © 2019. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

INTRODUCTION

It is a well-known fact that there is a positive relationship between individuals' educational status and socioeconomic status (Psacharopoulos, 1994; Sirin, 2005). Individuals with higher education degrees tend to have higher socioeconomic status than those without; therefore, the demand for access to higher education and for higher education diplomas is on an increasing trend (Altbach, Reisberg, & Rumbley, 2009; Nicolescu & Pun, 2009). In response to this demand, governments have adopted the social demand approach to educational planning, which aims to meet citizens' educational demands and increase the capacity of the system (Psacharopoulos, 2014). The social demand approach may not be an efficient way of meeting educational needs, as it is not linked to the job market or industry's need for skilled people. People's choices for higher education may amass toward certain professions or departments depending on the present economic or social situation, which means crowded classrooms in higher education courses (Hornsby & Osman, 2014; Nelson & Pearson, 1999; Tilak, 2008).

Students in higher education expect to receive quality education in the departments they choose. However, many factors are involved in providing quality education in higher education. In order to support quality education in the higher education context, Chickering and Gamson (1987) provide a list of principles for college instructors. These principles aim to improve students' educational experience in higher education, and they can be generalized to any educational context. The authors recommend seven practices for quality undergraduate education:

  • Encourage contact between students and faculty: Students and instructors should be provided with opportunities to establish and maintain contact throughout the semester.
  • Develop reciprocity and cooperation among students: Students should learn to tolerate one another and seek ways to collaborate; instructors therefore need to find ways to integrate teamwork into their coursework.
  • Encourage active learning: Learning is an active meaning-making process; therefore, students need coursework tasks that allow them to read, think, search, write, and discuss the course content.
  • Give prompt feedback: Students need to know how they are doing in their coursework in a timely manner, so that they can either change study practices that do not work or continue with the practices that work best for them.
  • Emphasize time on task: Students need to work on coursework during the related classes. Class time should be spent only on the course's assigned work, such as discussing the readings or working on the assignments.
  • Communicate high expectations: College instruction should set achievement expectations high; instructors should therefore inform students at the beginning of the semester about their achievement expectations for the class. These expectations should be challenging enough to motivate students and engage them with the class.
  • Respect diverse talents and ways of learning: All of us have different talents and strengths when it comes to learning. Some of us are good at expressing thoughts in writing, and some of us are good at expressing thoughts in speech. College instructors need to design their courses to address different ways of learning and assessment methods.

When one closely examines these practices, it can be seen that they all require a good deal of time and effort from the instructor to keep students challenged, engaged, and interactive during class. However, in large classrooms it is very difficult for instructors to comply with the recommendations of Chickering and Gamson (1987) without help from information and communication technologies.

Due to the above-mentioned high demand for higher education, class sizes grow continuously, and instructors tend to choose lecturing as the primary mode of instruction in their classrooms (Nelson & Pearson, 1999). While there is nothing wrong with lecturing during a class to transfer instructional content to students, using it as the only means of instruction hinders students' active participation in the class. Since students are passive learners in the lecture mode of instruction, it notoriously yields low information retention rates, decreases motivation, and encourages students to obtain achievement scores just sufficient to pass the course (Cooper & Robinson, 2000). Even though classes are getting more crowded, the instructors' role is not just to transfer content without paying attention to students' active participation in classes (Gibbs & Jenkins, 2014; Saville, Zinn, Neef, Norman, & Ferreri, 2006).

Large lecture classes bring important instructional challenges to instructors, and these challenges stem from the nature and size of the courses. In higher education institutions, large classes are usually held in auditoriums and assemble students from different departments, even from different colleges, which creates a very impersonal environment for students (Hornsby, 2013). This type of learning environment prevents instructors from interacting with students and prevents students from interacting with fellow class members (Wulff, Nyquist, & Abbott, 1987). Additionally, students' sense of responsibility toward coursework may decrease, and their anxiety about contributing to class verbally may increase (Geske, 1992). Due to the high number of students in the classroom, the instructor's interaction with students and his or her ability to give them instant feedback are severely reduced. Providing feedback to students about their learning process, as well as the instructor's receiving feedback about how students are doing in the course, is vital for instructors to make necessary changes in the way the course is conducted and exam questions are created (Nelson & Pearson, 1999). Moreover, instructors who offer lectures to large classes usually transmit course content at Bloom's knowledge level and cannot go further to the application, analysis, synthesis, or evaluation levels (Cooper & Robinson, 2000); yet it is important for college students to think critically about the knowledge in their course content in order to develop personally and professionally (Astin, 1997; Pascarella & Terenzini, 2005).

One of the ways to engage students in oversized classrooms with interactive lectures, where they can express their opinions, answer questions, and even receive feedback about their performance, is to use student response systems. A student response system is an information and communication technology for instruction that helps instructors pose questions and lets students choose among answer options or express opinions using handheld devices. Student response systems were first utilized at Stanford University in 1966 and spread across higher education and private institutions as a way to keep audiences engaged (Abrahamson, 2006). They became commercially available in the 1990s; however, the systems were expensive to set up, as they required their own infrastructure. With the widespread adoption of the Internet and Web 2.0 technologies, they have become highly popular since 2004. The system goes by different names, such as student response system, audience response system, classroom response system, clickers, and voting machines. Kay (2009) and Good (2013) compiled from the literature a list of 26 different names for the system. For the sake of clear communication, the term student response system (SRS) will be used throughout this paper.

A student response system (SRS) usually consists of four main parts: (1) a host system with a software package used to gather and interpret student responses, (2) student transmitter/input devices (these could be keypads, smartphones, PDAs, tablet PCs, or laptops), (3) a receiver (an electronic device that collects signals from student devices), and (4) a network infrastructure to connect and support the system. Typically, the instructor uses the system as follows: (1) the instructor poses a question with options via a data projector, (2) students respond to the question using their handheld devices, (3) a receiver collects signals from the handheld devices and sends them to the host computer, and (4) the host computer's software interprets the data and displays the results to the students in the classroom. A typical use of the system is depicted in Figure 1.


Figure 1. A typical use of student response systems (modified from Frame & Hayler (2006))
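As a concrete illustration of steps (1)-(4), the short Python sketch below simulates the host-side software for a single question: it tallies hypothetical (device, choice) signals relayed by the receiver and prints the distribution that would be projected back to the class. The question, device IDs, and votes are all invented for illustration; commercial SRS packages expose their own interfaces.

from collections import Counter

# Step 1: the instructor poses a question with answer options.
question = "Which gas is most abundant in Earth's atmosphere?"
options = ["A) Oxygen", "B) Nitrogen", "C) Carbon dioxide", "D) Argon"]

# Steps 2-3: hypothetical (device_id, choice) signals relayed by the receiver.
responses = [("dev01", "B"), ("dev02", "B"), ("dev03", "A"),
             ("dev04", "B"), ("dev05", "C"), ("dev06", "B")]

# Step 4: the host software tallies the signals and displays the results.
tally = Counter(choice for _, choice in responses)
print(question)
for opt in options:
    count = tally.get(opt[0], 0)  # opt[0] is the option letter
    print(f"{opt:<20} {count:>2} votes ({100 * count / len(responses):.0f}%)")

# One device is assigned per student, so the same data doubles as attendance.
print(f"Devices responding: {len(responses)}")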

Student response systems have been used for different purposes in classrooms (Caldwell, 2007). Along with the primary purpose of engaging students through interactive lectures, the system provides instructors with real-time feedback on their teaching performance, based on which they can adjust the difficulty level of the course topic to help students better understand the topic in focus. It allows instructors to gain better insight into which parts of the course topic need to be re-emphasized or repeated for students to understand the topic better. It allows students to give their opinions or answers about class topics without feeling observed or judged. By showing contradictory facts about class topics, it helps instructors initiate classroom discussions about the course topic. The system can be used to collect student characteristics such as gender, major, and year in school. Finally, it can also be used to take class attendance, since one device is assigned to each student. Instructors, in turn, can choose to use the system with students' names known or allow anonymous answers from the class. By collecting, interpreting, and showing students' data for the questions, students also have the chance to get timely feedback from their instructors (Kay & LeSage, 2009). In their review, Aljaloud, Gromik, Billingsley, and Kwan (2015) provided a summary of the benefits of using SRS in large classrooms. Three major benefits emerged from their analysis of the literature: interactivity, academic performance, and student engagement.

The use of SRS increases classroom interactivity by assisting instructors in creating a learner-centered learning environment. It also provides opportunities to increase communication between students and instructors by supplying a topic for discussion (Blasco-Arcas, Buil, Hernández-Ortega, & Sese, 2013). Furthermore, it provides immediate feedback on the questions displayed, or on where a student stands in relation to general class opinion on a controversial question. All of these require students to spend some effort interacting with the content, other students, and instructors.

Another benefit of using SRS in classes is improved student academic performance. SRS platforms allow instructors to ask content-related cognitive questions and solicit opinions about controversial topics, which in turn guide students toward thinking about and trying to find answers to these questions (Caldwell, 2007). By doing so, students' academic performance in courses improves.

Lastly, the use of SRS in the classroom increases student engagement. By supporting a fun learning environment, it helps instructors improve students' attendance, and students develop positive attitudes toward the course topic or content (Dunn, Richardson, Oprescu, & McDonald, 2013). Additionally, by providing immediate feedback on questions, or on where a student stands in comparison to other students in the classroom, it increases students' desire to improve their class performance by showing them areas for improvement. Student academic engagement has been positively correlated with student learning outcomes (Carini, Kuh, & Klein, 2006). However, it is a challenge for instructors to engage students in academically meaningful activities in large classes; using student response systems is one way to stimulate student engagement in these classes (Cain, Black, & Rohr, 2009).

Although student response systems are widely used by college instructors, very few studies in the literature have examined the implementation issues and students' attitudes toward the systems in higher education settings. The purpose of this study is to provide a comprehensive review of students' attitudes toward the use of student response systems in higher education settings. In order to fulfill this purpose, the following questions guided the study:

  • What is the distribution of studies over the years?
  • In what countries have the research studies been conducted?
  • In what contexts were the studies conducted?
  • What methods were used in studies of student response systems?
  • Who were the participants of the student response system studies?
  • What are the results of the student response system studies?

The literature contains few review studies to date; however, none of them focused on student engagement. Kay and LeSage (2009) provided a critical examination of studies up to 2009 and concluded that very few studies published in peer-reviewed journals put educational outcomes in focus, and their findings were questionable. In order to fill this gap in the literature, studies focusing on the use of student response systems and student engagement were reviewed. This study was prepared in the hope of helping researchers identify the key benefits of student response systems for student engagement.

METHODS

Review studies aim to shed light on a topic at hand by looking at trends and issues in published research (Boote & Beile, 2005). The present study aims to enhance our understanding of student attitudes toward using student response systems in higher education settings. Document review and content analysis were utilized as the primary methodological approaches for this study.

Criteria for Inclusion

The studies reviewed were selected based on certain criteria: publication year, being a journal article, being conducted in a higher education context, and studying student engagement and student attitudes as a dependent variable or as the focus of the study. Articles published between 2000 and 2019 were included. Although student response systems began to be used as early as 1966, systematic research on their effectiveness for student engagement in higher education can be traced back to the early 2000s; therefore, the author selected the year 2000 as the starting year for including journal articles.

The context and the focus of the study were two important criteria for inclusion in this review. Only research studies that were conducted in a higher education context and studied at least student engagement and attitudes toward the use of student response systems in lectures were included. Most articles studied the effectiveness of student response systems with multiple dependent variables or multiple research methodologies; as long as one of their studied variables was student engagement or student attitudes toward the use of a student response system, they were included in this review.

Data Sources

The use of student response systems in large classrooms at higher education institutions has accelerated over the past two decades because they offer solutions for improving student academic engagement during lectures. Implementing student response systems in these classrooms has been attracting attention from researchers and instructional designers alike, and many studies have been conducted on their impact on student outcomes and on issues arising during implementation. The scope of the current study is to understand the impact of student attitudes and implementation issues on student academic engagement in higher education. To fulfill this aim, studies published between 2000 and 2019 were searched using the keywords "student response system", "audience response system", "student attitudes", "student academic engagement", "implementation", "higher education", and "college" in the Web of Science, EBSCO, Academic Search Premier, and ERIC databases. Only articles published in English were considered.

Using the year span and the keywords mentioned above, 145 studies were obtained. A further examination of the titles and abstracts to select those focusing on student attitudes and student academic engagement reduced the number of studies to 64. After this stage, the studies were examined based on their aims and variables to select those closely related to student engagement and/or attitudes toward the use of student response systems, which reduced the number of studies to 40. Studies were selected on the basis of studying at least student engagement as one of the dependent variables, since some of them studied learning performance, motivation, and perceptions along with student engagement. Table 1 presents the reviewed studies.

Table 1. List of reviewed studies

1. Barr, M. L. (2014). Encouraging college student active engagement in learning: The influence of response methods.
2. Bartsch, R. A., & Murphy, W. (2011). Examining the effects of an electronic classroom response system on student engagement and performance.
3. Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., & Sese, F. J. (2013). Using clickers in class: The role of interactivity, active collaborative learning and engagement in learning performance.
4. Bojinova, E., & Oigara, J. (2013). Teaching and learning with clickers in higher education.
5. Campbell, C., & Monk, S. (2012). How do we get students talking in first year courses? Engaging students using learner response systems.
6. Cantero-Chinchilla, F., Díaz-Martín, C., García-Marín, A., & Estévez, J. (2019). Innovative student response system methodologies for civil engineering practical lectures.
7. Chen, T.-L., & Lan, Y.-L. (2013). Using a personal response system as an in-class assessment tool in the teaching of basic college chemistry.
8. Chui, L., Martin, K., & Pike, B. (2013). A quasi-experimental assessment of interactive student response systems on student confidence, effort, and course performance.
9. Fifer, P. (2012). Student perception of clicker usage in nursing education.
10. FitzPatrick, K. A., Finn, K. E., & Campisi, J. (2011). Effect of personal response systems on student perception and academic performance in courses in a health sciences curriculum.
11. Fortner-Wood, C., Armistead, L., Marchand, A., & Morris, F. B. (2013). The effects of student response systems on student learning and attitudes in undergraduate psychology courses.
12. Gachago, D., Morris, A., & Simon, E. (2011). Engagement levels in a graphic design clicker class: Students' perceptions around attention, participation and peer learning.
13. Galal, S. M., Mayberry, J. K., Chan, E., Hargis, J., & Halilovic, J. (2015). Technology vs. pedagogy: Instructional effectiveness and student perceptions of a student response system.
14. Guarascio, A. J., Nemecek, B. D., & Zimmerman, D. E. (2017). Evaluation of students' perceptions of the Socrative application versus a traditional student response system and its impact on classroom engagement.
15. Hall, R. H., Collier, H. L., Thomas, M. L., & Hilgers, M. G. (2005). A student response system for increasing engagement, motivation, and learning in high enrollment lectures.
16. Han, J. H., & Finkelstein, A. (2013). Understanding the effects of professors' pedagogical development with Clicker Assessment and Feedback technologies and the impact on students' engagement and learning in higher education.
17. Heaslip, G., Donovan, P., & Cullen, J. G. (2014). Student response systems and learner engagement in large classes.
18. Hedgcock, W. H., & Rouwenhorst, R. M. (2014). Clicking their way to success: Using student response systems as a tool for feedback.
19. Jain, A., & Farley, A. (2012). Mobile phone-based audience response system and student engagement in large-group teaching.
20. Johnson, K., & Lillis, C. (2010). Clickers in the laboratory: Student thoughts and views.
21. Johnson, T. R. (2016). Leveraging an audience response system for student learning and engagement: Competitive team activities in the classroom with undergraduate medical students.
22. Jones, S. J., Crandall, J., Vogler, J. S., & Robinson, D. H. (2013). Classroom response systems facilitate student accountability, readiness, and learning.
23. Katz, L., Hallam, M. C., Duvall, M. M., & Polsky, Z. (2017). Considerations for using personal Wi-Fi enabled devices as "clickers" in a large university class.
24. Kulatunga, U., & Rameezdeen, R. (2014). Use of clickers to improve student engagement in learning: Observations from the built environment discipline.
25. Kung, J. W., Slanetz, P. J., Chen, P.-H., Lee, K. S., Donohoe, K., & Eisenberg, R. L. (2012). Resident and attending physician attitudes regarding an audience response system.
26. McClean, S., & Crowe, W. (2017). Making room for interactivity: Using the cloud-based audience response system Nearpod to enhance engagement in lectures.
27. Meedzan, N., & Fisher, K. (2009). Clickers in nursing education: An active learning tool in the classroom.
28. Micheletto, M. J. (2011). Using audience response systems to encourage student engagement and reflection on ethical orientation and behavior.
29. Mula, J. M., & Kavanagh, M. (2009). Click go the students, click-click-click: The efficacy of a student response system for engaging students to improve feedback and performance.
30. Noel, D., Stover, S., & McNutt, M. (2015). Student perceptions of engagement using mobile-based polling as an audience response system: Implications for leadership studies.
31. Patry, M. (2009). Clickers in large classes: From student perceptions towards an understanding of best practices.
32. Patterson, B., Kilpatrick, J., & Woebkenberg, E. (2010). Evidence for teaching practice: The impact of clickers in a large classroom environment.
33. Salemi, M. K. (2009). Clickenomics: Using a classroom response system to increase student engagement in a large-enrollment principles of economics course.
34. Siau, K., Sheng, H., & Nah, F.-H. (2006). Use of a classroom response system to enhance classroom interactivity.
35. Sun, J. C.-Y. (2014). Influence of polling technologies on student engagement: An analysis of student motivation, academic performance, and brainwave data.
36. Sun, J. C.-Y., Martinez, B., & Seli, H. (2014). Just-in-time or plenty-of-time teaching? Different electronic feedback devices and their effect on student engagement.
37. Terrion, J. L., & Aceti, V. (2012). Perceptions of the effects of clicker technology on student learning and engagement: A study of freshmen chemistry students.
38. Thomas, C. N., Pinter, E. B., Carlisle, A., & Goran, L. (2015). Student response systems: Learning and engagement in preservice teacher education.
39. Trees, A. R., & Jackson, M. H. (2007). The learning environment in clicker classrooms: Student processes of learning and involvement in large university level courses using student response systems.
40. Yourstone, S. A., Kraye, H. S., & Albaum, G. (2008). Classroom questioning with immediate electronic response: Do clickers improve learning?

Data Analysis
The selected studies were examined based on the following elements: publication year, context of using SRS, student attitudes and/or academic engagement as the focus of the study, main methodology, number of participants, and results pertaining to student attitudes and academic engagement. The reasoning for selecting these criteria is explained below.

  • Publication year: To understand the trends in student response system research, the number of articles published per year was counted.
  • Context of using SRS: The context was investigated to understand the variety of departments and courses that use SRS technology to add interactivity to their lectures.
  • Focus of the study: To understand what dependent variables were studied in the articles.
  • Main methodology: To understand how the studies were conducted.
  • Number of participants: SRS technology was mainly developed for large classes, so the number of participants was investigated to see whether the system is used for what it was developed for in the first place.
  • Results of the study: To understand what the results indicated regarding student engagement and attitudes toward using student response systems. The researcher and an academic expert categorized the results based on the information provided in the published articles.

The selected articles were categorized and examined based on the criteria above. It should be noted that in this review, the selected articles were examined based on the information they provide about student attitudes and student academic engagement as affected by student response system implementation. The results of the studies were categorized by coding each article's results section. In order to establish the reliability of the results, an academic expert reviewed the categories and the results of the reviewed studies.
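The present review establishes reliability through expert review rather than an agreement statistic. Purely as an illustration of how such coder agreement could be quantified, the Python sketch below computes Cohen's kappa on hypothetical category codes; both the codes and the choice of kappa are illustrative assumptions, not taken from this study.

from sklearn.metrics import cohen_kappa_score

# Hypothetical category codes two coders might assign to ten results sections.
researcher = ["engagement", "performance", "engagement", "attention",
              "feedback", "engagement", "motivation", "performance",
              "engagement", "attendance"]
expert = ["engagement", "performance", "engagement", "feedback",
          "feedback", "engagement", "motivation", "performance",
          "attention", "attendance"]

# Cohen's kappa corrects raw agreement for agreement expected by chance.
kappa = cohen_kappa_score(researcher, expert)
print(f"Cohen's kappa = {kappa:.2f}")  # ~0.75 here; 0.61-0.80 is often read as substantial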

RESULTS AND DISCUSSION

In this section, the results of the reviewed studies are presented based on the research questions.

Number of Studies per Year

The first research question aimed to understand how the publication trend has changed over the years. Figure 2 presents the number of studies published per year.

Figure 2. Number of studies published over the years

Looking at Figure 2, one can see that no studies focusing on student response systems and student engagement as a learning outcome were published between 2000 and 2005. One reason is that Chickering and Gamson's (1987) seven good practices for undergraduate education came to be commonly labeled as student engagement in the higher education literature only after 1998, with the works of Kuh and his colleagues (National Survey of Student Engagement, 1998) and of Fredricks, Blumenfeld, and Paris (2004). As the concept of student engagement spread to other fields of education, studies focusing on student engagement showed an increasing trend between 2008 and 2013. After 2014, the number of published studies on student response systems and student engagement has shown a decreasing trend.

Country of Origin for Reviewed Studies

The reviewed research studies originated from different countries. Looking at the distribution of countries of origin shows which higher education systems widely utilize student response systems. Table 2 provides the distribution of the number of studies by country of origin.

Table 2. Distribution of number of studies by country of origin

Countries         N     %
United States    24    60.0
Canada            4    10.0
Australia         3     7.5
Ireland           2     5.0
United Kingdom    2     5.0
Taiwan            2     5.0
Spain             2     5.0
South Africa      1     2.5
Total            40   100.0


Table 2 lists the studies that focused on student engagement while utilizing student response systems. The majority of the studies originated from the United States (60%), followed by Canada and Australia. Two studies each originated from Ireland, the United Kingdom, Taiwan, and Spain. Furthermore, the earliest study conducted in a country other than the US, Canada, or Australia was published in 2010 in Ireland. Since student response systems were historically invented and first implemented in the US higher education system, the number of publications originating from the United States is the highest. However, publications from other countries have started to pick up from 2010 onward.

Context of the Study

The context of the study refers to where the study was conducted. Student response system and student engagement studies in higher education are conducted in basically two main contexts: departmental courses or university-wide courses. Table 3 shows the distribution of research studies based on field of study.

Table 3. Distribution of research contexts based on field of study

Field of Study       N     %
Health Sciences     12    30.0
University Wide      7    17.5
Business School      7    17.5
Education            7    17.5
Economics            3     7.5
Civil Engineering    2     5.0
Graphic Design       1     2.5
Psychology           1     2.5
Total               40   100.0


It is important to understand in which departments and courses the systems were used. According to the results, 30 percent of the studies were conducted in health sciences departments; these included nursing education, pharmacy, and medical education. One reason for health sciences departments to invest in student response systems is to increase student engagement and learning in various innovative ways. For example, problem-based learning, one of the most highly studied instructional methods, was invented at McMaster University's medical school (Hung, Jonassen, & Liu, 2008). It seems that as classes in health sciences departments become more crowded, educators in health sciences seek innovative ways of engaging students with course materials and disseminate the results through scholarly journals.


Other fields that used student response systems frequently were business schools, education, and university-wide courses; each of these fields contributed 7 published journal articles to this review. Student response systems were used in large freshman classes in business schools as a way of engaging students with the course materials and increasing discussion among students. A similar purpose was seen in the education context as well, with pre-service teachers. University-wide courses included Introductory Chemistry, Astronomy, and Sociology, with students from all majors and years. The studies conducted in university-wide courses had 98 students per class on average, so these can be counted as large classes.

Table 3 also shows the range of fields of study that use student response systems as a way to engage students with the coursework. As an instructional technology, student response systems have been implemented in the majority of departments at the higher education level, which can be considered an indicator of the dissemination of instructional technologies.

Methodological Approach

Trends in the methodological approach of the studies indicate the types of research being conducted on student response systems. While Table 4 shows the distribution of main methodological approaches (quantitative, qualitative, and quantitative and qualitative together), Table 5 shows the primary methods of the studies.

Table 4. Distribution of main methodological approach

Methodological Approach         N     %
Quantitative                   31    77.5
Quantitative and Qualitative    8    20.0
Qualitative                     1     2.5
Total                          40   100.0

As a main methodological approach, quantitative research in education aims to quantify human behavior and connect it to certain educational outcomes, answering the "what" question. The majority of the reviewed studies (around 78 percent) preferred the quantitative approach to guide their choice of methods. Since the main aim, sometimes the only aim, of the reviewed studies was to understand the impact of student response systems on student engagement in large classrooms, these studies collected data using cognitive tests and engagement scales to measure learning performance and students' academic engagement with the classwork.


The qualitative research approach aims to help researchers understand relations and interconnectedness among the studied phenomena or actors, answering the "how" question. Only one study was designed solely with a qualitative approach (Campbell & Monk, 2012). Its purpose was to understand how the use of student response systems relates to the quality of in-class discussions and student academic engagement.

The rest of the studies (20%) implemented the quantitative and qualitative approaches together in the same study. These studies took this approach mainly to pursue two goals: first, to understand the impact of student response systems on students' learning outcomes, including at least student engagement; second, to understand students' perceptions of implementation issues and to receive formative feedback about the use of student response systems in their classes through focus group or individual interviews. Such mixed-design studies aim to understand both the impact of the student response systems and the issues related to their implementation in classrooms (Creswell & Creswell, 2017).

Table 5 presents the distribution of the primary research methods used in the reviewed studies. Although 8 studies used the quantitative and qualitative approaches together, they used the quantitative approach as their primary method; Table 5 was therefore prepared based on the primary method of each reviewed study.

Table 5. Distribution of primary methods of reviewed studies

Primary Method    N     %
Experimental     21    52.5
Survey study     16    40.0
Correlational     2     5.0
Case Study        1     2.5
Total            40   100.0

As Table 5 indicates, the majority of the studies (52.5 percent) utilized experimental designs. In these experimental studies, researchers compared situations in which student response systems were and were not used during instruction, or compared different types of student response systems along with non-use. However, as in many educational research studies, there were various versions of quasi-experimental design. The first type includes two groups of students, experimental and control, where the classes were selected based on convenience; the researchers had no means of forming the groups randomly. The second type employed a single-group quasi-experimental design, in which the group first served as the control group and, in the following semester, the instructor used a student response system, making the same group the experimental group. The third type compared the impacts of different types of student response systems. Nonetheless, all studies that employed experimental methodologies sought significant differences in students' educational outcomes within and between groups.

The second major group was survey studies. Although survey studies are not as robust as experimental studies, they provide a picture of the situation at hand by describing the measured variables. The reviewed studies utilized survey methodology either to describe the use of student response systems or to analyze differences between groups established on certain demographics. The analysis methods used in the survey studies were descriptive statistics, such as reporting the mean, standard deviation, or frequency of responses to items in the data collection tools, and analyses of group differences (t-test or ANOVA).
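As an illustration of the kind of analysis these survey studies report, the Python sketch below computes descriptive statistics (M, SD) and an independent-samples t-test on hypothetical 5-point engagement scores for a class that used an SRS and one that did not. The data and group labels are invented; no data from the reviewed studies are reproduced here.

from statistics import mean, stdev
from scipy import stats

# Hypothetical 5-point engagement-scale scores for two groups.
srs_class = [4.2, 3.8, 4.5, 4.0, 4.4, 3.9, 4.6, 4.1]
lecture_only = [3.5, 3.9, 3.2, 3.8, 3.4, 3.6, 3.1, 3.7]

# Descriptive statistics, as the survey studies typically report.
print(f"SRS class:    M = {mean(srs_class):.2f}, SD = {stdev(srs_class):.2f}")
print(f"Lecture only: M = {mean(lecture_only):.2f}, SD = {stdev(lecture_only):.2f}")

# Independent-samples t-test for a difference in group means.
t, p = stats.ttest_ind(srs_class, lecture_only)
print(f"t = {t:.2f}, p = {p:.3f}")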

Correlational and case studies made up a very small portion of the reviewed studies. The major aim of the correlational studies was to understand the correlation between certain educational outcomes when student response systems were used in large classrooms. There was only one case study, which aimed to explain students' perceptions of the impact and use of student response systems in their classroom.

Participants

Participants of all studies were from higher education institutions; however, due to the nature of the courses in which the research was conducted and reported, it is impossible to give exact proportions of freshmen, sophomores, juniors, seniors, or graduate students. Based on a review of the methods sections of these articles, the majority of studies were conducted in freshman classes. The number of participants ranged from 26 to 5,459. The average number of participants was 373 per study, with a median of 114, and 78% of the reviewed studies (31 studies) had more than 50 participants. Based on these findings, it is fair to say that the majority of studies were conducted in large higher education classes, which makes the use of student response systems appropriate.

Results of the Studies

The research studies reviewed here had at least one question or dependent variable about student academic engagement or attitudes toward using a student response system; therefore, the findings from the reviewed studies were grouped accordingly. Grouping the findings yielded 12 distinct categories showing the impact on student engagement and attitudes toward using student response systems. Table 6 shows these categories.

The most frequently reported findings in the reviewed studies were "increased student engagement" and "increased learning performance". These findings mainly come from the results of experimental studies. The studies that compared groups using and not using a student response system analyzed the differences between groups and concluded that the use of student response systems increases student engagement or student learning performance as measured by cognitive tests. However, five studies with similar aims and methodology reported that the use of student response systems did not make any difference in student learning and performance.

Table 6. Summary of findings from reviewed studies

Category                                                       Frequency
Increased engagement                                                  17
Increased learning performance                                        12
Satisfaction and recommendation of the course                          8
Increased and maintained attention during lecture                      7
Increased participation due to decreased anxiety                       7
Increased metacognitive awareness of the course content                6
Receiving instant feedback                                             6
Increased motivation                                                   4
Increased interactivity                                                4
Encouraged in-class discussions among students                         4
Encouraged attendance                                                  4
No difference in learning performance or student engagement            5
The third most frequently reported finding is related to student satisfaction. Using student satisfaction surveys, the reviewed studies found that using a student response system in large lecture classes increases student satisfaction and students' likelihood of recommending the course to others. The fourth and fifth categories were increased and maintained attention during lectures and increased participation due to decreased anxiety. One of the utilization aims of student response systems is to pose unexpected questions or administer quick quizzes during lectures; because of this, studies reported that students' attention to lectures increased and was maintained throughout. Another use of student response systems is to collect answers anonymously. The studies reported that this encouraged students to give their answers in anonymity and helped decrease their anxiety about speaking in public in large lectures. A further benefit reported by the studies was increased metacognitive awareness: instant quizzes or questions during lessons gave students an idea of what is important in the course content and helped them spend more time on those parts. In relation to metacognitive awareness, the system also helped students receive instant feedback about their responses and their performance level in the class; through this feedback they could regulate their learning process.

The last four benefits of the system had an equal number of mentions in the reviewed studies; each category was reported four times. The utilization of the system increased students' motivation and classroom interactivity, encouraged in-class discussions among students, and encouraged attendance. Due to the use of questions and quizzes, studies reported that students were more motivated to participate in the lectures. The system also increased interactivity among students and instructors, since the questions provided a medium for discussing the answers. In some studies, instructors posed questions that required students to discuss in small teams before giving their answers; this type of utilization encouraged and improved the quality of discussions among students. Finally, the use of the systems improved student attendance at lectures by making the lectures interesting and by awarding points for each correct answer given through the student response system.


Although the majority of studies reported positive outcomes from the utilization of student response systems, five studies found no difference in student academic engagement or student learning performance between groups using and not using student response systems. A further investigation of these studies' results revealed that class level could play a role: there was an interesting relationship between students' class level and the impact of the system. Since the large majority of courses in the reviewed studies were at the freshman level, it is fair to say that the impact of student response system implementations is positive for freshman-level students. However, two studies that took class level into account reported that while SRSs have a positive impact on freshman students, they did not generate significant differences in engagement and learning outcomes compared to traditional lectures. In one study (Jain & Farley, 2012), junior participants even stated that student response systems negatively impacted their meaningful understanding because of the quiz questions during the lectures.

The findings of this review are consistent with previous review studies: the use of student response systems positively impacts student academic engagement and academic achievement. Regardless of the context and field of study, student response systems helped students improve their engagement, achievement, motivation, interest, and interaction with their courses. However, the studies reporting these impacts were generally conducted in freshman-level courses; studies conducted in junior/senior-level or graduate-level courses reported no impact of student response systems on student learning outcomes. This situation might stem from the novelty effect of the system. Encountering lectures with a student response system in their first year of higher education, students may find the innovation interesting; as they advance through higher education, they may lose interest in the questions in lectures, so that the use of the system no longer affects their educational outcomes.

CONCLUSION

The purpose of this study was to review the research studies that focus on the use of student response systems and student academic engagement. In a general sense, student academic engagement is defined as "students' engagement with academically meaningful activities" (Pike & Kuh, 2005). These activities can include reading, writing, discussing, presenting, and interacting with peers and instructors. In parallel with Chickering and Gamson's (1987) seven good practices of instruction, student academic engagement appeared in the research studies as reading and interacting with peers. Research studies published between 2000 and 2019 were reviewed. The research trend in the use of student response systems and their impact on student academic engagement has tended to decrease in recent years. Studies were conducted in all types of academic subjects in higher education, especially in health-related sciences, business, and education departments. The majority of the studies utilized experimental methods, and they reported a positive impact of student response systems on student engagement. The reported engagement activities, such as receiving instant feedback, increased interactivity, increased quality of in-class discussions, and encouraged attendance, were in conformity with Chickering and Gamson's (1987) undergraduate education practices.

In order to understand how student response systems affect student academic engagement, further studies are needed. First, the majority of studies investigating the impact of student response systems on student academic engagement did not take student academic engagement as their focus; therefore, their methods of measuring student engagement raise questions about validity and reliability. To address this, the literature needs more studies that measure student academic engagement with valid and reliable instruments. Second, the majority of the studies were conducted in first-year courses and therefore with first-year higher education students, and the few studies in higher-level classes reported negative effects of student response systems; more studies are therefore needed to understand the impact of student response systems at different class levels in higher education. Lastly, the majority of the studies were conducted at universities in the United States; more international studies focusing on student academic engagement are needed to understand the impact of student response systems in different higher education systems and to enable international comparisons of their uses and impacts.

