Julie Damron and Jennifer Quinlan assess student outcomes in the blended classroom
Located in Provo, Utah, Brigham Young University (BYU) is a private institution with one of the nation’s largest language teaching programs: 70% of students speak a second language, and 32% of students take world language courses. Over 55 languages are regularly taught on campus, with over 40 more available based on student needs. As programs continue to expand, unique needs arise, such as more classroom space, more flexible course scheduling, and academically meaningful study abroad and internship experiences. In response, several departments are developing or expanding their online and blended course options. Teaching a language effectively online requires instructional design with extensive scaffolding; careful, relevant implementation of technology and learning resources; and knowledge of how to assess online language teaching and learning and its impact on language education. Assessing language-learning outcomes in an online environment and documenting students’ learning and progress in the blended classroom can be challenging. This article explores the creation, implementation, and assessment of a blended Korean language class at Brigham Young University and compares assessment methods and student attainment of learning outcomes in the same course administered via a traditional classroom and an online class.
We compared data from three beginning Korean classes: on campus face-to-face (F2F), blended (online and F2F interaction), and online (no F2F element). All three classes were held in the fall and winter of 2014, with 37 students in the blended section of the class, 30 students in the F2F section, and 26 students in the online section.
The learning outcomes for all three classes were the same:
-Read (with limited comprehension) and write proficiently.
-Discuss topics such as family, school, months of the year, hobbies, and vacation plans.
-Interact linguistically on a limited basis using middle and high language.
Instruction and Assessment
All three sections of the course were taught by the same professor, with three different TAs. All three courses used the same textbook and workbook and performed similar speaking and listening activities, assignments, and assessments.
The F2F class met five days a week on campus with the professor and the TA. All work was completed in the classroom or at home in a hard-copy workbook. Using a flipped classroom model, the blended class met together four days a week, with additional material (lectures, slides, quizzes, tests, and chat rooms) online for a fifth day of self-study. The online class was delivered via a series of lessons with synchronous and asynchronous meetings through a web browser and no face-to-face contact; course access was not limited by time or place.
At the end of each semester, we examined student success in overall course grade, quiz scores, chapter tests, midterm exam, and final exam. We also looked at student minutes online in comparison to minutes in the classroom and examined any relationship between minutes online and final course grade. Finally, we compared positive and negative student comments in all three sections of the course.
The faculty and an instructional designer evaluated which significant course elements could be delivered online versus face to face. We did not develop a web-facilitated version of the course. One element developed for the blended and online sections was the Conversation Café, an online forum moderated by a TA. Students can drop in at any time during open hours (there are set hours five days per week) to practice speaking and applying concepts from class; the Café was not available in the F2F section. TAs reported that students used the Conversation Café to seek tutoring or assistance on specific items, as well as to practice free, unscripted dialogue. While the textbook content of the course included many scripted dialogues for practice, the chief benefit the Conversation Café seemed to provide was an element of spontaneous oral production.
Overview of Findings
Results of the study revealed the following:
-Time spent online had a positive correlation with overall grade. This may come as no surprise, but students who logged more minutes in the online course material earned higher final course grades than their counterparts who spent less time online. This finding is not fully representative, however, as a student can be “logged in” but not actively engaged with the online content. We have not found a learning management system that can discriminate between actively interacting with online course material (e.g., reading, scrolling, answering questions) and simply accessing it (e.g., logging in and then stepping away from the desk).
-Students in the traditional classroom appeared to spend significantly more observable time with class material (tests, quizzes, slides, etc.). We emphasize observable time, as it is difficult to know how much of F2F classroom time every student actually spends engaged. Likewise, it is difficult to know how much time blended/online students might be spending studying and reviewing without being logged into the online material.
-Course grades for each class were similar. The differences in quizzes, midterm exams, final exams, and overall course grades among the three sections were less than 2%. However, chapter test grades differed by 7%, with the lower scores in the blended and online courses. We are exploring the factors that may contribute to this difference.
-Student evaluations were slightly lower for blended classes. Throughout the semester, the faculty asked students if they wanted to have a fifth day of instruction face to face for some extra review or in-person work. Consistently, their response was no. However, in end-of-course evaluations, students indicated they wanted more interaction with their professor. They also noted some negative responses to the LMS and glitches with some of the online course elements. This was the first time the faculty had run a blended section, making it a first for students as well; there was a sense of a steep learning curve for both groups.
There was positive and negative feedback regarding each of the sections, some of which has already been identified. While some students may feel more comfortable with the familiarity of face-to-face instruction, evaluations reflected that students valued the time and place flexibility offered by the blended and online sections. Although they sought more in-person teacher interaction (as indicated in end-of-course surveys), which is clearly afforded by the F2F section, they also valued the ability to access online material repeatedly and at their convenience; blended students, for example, revisited course material on average twice as many times as it was presented in the F2F class. (In the F2F section, students did not have access to any course materials online.) Finally, while some technical glitches in the course drew negative feedback, students gave positive feedback regarding the ease and convenience of taking quizzes online.
There were a few unanticipated findings and limitations in this research. First, we had little control over who took the online course; if students were heritage or native speakers of Korean, that may have skewed the findings. Second, test scores in the online course dropped significantly when the tests became proctored. Third, students preferred not meeting with the professor five days a week during the semester, but then wrote negative comments on end-of-semester evaluations about not seeing the professor enough. Finally, we became aware of the extent to which online students were “binge studying,” or accessing significant amounts of course material immediately prior to an assignment deadline.
Based on these findings, we plan to run a second phase of research to identify more discrete data points and validate the results of this study. We are also implementing an ongoing measurement of student satisfaction with the course experience and a structure for supporting students when they encounter technical issues. Further research will explore the effects of the Conversation Café on oral proficiency, student engagement, sense of community, and mastery of learning outcomes; we anticipate comparing these elements among students in each type of class (F2F, blended, online), as the impact of the Café on overall oral proficiency was not measured in this study. As part of the normal course evaluation and refinement that BYU conducts for blended and online courses, we will also examine the difficulty and discrimination of assessment items and compare student performance on discrete elements within assessments. Our intent is to identify whether students score better or worse on specific items within each assessment even though questions are identical in all three sections.
In developing the blended and online course design and implementation strategies, we explored learning theories (e.g., Bandura, Vygotsky, Gagne), pedagogical approaches to student learning objectives, industry reviews of blended and online instruction, and elements of classroom formats (face-to-face, blended, online). This information helped guide the instructional design of the courses and informed techniques for interaction and student engagement, instruction, and assessment. While we anticipated that student scores in the blended classroom would not be as high as those in the traditional F2F classroom, we discovered that successful student learning in the blended language classroom was possible.
Dr. Julie Damron is associate professor and associate section head of Asian and Near Eastern Languages and Jennifer Quinlan, MFS, is academic product consultant for world languages and a second-language acquisition PhD candidate at Brigham Young University, Provo, Utah.