In Short

Sasha Calhoun and Evan Hazenberg recently transformed the printed exercises provided in their course notes into online quizzes in Blackboard. These were helpful not only for students, allowing them to gain practical experience using linguistic skills while earning course credit, but also for teaching staff, giving them feedback on student progress and reducing their marking load.
Full Detail

In 2015, Sasha and her colleagues in the School of Linguistics and Applied Language Studies decided to move the 200-level Introduction to Linguistics course to 100 level, to introduce students to the formal, methodological elements of the subject earlier in their degree. In the past, Sasha had noticed that Bachelor of Arts students were often unfamiliar with the quasi-mathematical style of analysis commonly used in formal linguistics, so she decided that students needed more supported opportunities to practice these skills. A set of course notes developed over the years for the 200-level introductory course was available and contained useful practice activities. However, these had no accompanying answers. Further, Sasha and her colleagues had no way of monitoring student use of the activities, providing feedback or guiding learning, so they often wondered how many students actually completed the activities beyond what was compulsory, and what value, if any, they held for student learning.
“We have known from past experience that students find the linguistics analysis skills challenging, and particularly for Arts students, as it’s a big break from what they have done before. So we wanted to make every effort to make sure that first-years had sufficient support to develop those skills in a way that was manageable for teaching staff. I mean we couldn’t have one-to-one tutorials with 100 students to guide through how to do allophones!” – Sasha

In response to this problem, Sasha and her colleagues decided to convert these exercises from paper into online tests in Blackboard. It was important to Sasha that the tests be easy to use and manage while adding little extra workload for staff. She also wanted these resources to be useful for students, and to give both staff and students feedback about progress in the course.
After receiving a Teaching and Learning Grant from the Faculty of Humanities, Sasha and her colleagues employed Evan Hazenberg (a PhD candidate and tutor in the course) to construct the activities in Blackboard. With about 140 hours of paid work at his disposal, Evan created digital versions of the problems in the course notes and generated worked solutions. Although there were challenges with the use of linguistic symbols, with problems involving multiple steps, and with feedback for automatic marking within Blackboard, he found the overall process of test creation manageable and straightforward.
The primary focus of the activities was to have students practice the linguistic concepts and skills as they were covered in class, and to revise prior to the end-of-course exam. To this end, the tests were attached to the existing modules within the course and made available as that content was covered in lectures. Students were required to complete four activity tests over the trimester, and these counted towards a small component of their overall course grade. Further optional tests were available for students within each module.
In doing the tests, students could apply the skills learnt in lectures while obtaining useful feedback on their knowledge of course content. Likewise, teaching staff on the programme were able to track student knowledge and progress regularly. One application of this was a pre-tutorial test that students completed and tutors reviewed before their tutorial. Tutors analysed their group’s results to see where the gaps in student knowledge were, and then focused on these in their tutorial. The tutorials were therefore optimised to address only what students were struggling with, not what they had already grasped, to the benefit of both students and tutors.
Sasha and Evan were pleased with the results of digitising this activity material. They reflect that, along with the benefits mentioned above, they now have a resource that can be redeployed in other courses, for example as a refresher for 200- and 300-level students on concepts they have already covered. They also found that students appeared to be more engaged in lectures and tutorials than in other Linguistics courses they had taught.
There has been some very positive evidence supporting the use of these activity tests. Evan found a strong trend for students who completed more of the online tests to perform better in the final exam than those who had not engaged with the tests. The end-of-year student evaluation was also overwhelmingly positive. Of the 47 responses, 94% of students either agreed or strongly agreed that the online components of the course contributed to their learning. Likewise, of the 42 respondents who chose to comment, 22 mentioned the online content and quizzes, and all of these comments were positive.
Pedagogical Perspectives

Commentaries on the pedagogical ideas behind this case study, written by academics from the Centre for Academic Development.
Learning Design and Application
The key word in this case study is “formative.” Formative activities provide strategies for ongoing improvement and development (as opposed to summative ones, which provide a measure of capability or knowledge at a specific point in time). The challenge when designing formative activities is to include an effective pedagogical strategy that guides students towards productive approaches to their learning.
Online quiz tools provide a variety of ways of setting students problems and collecting their answers. Typically, questions fall into two major types: those that present a set of possible responses and ask the student to select one (e.g. multiple choice, or images with selection points), and those that ask the student to supply the correct response themselves (e.g. a numerical answer to a maths problem, or an open text response).
Multiple choice style questions, with a limited set of responses predefined by the teacher, are generally much easier to design to provide formative feedback. The choices can be presented in a variety of ways, including as a list or as indicated areas in an image. Each answer presented to the student should be linked to a specific piece of feedback. The feedback for the correct answer should elaborate on why it is correct, identifying the key features to reinforce them for the student. Incorrect answers should each be designed to correspond to a known fallacy of reasoning or error in understanding, and the associated feedback should guide students to where they can learn how to correct it.
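As an illustration of this pairing of options with targeted feedback, here is a minimal sketch in Python. It is not Blackboard's internal format; the class names and the example question are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Option:
    text: str
    correct: bool
    feedback: str   # shown to the student after they pick this option

@dataclass
class MultipleChoiceQuestion:
    prompt: str
    options: list   # list of Option

    def feedback_for(self, choice):
        """Return the targeted feedback for the option the student chose."""
        return self.options[choice].feedback

# Hypothetical allophony item: each distractor targets a known misconception.
q = MultipleChoiceQuestion(
    prompt="Are [p] and [pʰ] allophones of one phoneme in English?",
    options=[
        Option("Yes: they never contrast meaning and are in complementary distribution.",
               correct=True,
               feedback="Correct. Aspiration is predictable from position, so no minimal pair exists."),
        Option("No: they sound different, so they must be separate phonemes.",
               correct=False,
               feedback="Phonemic status depends on contrast, not audible difference. Revisit the minimal-pair test."),
    ],
)
print(q.feedback_for(1))   # prints the remedial feedback for the second option
```

Writing one distractor per known misconception, as above, is what lets the feedback do remedial work rather than simply marking the answer wrong.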
Questions that are open-ended, or provided without a prescribed list of responses, are much harder to design to provide formative feedback. Considerable care is needed to select problems that generate a predictable range of responses, such that suitable advice on how to address mistakes can be given automatically. One strategy is to provide only correct/incorrect feedback initially, but to retain records of student responses that can be analysed in aggregate promptly by a human, as in the sketch below. The findings can then be communicated back to the students through a forum or class session, with examples of common mistakes identified along with their likely causes and ways to address them.
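The aggregation step might look like the following sketch, assuming student responses can be exported as plain text; the function name and the example data are illustrative, not taken from the case study.

```python
from collections import Counter

def common_mistakes(responses, correct, top_n=5):
    """Tally the most frequent incorrect responses so the teacher can
    address them in a forum post or class session."""
    normalised = [r.strip().lower() for r in responses]
    wrong = [r for r in normalised if r != correct.lower()]
    return Counter(wrong).most_common(top_n)

# Illustrative data: phonetic transcriptions students typed for one item.
submitted = ["[pʰɪt]", "[pɪt]", "[pɪt]", "[pʰɪt]", "[bɪt]", "[pʰɪt]"]
print(common_mistakes(submitted, correct="[pʰɪt]"))
# [('[pɪt]', 2), ('[bɪt]', 1)]: missing aspiration is the dominant error
```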
The final point to emphasise is that formative approaches depend on students being able to subsequently demonstrate what they have learnt. Students need to be able to complete activities multiple times until they can reliably demonstrate their learning. An effective strategy to support this is to create question pools that are very much larger than the delivered quizzes, from which a selection is presented to each student as a quiz that changes with every attempt. This is considerably more work to create, but is ultimately more reusable. If this approach is used, care should be taken to analyse student responses to each question, flagging questions that strong students get wrong (suggesting an issue with the framing of the question) and questions that all students get right (suggesting the question is too simple or contains an obvious pointer to the correct answer).
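Both halves of this strategy, drawing a varying quiz from a larger pool and flagging questions whose response patterns look suspect, can be sketched as follows. This is illustrative only; the top-quartile cut-off and the flagging thresholds are assumptions rather than fixed rules.

```python
import random

def draw_quiz(pool, n, seed=None):
    """Draw n distinct questions from a larger pool, so that the quiz
    differs on each attempt. A seed makes a given attempt reproducible."""
    return random.Random(seed).sample(pool, n)

def flag_questions(results):
    """Flag questions worth reviewing. `results` maps a question id to a
    list of (student_overall_score, answered_correctly) pairs."""
    flagged = []
    for qid, rows in results.items():
        rows = sorted(rows, key=lambda r: r[0], reverse=True)
        top = rows[: max(1, len(rows) // 4)]   # strongest quartile of students
        top_rate = sum(ok for _, ok in top) / len(top)
        all_rate = sum(ok for _, ok in rows) / len(rows)
        if top_rate < 0.5:
            flagged.append(f"{qid}: strong students miss it; check the framing")
        elif all_rate == 1.0:
            flagged.append(f"{qid}: everyone gets it right; may be too easy")
    return flagged

# e.g. each student could receive a different 10-question quiz:
# attempt = draw_quiz(pool, 10)
```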
Educational Commentary

Sasha Calhoun has moved the original pen-and-paper exercises in a 100-level linguistics course online. This decision was motivated by the need for first-year linguistics students to develop key analysis skills: the skills they are likely to need in a number of courses taken as part of their degree programmes in linguistics. Students taking this course are often new to linguistic analysis and need adequate time on task (Chickering & Gamson, 1987) to practice these newly learned skills. Receiving regular personalised feedback on performance (Hattie & Gan, 2011) helps students identify misunderstandings and fine-tune their application of the target skills.
By moving this learning activity online and delivering it in the form of formative assessment (Cross & Angelo, 1993), both its effectiveness and its efficiency are improved. Students are able to complete the quizzes multiple times, each time receiving immediate feedback on their performance. This makes learning more interactive and creates additional practice and learning opportunities (Corbett & Anderson, 2001). Because the quizzes are automatically marked and furnished with model answers, feedback and explanations, this activity requires no additional marking or feedback time from the lecturers or tutors, creating teaching efficiencies. In addition, because the quizzes are completed online, information about students’ patterns of learning is automatically stored in Blackboard, and the lecturer is able to see how students are progressing in the course, how often they engage with the quizzes, and which concepts and skills need additional attention during lectures and tutorials.
This simple digital innovation worked well because it builds on the affordances of online self-marking exercises to make learning more flexible, independent and personalised (students can access exercises online at their convenience, complete them multiple times and make mistakes without losing face). It also builds an important component of learner autonomy: the ability to evaluate personal progress in a course and make informed decisions about the kind and amount of study needed to achieve set learning goals.
References

Chickering, A., & Gamson, Z. (1987). Seven principles of good practice in undergraduate education. AAHE Bulletin, 39, 3-7.
Corbett, A. T., & Anderson, J. R. (2001). Locus of feedback control in computer-based tutoring: Impact on learning rate, achievement and attitudes. Proceedings of ACM CHI’2001 Conference on Human Factors in Computing Systems, (pp. 245–252). New York: ACM Press.
Cross, K. P., & Angelo, T. A. (1993). Classroom Assessment Techniques (2nd ed.). San Francisco: Jossey-Bass.
Hattie, J. A. C., & Gan, M. (2011). Instruction based on feedback. In R. Mayer & P. Alexander (Eds.), Handbook of Research on Learning and Instruction (pp. 249-271). New York: Routledge.

Reproduce this in Your Own Teaching

This is a quick-start guide for using Blackboard tests in your own teaching. If you would like additional support, contact one of our learning and teaching team.
Step 1

Consider which formative activities from your course could benefit from being digitised. In this case study, examples included practice exercises from printed course notes, pre-tutorial checks, and revision quizzes.
Step 2

Speak to one of our learning and teaching team about how best to convert these activities into Blackboard quizzes.
Step 3

Allocate time to the project. Digitising activities will take time, but the product will be reusable in future iterations of the course.
Step 4

Once the quizzes are uploaded to Blackboard, take the time to test them before they are given to students.
Resources

Helpful resources related to this case study.
Related Technology
Related Case Studies
Case studies which cover related examples.
Getting Help

Contact one of our learning and teaching team to discuss these ideas further and for support using the technologies.