Data for decision-making around online teaching
Along with all of our colleagues around Australia and the world, the chemists at Deakin had to transition rapidly to online delivery and assessment two weeks into our first trimester of 2020. We are very fortunate to work at an institution that is flexible and well equipped with technology, so many aspects of the transition were seamless.
Second- and third-year units adopted the common approach of recording videos of their usual practical activities and having students complete their usual worksheets and reports, with live online consultation sessions replacing face-to-face practicals. Lectures and tutorials for these smaller classes moved to BbCollaborate, and students offered little feedback beyond appreciation of our efforts. Considerable effort went into generating multiple equivalent versions of exam questions, with students uploading photos of their handwritten answers within a short time limit, which we are confident limited cheating. Overall results were similar to previous years.
My main effort was in our extremely large introductory chemistry unit. Our lecture theatres are equipped with Echo360 technology, which was already being used before the shutdown to offer live streams and recordings to students who could not attend in person. Our in-class polling system is online, so it could continue unchanged without students in the room. Combining this delivery with a live chat function, we found that many students actually engaged more during the remote sessions, because asking a question through the chat required less confidence than speaking up in a lecture theatre. Responses to polls were at the same level as the previous year, and engagement with the in-class polls had the same impact on student outcomes as in previous years. Student feedback on this form of lecture delivery was positive.
We developed and implemented home practicals using ordinary household items. Together with structured worksheets, we are confident these activities had a positive impact on student engagement and learning. While we consider home practicals suitable at first-year level, they are not adequate for higher-level study, where some face-to-face practicals are necessary for students to become acquainted with the equipment and hazards of a chemical laboratory. Practical considerations such as timetabling are also relevant, because home practicals require neither laboratory space nor specific times.
The most important data come from a direct comparison of students who attended face-to-face practicals before the shutdown with students who were given videos of the same experiments and the same worksheet to complete in a similar time frame. A mean decrease of 0.9/10 in score was observed for the latter group, indicating that something about the support or learning in the laboratory environment was not adequately replicated online.
As in our later-year units, the final exam was run as an open-book, timed quiz with handwritten answers, each student receiving a unique exam drawn from a pool of equivalent versions of each question. Cheating was minimal, although we have evidence that some students used Google for some responses, leading to characteristic wrong answers. We are still analysing these data; interestingly, the final exam has historically shown a bimodal distribution of scores, but in 2020 the distribution was normal.
Tutorials moved onto BbCollaborate, and our data show that of all the transitions to online delivery, this was the only one in which the impact of engagement decreased compared to face-to-face teaching. Previously, each tutorial attended was associated with an average 2.7/100 increase in final score, whereas in 2020 the increase per tutorial was only 1.8/100.
Careful analysis of data from this extremely large class has produced evidence that in some cases supports, and in other cases challenges, our assumptions about student learning and engagement. Such evidence, rather than emotion, is what should inform decisions about future teaching and assessment approaches.