  • Open Access
  • Enhancing University Student Engagement Using Online Multiple Choice Questions and Answers  [ICHER 2015]
  • DOI: 10.4236/jss.2015.39011   pp. 71-76
  • Author(s)
  • D. Biggins, E. Crowley, E. Bolat, M. Dupac, H. Dogan
  • ABSTRACT
  • For many education providers, student engagement can be a major issue. Given the positive correlation between engagement and good performance, providers are continually looking for ways to engage students in the learning process. The growth of student digital literacy, the wide proliferation of online tools and the understanding of why online gaming can be addictive have combined to create a set of tools that providers can leverage to enhance engagement. One such tool is PeerWise, https://peerwise.cs.auckland.ac.nz/, an online multiple choice question (MCQ) and answer tool in which students create questions that are answered by other students. Why use MCQs? Using MCQs tests knowledge, provides reassurance of learning, identifies gaps and makes this data available to both student and provider. Students use this information to focus their time on areas requiring additional work [1], benefiting from the early feedback provided. Formative assessments using MCQs are beneficial in preparing students for summative testing and are appreciated and liked by students [2]. Providers can use this information to determine how the material is being received and react accordingly. Students use PeerWise to create MCQs that are answered, rated and commented on by their peers. Engagement in PeerWise earns trophies for contributing questions, for regular use and for providing feedback, all of which act to stimulate further engagement, following the principles of gamification. Bournemouth University, a public university in the UK with over 18,000 students, has been embedding PeerWise in undergraduate and postgraduate units since 2014. The results experienced by Bournemouth University have been beneficial and are consistent with other studies of PeerWise use [3] [4]. A statistically significant improvement was seen in one cohort of students compared to the previous year, when PeerWise was not used. However, no correlation was found between PeerWise participation and a student's unit mark. The processes followed by Bournemouth University, and the advantages and disadvantages backed by qualitative and quantitative data, will be presented so that other institutions can gain an informed view of the merits of PeerWise for their own teaching and learning environments.

  • KEYWORDS
  • Student Engagement, PeerWise, MCQ, Gamification, Technology-Enhanced Learning
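
The abstract's quantitative claim, that no correlation was found between PeerWise participation and a student's unit mark, describes a standard correlation test. As a minimal sketch only, the Python/SciPy code below shows how such a check could be run; the participation counts, marks and variable names are invented for illustration and are not the paper's data.

    # Hypothetical sketch: testing whether PeerWise participation
    # correlates with unit marks. All data values are invented.
    from scipy import stats

    # Number of PeerWise contributions per student (hypothetical).
    participation = [12, 3, 45, 0, 27, 8, 60, 15, 5, 33]
    # Final unit mark per student, 0-100 (hypothetical).
    unit_marks = [62, 55, 70, 48, 66, 58, 64, 61, 52, 68]

    # Pearson's r tests for a linear association; the null hypothesis
    # is that participation and marks are uncorrelated.
    r, p_value = stats.pearsonr(participation, unit_marks)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

    # A p-value at or above 0.05 would be consistent with the paper's
    # finding of no significant correlation.
    if p_value >= 0.05:
        print("No statistically significant correlation at the 5% level.")

Because contribution counts are often heavily skewed, a rank-based test such as stats.spearmanr may be a more robust choice than Pearson's r in practice.
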
  • References
  • [1]
    Fielding, M. (2001) Students as Radical Agents of Change. Journal of Educational Change, 2, 123-141.
    http://dx.doi.org/10.1023/A:1017949213447
    [2]
    Foos, P.W. (1989) Effects of Student-Written Questions on Student Test Performance. Teaching of Psychology, 16, 77-78.
    http://dx.doi.org/10.1207/s15328023top1602_10
    [3]
    Denny, P. (2010) Motivating Online Collaborative Learning. ITiCSE '10: Proceedings of the 15th Annual Conference on Innovation and Technology in Computer Science Education, Ankara, June 2010, 26-30.
    http://dx.doi.org/10.1145/1822090.1822176
    [4]
    Luxton-Reilly, A., Denny, P., Plimmer, B. and Sheehan, R. (2012) Activities, Affordances and Attitude: How Student-Generated Questions Assist Learning. Proceedings of the 17th ACM Annual Conference on Innovation and Technology in Computer Science Education, Haifa, 3-5 July 2012.
    http://dx.doi.org/10.1145/2325296.2325302
    [5]
    Bloxham, S. (2007) The Busy Teacher Educator's Guide to Assessment.
    http://dera.ioe.ac.uk/13028/
    [6]
    Simon, B. and Cutts, Q. (2012) Peer Instruction: A Teaching Method to Foster Deep Understanding. Communications of the ACM, 55, 27-29.
    http://dx.doi.org/10.1145/2076450.2076459
    [7]
    Entwistle, N. (2000) Promoting Deep Learning through Teaching and Assessment: Conceptual Frameworks and Educational Contexts. 1st Annual Conference ESRC Teaching and Learning Research Programme (TLRP), University of Leicester, November 2000.
    http://www.tlrp.org/acadpub/Entwistle2000.pdf
    [8]
    Biggs, J. (2003) Aligning Teaching and Assessing to Course Objectives. Teaching and Learning in Higher Education: New Trends and Innovations, University of Aveiro, 13-17 April 2003.
    [9]
    Denny, P., Luxton-Reilly, A. and Hamer, J. (2008) Student Use of the PeerWise System. ITiCSE '08: Proceedings of the 13th Annual Conference on Innovation and Technology in Computer Science Education, Madrid, 30 June-2 July 2008, 73-77.
    http://dx.doi.org/10.1145/1384271.1384293
    [10]
    Hinton, D. and Cooner, T.S. (2014) Blended Learning Design Planner v1.2 Resource Pack. Design for Inquiry-Based Blended Learning (DIBL), University of Birmingham, Birmingham.
    http://www.birmingham.ac.uk/Documents/college-social-sciences/social-policy/CEIMH/DiBLPlannerResourcePack.pdf
    [11]
    Hanrahan, M.U. (1998) The Effect of Learning Environment Factors on Students' Motivation and Learning. International Journal of Science Education, 20, 737-753.
    http://eprints.qut.edu.au/1352/#
    [12]
    Biggs, J. and Moore, P. (1993) The Process of Learning. Prentice Hall, New York.
    [13]
    Gibbs, G. and Simpson, C. (2004) Conditions under Which Assessment Supports Students’ Learning. Learning and Teaching in Higher Education (LATHE), 1, 3-31.
    http://insight.glos.ac.uk/tli/resources/lathe/Documents/issue%201/articles/simpson.pdf
    [14]
    Hounsell, D. (2007) Towards More Sustainable Feedback to Students. In: Falchikov, N. and Boud, D., Eds., Rethinking Assessment in Higher Education, Routledge, London, 101-113.
    [15]
    Draper, S.W. (2009) Catalytic Assessment: Understanding How MCQs and EVS Can Foster Deep Learning. British Journal of Educational Technology, 40.
    http://dx.doi.org/10.1111/j.1467-8535.2008.00920.x
    [16]
    Denny, P., Luxton-Reilly, A. and Simon, B. (2009) Quality of Student Contributed Questions Using PeerWise. ACE '09: Proceedings of the 11th Australasian Conference on Computing Education, Wellington, 55-63.
    http://dl.acm.org/citation.cfm?id=1862724
    [17]
    Swailes, S. and Senior, B. (1999) The Dimensionality of Honey and Mumford's Learning Style Questionnaire. International Journal of Selection and Assessment.
    http://dx.doi.org/10.1111/1468-2389.00099
    [18]
    Anderson, L.W. and Krathwohl, D.R. (2001) A Taxonomy for Learning, Teaching, and Assessing. Longman, New York.
    [19]
    Purchase, H., Hamer, J., Denny, P. and Luxton-Reilly, A. (2010) The Quality of a PeerWise MCQ Repository. ACE '10: Proceedings of the 12th Australasian Conference on Computing Education, Brisbane, 137-146.
    http://dl.acm.org/citation.cfm?id=1862219.1862238
