No Tests Required: Comparing Traditional and Dynamic Predictors of Programming Success
Authors: Watson, Christopher; Li, Frederick W.B.; Godwin, Jamie L.
Editors: J. D. Dougherty; Kris Nagel; Adrienne Decker; Kurt Eiselt
Abstract
Research over the past fifty years into predictors of programming performance has yielded little improvement in the identification of at-risk students. This is possibly because research to date is based upon static tests, which fail to reflect changes in a student's learning progress over time. In this paper, the effectiveness of 38 traditional predictors of programming performance is compared to that of 12 new data-driven predictors, which are based upon analyzing directly logged data describing the programming behavior of students. Whilst few strong correlations were found between the traditional predictors and performance, an abundance of strong, significant correlations based upon programming behavior were found. A model based upon two of these metrics (Watwin score and percentage of lab time spent resolving errors) could explain 56.3% of the variance in coursework results. The implication of this study is that a student's programming behavior is one of the strongest indicators of their performance, and future work should continue to explore such predictors in different teaching contexts.
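The headline figure (two behavioral metrics explaining 56.3% of the variance in coursework results) corresponds to the R² of a two-predictor linear regression. The sketch below is a hypothetical illustration of fitting such a model: the student data, metric values, and variable names are invented for demonstration and are not taken from the paper.

```python
import numpy as np

# Hypothetical per-student data (invented, NOT from the paper):
# a Watwin-style score and the percentage of lab time spent resolving errors.
watwin = np.array([72.0, 55.0, 88.0, 40.0, 63.0, 91.0, 47.0, 76.0])
pct_err_time = np.array([18.0, 41.0, 12.0, 55.0, 30.0, 9.0, 48.0, 21.0])
coursework = np.array([68.0, 52.0, 81.0, 39.0, 60.0, 85.0, 45.0, 70.0])

# Design matrix with an intercept column, then ordinary least squares.
X = np.column_stack([np.ones_like(watwin), watwin, pct_err_time])
coef, *_ = np.linalg.lstsq(X, coursework, rcond=None)

# R^2: the proportion of variance in coursework marks explained by the model.
pred = X @ coef
ss_res = np.sum((coursework - pred) ** 2)
ss_tot = np.sum((coursework - coursework.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"R^2 = {r_squared:.3f}")
```

With the paper's real data, an R² of 0.563 would mean the two behavioral predictors jointly account for 56.3% of the variance in coursework marks.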
Citation
Watson, C., Li, F. W. B., & Godwin, J. L. (2014, March). No Tests Required: Comparing Traditional and Dynamic Predictors of Programming Success. Presented at the 45th ACM Technical Symposium on Computer Science Education (SIGCSE '14), Atlanta, GA.
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | 45th ACM Technical Symposium on Computer Science Education (SIGCSE '14) |
| Acceptance Date | Nov 30, 2014 |
| Publication Date | Jan 1, 2014 |
| Deposit Date | Sep 6, 2014 |
| Publicly Available Date | Jul 13, 2016 |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 469-474 |
| Book Title | Proceedings of the 45th ACM Technical Symposium on Computer Science Education |
| DOI | https://doi.org/10.1145/2538862.2538930 |
| Public URL | https://durham-repository.worktribe.com/output/1155073 |
| Publisher URL | http://doi.acm.org/10.1145/2538862.2538930 |
Files
Accepted Conference Proceeding (PDF, 514 KB)
Copyright Statement
© 2014 ACM. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the 45th ACM Technical Symposium on Computer Science Education, 2014, http://doi.acm.org/10.1145/2538862.2538930