
Exams Security and Fairness Trade-Offs

Analysis of Multiple Versions in Introductory Programming Exam

Title: Exams Security and Fairness Trade-Offs

Essay, 2025, 8 Pages, Grade: 87

Author: Mangeti Wesonga (Author)

Computer Science - Other

Administering multiple versions of a test is an exam security strategy frequently used to deter cheating in a range of situations. Large commercial testing firms sometimes employ psychometric procedures to guarantee that test versions are equivalent, but these methods are usually too expensive and time-consuming for individual schools. Versioning therefore creates a practical conflict between exam security (improved by versioning) and fairness (undermined by variations in difficulty among versions). In this study, we collected data from two sets of student exams to examine this trade-off on a versioned programming exam. The results indicate that certain populations place higher importance on one property than the other. Nevertheless, developing equivalent versions, with equal levels of difficulty and the same demand for conceptual understanding, remains a significant challenge. The analysis draws on data from recent exam sittings in an introductory programming course that delivers multiple classes over several days. Through thorough scrutiny, this article aims to reveal the complex factors involved in exam versioning, which can introduce inequality into student assessment while also providing security for the system.

Excerpt


Table of Contents

1. INTRODUCTION

1.1 Research Questions:

2. RELATED WORK

2.1 Exam Security Measures

2.2 Fairness Considerations in Assessment

2.3 Research Framework

2.4 Research Gaps

3. METHODS

4. RESULTS

4.1 Descriptive Statistics

4.2 Data Visualization

4.3 Interpretation of Exam Score Variability Across Test Version

5. DISCUSSION AND LIMITATION

5.1 Discussion

5.2 Limitations

6. CONCLUSION

Objectives & Core Themes

This research aims to analyze the complex trade-offs between exam security and assessment fairness in introductory programming courses when multiple test versions are employed to mitigate academic dishonesty. The study investigates how variations in exam difficulty across different versions correlate with student performance and their perceptions of justice within the academic evaluation process.

  • Trade-offs between exam versioning, security, and fairness.
  • Impact of exam difficulty variation on student performance.
  • Quantitative analysis of exam score distributions using descriptive statistics and ANOVA.
  • Psychometric challenges in ensuring test version equivalency.
  • Influence of organizational justice perceptions on student learning outcomes.
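The study's actual data and computation are not reproduced on this page, but the quantitative method it names (one-way ANOVA across exam versions) can be sketched in a few lines. The scores below are made-up illustrative samples, not the study's data, and the function is a standard textbook formulation of the F statistic rather than the authors' code:

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of score groups (exam versions)."""
    k = len(groups)                      # number of exam versions
    n = sum(len(g) for g in groups)      # total number of students
    grand = mean(x for g in groups for x in g)
    # Between-group sum of squares: how far each version's mean sits from the grand mean.
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: score spread inside each version.
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)    # between-group mean square
    ms_within = ss_within / (n - k)      # within-group mean square
    return ms_between / ms_within

# Hypothetical score samples for two exam versions (illustrative only).
version_a = [78, 85, 90, 72, 88, 81]
version_b = [65, 70, 74, 60, 79, 68]
f_stat = one_way_anova_f([version_a, version_b])
```

A large F relative to the F distribution's critical value (with k-1 and n-k degrees of freedom) would indicate that version means differ beyond chance, which is the kind of evidence the study cites for unintended difficulty disparities.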

Excerpt from the Book

2.2 Fairness Considerations in Assessment

Equitable assessment is a core principle of education, ensuring that all students have the same opportunity to demonstrate their capabilities and skills [9]. At the same time, deploying different versions of an exam to deter cheating raises new fairness concerns. The chief problem with multiple versions of the same exam is the possible difference in difficulty between versions [3]. Even when every step is taken to keep questions at the same level of difficulty, factors such as question choice, complexity, and randomness in item generation can produce unintended differences [7]. Research in this area examines the extent of difficulty variation across exam versions and its effect on students' grades, and considers alternatives for closing the gaps so as to achieve equality in test results.

Summary of Chapters

1. INTRODUCTION: Outlines the problem of ensuring academic integrity in programming courses while maintaining fairness, and defines the research questions.

2. RELATED WORK: Reviews existing literature on exam security measures, fairness in assessment, organizational justice frameworks, and identifies current research gaps.

3. METHODS: Describes the data set collected from programming exams, the preprocessing steps, and the statistical tests, including ANOVA, used to analyze score variability.

4. RESULTS: Presents descriptive statistics and visualizations (histograms, boxplots) to compare test versions, followed by a formal interpretation of the ANOVA results.

5. DISCUSSION AND LIMITATION: Evaluates the findings regarding the balance between security and fairness and acknowledges the limitations of the current study regarding scope and methodology.

6. CONCLUSION: Synthesizes the findings, provides practical recommendations for educators, and suggests directions for future longitudinal research.

Keywords

Exam, Security, Test, Versioning, Student, Assessment, Fairness, Programming, Integrity, ANOVA, Data Visualization, Educational Evaluation, Difficulty, Organizational Justice, Academic Performance.

Frequently Asked Questions

What is the core focus of this research?

The research examines the conflict between maintaining exam security via versioning and ensuring assessment fairness in introductory programming courses.

What are the central themes of the work?

The themes include academic integrity, exam versioning strategies, the psychometric difficulty of test items, and student perceptions of organizational justice.

What is the primary research goal?

To untangle the complex factors involved in exam versioning and to provide evidence-based insights into how assessment strategies can be improved to balance security and fairness.

Which scientific methods are applied?

The researchers utilize descriptive statistics, data visualization (histograms, boxplots), and quantitative statistical testing through ANOVA to assess grade differences across exam versions.
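The descriptive side of that analysis can be illustrated with the Python standard library alone. The scores below are hypothetical (not the study's data); the sketch computes the per-version summary statistics that underlie a boxplot comparison:

```python
from statistics import mean, stdev, quantiles

# Hypothetical score samples for two test versions (illustrative only).
test1 = [82, 88, 75, 91, 84, 79, 86]
test2 = [70, 62, 85, 58, 77, 66, 90]

for name, scores in (("Test 1", test1), ("Test 2", test2)):
    q1, med, q3 = quantiles(scores, n=4)   # quartiles: the core of a boxplot
    print(f"{name}: mean={mean(scores):.1f} sd={stdev(scores):.1f} "
          f"median={med:.1f} IQR=[{q1:.1f}, {q3:.1f}]")
```

A version with a higher mean and a smaller standard deviation and interquartile range (as the study reports for Test 1) would show up here as a higher, tighter boxplot.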

What topics are discussed in the main body?

The main body covers a literature review on security measures and fairness, a detailed methodology for data analysis, an empirical analysis of score distributions from two tests, and a discussion of limitations.

Which keywords best characterize this study?

Key terms include Exam Security, Fairness, Versioning, Assessment, Introductory Programming, and Organizational Justice.

How do organizational justice dimensions apply to this study?

The study relies on procedural justice, as the use of different question versions implies that the grading process may not be identical for all students, potentially leading to perceptions of unfairness.

Why is Test 1 considered to have had higher performance than Test 2?

The results indicate Test 1 had a higher mean score and lower variability, suggesting the exam questions were potentially easier or better aligned with student preparation compared to Test 2.

What is the significance of the ANOVA results in this document?

The ANOVA confirms a statistically significant difference in the mean scores between the two exam versions, providing quantitative evidence that the versioning process effectively created unintended difficulty disparities.

End of the 8-page excerpt

Details

Title
Exams Security and Fairness Trade-Offs
Subtitle
Analysis of Multiple Versions in Introductory Programming Exam
University
University of Auckland
Course
COMPSCI 399
Grade
87
Author
Mangeti Wesonga (Author)
Year of publication
2025
Pages
8
Catalog number
V1707792
ISBN (PDF)
9783389184820
Language
English
Tags
Exam; Security; Test; Versioning; Student; Assessment.
Product safety
GRIN Publishing Ltd.
Cite this work
Mangeti Wesonga (Author), 2025, Exams Security and Fairness Trade-Offs, Munich, GRIN Verlag, https://www.grin.com/document/1707792