Donna Davis | Professional Portfolio
Module 7 Showcase 3: Grant Cycle B Proposals
Small Grant Proposal
Alternative Assessment Strategies in Online Higher Education: Effectiveness, Perceptions, and Student Success
Proposal Summary
This study investigates the effectiveness of alternative assessment strategies, such as project-based, portfolio, and competency-based assessments, compared to traditional exams in online higher education. The research will evaluate how these alternative methods impact student learning outcomes, retention, motivation, and engagement while examining instructor and student perceptions of these assessments.
The study aims to provide empirical evidence on whether alternative assessments better align with the skills and competencies required for success in higher education and professional fields. Using a mixed-methods approach, this research will collect and analyze student performance data, surveys, and instructor interviews to assess the impact and feasibility of alternative assessments in online learning.
By addressing gaps in current assessment practices, this study will offer practical recommendations for universities, online program administrators, and educators seeking assessment models that foster deep learning and student engagement.
- Project Duration: 12 months
- Funding Requested: $30,000
- Principal Investigator: Donna Davis
Research Questions
- How do alternative assessment methods (e.g., project-based, portfolio, and competency-based assessments) compare to traditional exams in evaluating student learning outcomes in online higher education, particularly in terms of knowledge retention, skill application, and critical thinking?
- What are the perceptions of students and instructors in higher education regarding the effectiveness, fairness, and feasibility of alternative assessments in online learning environments?
- In what ways do alternative assessment strategies impact student retention, motivation, and engagement in online higher education courses, and what institutional or instructional factors contribute to their success or challenges in implementation?
Proposal Narrative
1. Introduction & Significance
Assessment plays a crucial role in higher education, influencing how students learn and interact with course material. Traditional assessment methods, such as multiple-choice exams and timed tests, have long dominated online education but often do not adequately assess higher-order cognitive skills, critical thinking, and real-world problem-solving abilities.
As online higher education continues to expand, there is a growing need to explore alternative assessment strategies that align with the evolving expectations of universities, employers, and students. Alternative methods like project-based assessments, portfolios, and competency-based evaluations can provide a more comprehensive understanding of student learning while enhancing engagement and retention.
Despite the increasing adoption of these strategies in higher education institutions, research remains limited on their effectiveness and perception in online learning environments. This study will contribute to the field by:
- Comparing learning outcomes of students assessed through alternative methods vs. traditional exams
- Investigating student and instructor perceptions of alternative assessment strategies
- Examining the impact of alternative assessments on student retention, motivation, and engagement
Findings from this research will help higher education institutions design evidence-based assessment policies that foster student-centered, equitable, and meaningful learning experiences in online courses.
2. Literature Review & Theoretical Framework
Research suggests that alternative assessments support deeper learning and skill development (Boud & Falchikov, 2006; Nicol, 2009). However, there is a lack of comparative studies in online higher education.
Many traditional assessment models focus on evaluating content recall, whereas alternative assessments are often designed to measure higher-order thinking, problem-solving, and real-world applications.
One key framework that informs this study is Wiggins and McTighe’s (2005) Backward Design model. This approach prioritizes learning outcomes first, followed by developing assessments that directly measure those outcomes. In online higher education, backward design can be particularly effective in aligning alternative assessments with desired competencies, ensuring that students engage in meaningful learning experiences rather than rote memorization.
This study will also build on:
- Constructivist Learning Theory, which emphasizes active, student-centered learning. Alternative assessments align with this theory by fostering authentic application of knowledge.
- Self-Determination Theory (Deci & Ryan, 1985), which examines how different assessments impact student motivation in higher education, particularly how they enhance autonomy, competence, and engagement.
By integrating Backward Design with constructivist and motivational theories, this research will evaluate how well alternative assessment strategies support deep learning and engagement in online higher education settings.
3. Research Methods
Study Design & Data Collection
This study will utilize a mixed-methods approach, integrating quantitative analysis of student performance data with qualitative insights from surveys and interviews to evaluate the effectiveness of alternative assessments in online higher education.
Phase 1: Quantitative Analysis (Student Performance & Retention)
- Comparison of Learning Outcomes: Student performance will be analyzed across courses that use alternative assessments (e.g., project-based, portfolio, competency-based assessments) versus those that rely on traditional exams.
- Key Metrics: Student exam scores, project evaluations, retention rates, and engagement levels will be collected and analyzed.
- Data Sources: Data will be gathered from three higher education institutions offering fully online degree programs to ensure variability in assessment methods.
Phase 2: Qualitative Analysis (Student & Instructor Perspectives)
- Student Surveys (n=200): Students enrolled in online courses will complete surveys examining perceptions of assessment fairness, effectiveness, and impact on motivation and engagement.
- Instructor Interviews (n=20): Faculty members who design and implement online course assessments will participate in semi-structured interviews to explore the benefits, challenges, and institutional support needed for alternative assessments.
Sample & Participants
To ensure broad applicability, the study will include participants from diverse academic disciplines and institutions with varying levels of assessment innovation.
- Higher education students (undergraduate and graduate) enrolled in fully online programs.
- Faculty and instructors responsible for designing and implementing assessments in online courses.
- Participating institutions will be selected based on their use of diverse assessment strategies and their commitment to innovation in online learning.
Data Analysis
A structured approach will be used to analyze both quantitative and qualitative data, ensuring a comprehensive understanding of alternative assessments in online higher education.
Quantitative Data Analysis (Student Performance & Surveys)
- Descriptive Statistics: Student grades, retention rates, and project evaluations will be analyzed using means, percentages, and standard deviations to assess trends in student performance.
- Comparative Analysis: T-tests or chi-square tests will be applied to determine significant differences between students assessed through alternative assessments versus traditional exams.
- Survey Data: Likert-scale survey responses will be analyzed using frequency distributions and cross-tabulations to identify trends in student engagement, motivation, and perceptions of assessment fairness.
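The comparative tests described above can be sketched in a few lines of Python. This is a minimal illustration on synthetic data: the score arrays and retention counts below are invented for demonstration and are not study results.

```python
# Sketch of the planned comparative analysis, run on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical final-course scores (0-100) for the two assessment conditions.
alt_scores = rng.normal(82, 8, size=120)    # alternative assessments
trad_scores = rng.normal(78, 10, size=120)  # traditional exams

# Welch's t-test: do mean scores differ between conditions?
t_stat, p_value = stats.ttest_ind(alt_scores, trad_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Chi-square test of independence on retention counts per condition.
retention = np.array([[110, 10],    # alternative: retained, withdrawn
                      [ 95, 25]])   # traditional: retained, withdrawn
chi2, p_ret, dof, _ = stats.chi2_contingency(retention)
print(f"chi2 = {chi2:.2f}, p = {p_ret:.4f}")
```

In practice, the choice between the t-test and a nonparametric alternative would depend on checking the distribution of the real performance data first.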
Qualitative Data Analysis (Instructor Interviews & Open-Ended Survey Responses)
- Thematic Analysis: Responses from instructor interviews and open-ended student survey questions will be coded and categorized using a thematic approach (Braun & Clarke, 2006).
- Coding Process: An inductive coding strategy will be used to allow themes to emerge naturally from participant responses.
- Inter-Rater Reliability: To ensure consistency, a secondary researcher will review coded themes, enhancing the credibility and reliability of findings.
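The inter-rater reliability check can be quantified with Cohen's kappa, which corrects raw agreement between two coders for chance. Below is a pure-Python sketch; the theme labels are hypothetical placeholders, not codes from the study.

```python
# Cohen's kappa for two coders assigning theme labels to the same responses.
from collections import Counter

def cohens_kappa(a, b):
    """Agreement between two raters over the same items, corrected for chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                   # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[l] * cb[l] for l in set(a) | set(b)) / (n * n)   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical theme codes from two independent coders.
coder_a = ["fairness", "workload", "fairness", "motivation", "workload",
           "fairness", "motivation", "motivation", "workload", "fairness"]
coder_b = ["fairness", "workload", "motivation", "motivation", "workload",
           "fairness", "motivation", "fairness", "workload", "fairness"]

kappa = cohens_kappa(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")
```

A common rule of thumb (Landis & Koch) treats values above roughly 0.6 as substantial agreement; lower values would trigger a round of codebook refinement and recoding.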
Integration of Findings (Mixed-Methods Approach)
By triangulating quantitative and qualitative findings, the study will develop evidence-based recommendations for higher education institutions on best practices for alternative assessment strategies in online learning.
4. Expected Outcomes & Impact
This study will provide higher education leaders, instructors, and policymakers with actionable insights into the effectiveness of alternative assessment strategies in online learning.
Anticipated outcomes include:
- Empirical evidence on how alternative assessments impact student learning outcomes in online higher education
- Improved understanding of student and instructor perceptions of assessment strategies in online courses
- Recommendations for designing and implementing effective assessment policies in online higher education
Results will be disseminated through academic journal publications, conference presentations, and workshops for online education professionals.
5. Budget & Justification

| Expense | Amount ($) | Justification |
|---|---|---|
| PI Salary (Course Release) | 12,000 | PI's time for research, data collection, and analysis |
| Graduate Research Assistant | 8,000 | Support for data collection, survey administration, and coding |
| Participant Stipends | 3,000 | Incentives for student survey participation |
| Transcription Services | 2,000 | Transcription of instructor interviews |
| Software (SPSS, Qualtrics) | 2,000 | Data analysis and survey tools |
| Conference Travel & Dissemination | 3,000 | Presentation of findings at an educational research conference |
| Total | 30,000 | |

6. Project Timeline (12 months)

| Phase | Activities | Duration |
|---|---|---|
| Months 1-2 | IRB approval, recruitment of institutions & participants | 2 months |
| Months 3-6 | Data collection (quantitative and qualitative) | 4 months |
| Months 7-8 | Data analysis & interpretation | 2 months |
| Months 9-10 | Drafting findings, preparing journal submission | 2 months |
| Months 11-12 | Final dissemination, conference presentation | 2 months |

7. Project Team

- Principal Investigator (PI): Donna Davis – Leading research design, data collection, and dissemination
- Graduate Research Assistant: Supports data collection, survey administration, and qualitative analysis
- Institutional Partners: Higher education faculty participating in interviews

Conclusion

This study will address a critical gap in online higher education assessment by evaluating alternative strategies for measuring student success, engagement, and retention. Findings will inform policy and instructional design, ensuring that assessment methods evolve to meet the needs of diverse learners in digital higher education environments.