Qualitative Analysis of RMIT Course Experience Survey Data | Rakesh Patibanda


The Problem
RMIT had large volumes of end-of-semester student survey data but no systematic qualitative analysis: quantitative scores were tracked, but the 'why' behind them wasn't being surfaced, so interventions were generic rather than actionable.
My Role
Lead Researcher – designed the coding framework, conducted thematic analysis across multiple courses, synthesised findings into actionable recommendations.
Outcome
Delivered a comprehensive report that informed strategic decisions about course content, assessment methods, and teaching quality across the RMIT Education Portfolio.
Organisation
RMIT University (Education Portfolio)
My Role
Lead Qualitative Researcher
Team
Education Portfolio stakeholders, course coordinators, faculty
Timeline
2018–2020
Key Constraints
Large data volume; needed to maintain rigour while producing actionable insights on a practical timeline; findings needed to be credible to academic stakeholders.

The Problem

The Course Experience Survey (CES) scores told RMIT that students were dissatisfied with assessment methods in certain courses – but not why, and not what to change. Qualitative analysis of open-response data is labour-intensive, which is why it was being skipped. The result was that interventions were generic ('improve feedback') when they needed to be specific ('engineering students receive written feedback too late to act on before the next assessment').

The case for systematic qualitative analysis was that it transforms 'satisfaction scores' into 'design briefs.'


My Approach

I categorised the collected data into themes (course content, teaching quality, assessment methods, learning resources), then assigned codes using a mix of predetermined and emergent coding — a standard qualitative research approach, but applied systematically across a large dataset.

I conducted a deep analysis of coded data examining recurring themes, emerging patterns, and notable outliers. Rather than treating all feedback equally, I weighted findings by frequency and impact — surfacing the issues affecting the most students most severely.
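The frequency-and-impact weighting can be sketched in a few lines of Python. The theme names and the 1–3 severity scale below are illustrative placeholders, not the actual CES coding scheme:

```python
from collections import Counter

# Hypothetical coded responses: (theme, severity) pairs, where severity
# is an analyst-assigned rating from 1 (minor) to 3 (severe).
coded = [
    ("assessment_timing", 3), ("assessment_timing", 3),
    ("teaching_quality", 1), ("learning_resources", 2),
    ("assessment_timing", 2), ("teaching_quality", 2),
]

def rank_themes(coded):
    """Rank themes by frequency x mean severity (i.e. total severity)."""
    freq = Counter(theme for theme, _ in coded)
    severity_sum = Counter()
    for theme, sev in coded:
        severity_sum[theme] += sev
    scores = {t: freq[t] * (severity_sum[t] / freq[t]) for t in freq}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_themes(coded))
# assessment_timing ranks first: it is both the most frequent and the most severe
```

The point of the sketch is the prioritisation logic, not the arithmetic: an issue mentioned often and rated severe outranks one that is merely frequent.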


The Work

Delivered a comprehensive analysis report covering multiple courses and faculties. Report structure: executive summary for leadership (key findings in 5 minutes), detailed thematic analysis for coordinators, and specific actionable recommendations mapped to each finding. Presented findings to course coordinators, faculty members, and university administration.


Evidence of Impact

Provided RMIT's Education Portfolio with a clearer understanding of student experiences, enabling targeted interventions. Findings informed several strategic decisions including changes to assessment timing, teaching methods, and learning resource formats across the portfolio.

📄 Download CES Analysis Report

What I'd Do Differently

I'd automate the initial coding pass using NLP tooling and spend the human analysis time on interpretation and edge cases – first-pass coding of 100+ survey responses by hand was the least defensible use of researcher time. I'd also build the reporting template upfront so findings map directly onto decision points rather than being reformatted after analysis is complete.
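A first-pass automation could be as simple as keyword matching that assigns provisional codes and flags everything else for human review. The keywords and code names here are illustrative assumptions, not the framework I actually used:

```python
# Provisional keyword-to-code mapping; ambiguous responses fall through
# to human review rather than being force-coded.
KEYWORDS = {
    "assessment": "assessment_methods",
    "feedback": "assessment_methods",
    "lecture": "teaching_quality",
    "slides": "learning_resources",
}

def precode(response: str) -> list[str]:
    """Return provisional codes for one open-text survey response."""
    text = response.lower()
    codes = {code for keyword, code in KEYWORDS.items() if keyword in text}
    return sorted(codes) or ["needs_human_review"]

print(precode("Feedback on assessments arrives too late"))
# ['assessment_methods']
```

Even this crude pass would triage the bulk of responses, reserving researcher time for the interpretive work that actually needs it.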

Skills & Methods

Qualitative Research · Thematic Analysis · Data Coding · Research Reporting · Stakeholder Communication · Educational Research