By Sonia Sodha
An assumption in the UK Government’s higher education reforms is that students will behave as savvy consumers, making informed choices that will drive up quality and value for money in the market. However, our research shows that, as in other markets, students are not the rational decision-makers that economic theory suggests. Two years into the higher fee regime, we are already seeing signs of a problem: a third of students say that they might have made a different choice had they known then what they know now, and evidence highlights concerns about the quality of the academic offer in some places. The Government’s announcement in the Autumn Statement that it will lift the cap on student numbers by 2015 may deliver more choice. But without the necessary safeguards in place, there is a danger that quality could suffer. In this article we set out the problem and some of the solutions that we will be exploring further over the coming months.
Is the higher education market delivering value for money?
The Government’s perspective
When the Government introduced the higher fee regime, it expected fees over £6,000 to be the exception rather than the rule. It also expected the market to become more responsive to student demands, which would in turn increase the quality of the academic offer and produce more innovation in the types of degree on offer (for example, two-year degrees).
So far, there is little evidence that this is the case. Fees average £8,500 this year and, in terms of the academic experience, a Which?/HEPI survey of 26,000 students identified considerable variation in what students receive, raising questions about value for money and academic quality. For example:
- Size of teaching group: students on average received four hours per week in groups of fewer than 15, but some students received only 30 minutes of teaching time per week in groups of this size.
- Who is doing the teaching: some students received just half of all their teaching from an academic member of staff rather than a research assistant or PhD student, whereas for others this figure was as high as 97 per cent.
- Feedback: while three in ten students said that they normally received feedback in person, a significant minority (15 per cent) said they normally received a grade with no written feedback. In some institutions this rose to 38 per cent of students.
- Assignments: whereas the average student was set five assignments to be marked in the last term, in some cases students were set only one. 53 per cent were told to do mandatory work during the holidays, whereas 43 per cent were not.
- Total workloads: while total workloads could exceed 50 hours a week at some universities, at others they were as low as 15 hours. Students worked, on average, 900 hours a year – 25 per cent fewer than the 1,200 hours assumed in official quality guidelines – and in some cases far fewer, with some students working as little as 460 hours per year.
Changes to the student academic experience since 2006
| Hours per week | 2006 | 2007 | 2012 | 2013 |
|---|---|---|---|---|
| Scheduled contact hours | 13.75 | 13.91 | 13.98 | 14.04 |
| Private study hours | 12.81 | 12.45 | 14.37 | 14.13 |
| Total workload hours (scheduled contact + private study) | 26.56 | 26.36 | 28.35 | 28.17 |
| Total study hours | N/A | N/A | 15.05 | 14.96 |
| Time spent in small teaching groups | – | – | – | – |
| Proportion of time in small groups with academic member of staff | – | – | – | – |
There is also no sign that universities have improved the quality of their offer during the period that fees have increased, with scheduled teaching time and size of groups remaining largely unaltered. There are particularly concerning trends in flexible provision: the number of part-time places has declined, and there is no sign yet of any innovation away from the traditional three-year model, including a lack of expansion of two-year accelerated degrees.
The student perspective
Judging by the very high satisfaction scores within higher education, it seems that, even in the context of higher fees, students are pretty happy with their lot. But dig beneath these numbers and there is some cause for concern. Two thirds of students in our survey said that their course failed to meet their expectations in some way, the main reasons being that the course was poorly organised, teaching hours were too few, they didn’t feel supported in their private study, or teaching groups were too large. One third of all students we surveyed said that they might have chosen a different course had they known then what they know now, and, among first-year students paying higher fees, one third thought that their course was poor value for money.
In some respects, concerns about value for money are not surprising given the higher fees students are paying, and given that, contrary to what students might assume, from a university’s perspective the extra fee income largely replaced government funding rather than flowing into the system as additional funds. However, even when fees were just £1,000, 16 per cent of students thought their course was poor value, and the HESA longitudinal survey of graduates finds that 14 per cent of those graduating in 2008/09 thought so. This means quite significant numbers of graduates might have made different choices with more information. In a survey we conducted of recent graduates, one quarter said that, in hindsight, they would have conducted more research into their institution or course, and nearly one fifth said that they would have given more consideration to options other than going to university.
What’s the problem?
An imperfect market
Part of the problem is that the higher education market is imperfect, and one of the biggest challenges is the complexity involved in understanding the quality and value of a degree. Price is not a good indicator of quality; people apply to university for a range of reasons, and the relative importance of these can change over time, which means that a degree’s true value may not be understood until later in life.
There is a lack of information on the quality of the academic experience, and it is not provided by the quality regulator, the QAA. The QAA regulates on the basis of process rather than teaching quality: its assessments of individual institutions reflect its confidence that the right processes are in place, rather than saying anything specific about the quality of the teaching.
This lack of information means that employment prospects and reputation are drawn on as proxies, but neither is without its problems. The former, while important, is arguably a better reflection of a student’s underlying ability than of the quality of the university, and reputation is often based on league table rankings, which tend to reflect research rather than teaching excellence.
The only information that exists in a comparable format on the teaching experience is the proportion of time a student can expect to spend in private study versus teaching, and the balance of assessment between coursework and exams. But without the actual numbers of hours or essays, it is difficult for students to see what this will mean in practice.
Universities are often not providing this information themselves – and even if they did, it would take a very engaged student to review numerous prospectuses. A review that we conducted of twenty institutions’ websites and prospectuses, looking in particular at information provided about English courses, found that:
- Only two provided comprehensive information on the total number of contact hours per week, but even then this was not broken down into lectures and tutorials.
- Only two of the twenty gave an indication of the amount of private study that was required.
- Six out of twenty gave an idea of the size of the seminar/tutorial class.
No single institution provided information on all of these things. This chimes with what students tell us about information provided by institutions. A fifth of students (21 per cent) thought that information provided by universities was vague and one in 10 (9 per cent) thought it was misleading.
So the decision-making process is complex, making it more difficult to make an informed choice. And yet the wider features and limitations of the market make getting the choice right all the more important: students generally cannot switch easily without losing their credits, and demand at present outstrips supply.
It was in recognition of this that we launched Which? University in September 2012, to support students in making the right choice for them. It draws on official data as well as a survey of 17,000 students to get real-life views on what it is actually like to be a student at each institution. The site has received nearly three million visits since then. But we recognise that supporting student choice is no easy feat – a degree is a complex experience and qualification – and we are continuously exploring how we can support students in this process.
Students are not savvy
The system also relies on students being ‘savvy’ users who have clear preferences and use the existing information to ensure that they are making the best possible choice. But as we know from our experience of supporting people to make decisions in other markets, this isn’t the case. When we asked prospective students what factors they researched when making their choice, just 38 per cent said that they had considered the employment performance of students who had taken that course, despite employment being the main driver for going to university, and around three in ten had considered factors related to the academic experience, such as who was doing the teaching or the learning and assessment style. 23 per cent hadn’t been to an open day.
This is less surprising when you consider that many had not received any advice at the time of making their choice: only one third had received one-to-one advice from a careers adviser. When we conducted qualitative research with students to understand why they had not considered the academic experience of the university when making their choice, many said that they didn’t know there would be variations, and they were not always sure what their preferences were in any case. They needed advice and support to work this through.
What needs to change?
Part of the problem is the lack of information and advice available to young people to help them make informed choices. Advice is critical but under threat, given the £200 million cut in careers funding and the transfer of this function to schools. On information, while pinning down the right indicators of quality is challenging, the tendency for the debate to get bogged down in whether outcomes matter more than inputs, and what these might be, has resulted in little useful information being provided at all. And the current information on employment earnings, which may not indicate the quality of the university but is still useful, is limited to earnings six months post-graduation. We think this needs to change.
But information alone will not result in the wider changes in the market that both the Government and Which? would like to see. This means looking again at the way the sector is regulated, how universities collect and respond to feedback and complaints, and how switching – which still remains very difficult – can be made easier. Which? will be exploring these in the coming months.
Sonia Sodha is Head of Public Services and Consumer Rights at Which?
Which? University is a free website designed to help students make more informed decisions about their higher education choices (http://university.which.co.uk).
- Youthsight, on behalf of Which?, surveyed 17,090 full-time undergraduate students in their first, second, third and fourth years at UK institutions. The fieldwork took place between 26 February and 21 March 2013. Combining this year’s results with last year’s gave a sample of 26,000 students.
- Which?/Hepi, The Student Academic Experience Survey, May 2013.
- Youthsight, on behalf of Which?, surveyed 1,003 UK applicants intending to start university in September 2014, online between 7 and 24 February 2014. Data were weighted to be representative of the applicant demographic.