MOT Workshop 45 – MOT Testing Quality

In Britain, over 20,000 MOT Testing Stations and 50,000 Testers carry out over 28 million MOT Tests every year – but a surprisingly high proportion of those Tests are not done properly! Why is that? Surely VOSA (now DVSA) could do more to help Testers get it right? Well, MOT quality isn’t as straightforward as you might think.
This updated article originally appeared in MOT Testing Magazine, written by Jim Punter.

MOT quality deteriorating?

Perhaps MOT quality should focus on whether or not Testers get it right on road-safety sensitive inspections, like the brake performance check seen here.

Every year, DVSA select at random about two thousand Testing Stations which their VEs then visit to re-examine recently MOT Tested vehicles. The results are then compiled by DVSA into a comprehensive report called the ‘MOT Compliance Survey’ (MCS). Before 2010/11, however, not all of the information was available – MCS reports were censored.

Then, in February 2012, Justine Greening, then Secretary of State for Transport, published the full, uncensored MCS report for 2010/11, and criticised the 12% pass/fail error rate, saying “…there is still room for improvement…”

Yet we had to put in a ‘Freedom of Information’ request to get the full 2011/12 report, and found that despite Greening moaning about the 12% error rate in 2010/11, in 2011/12 it was worse, at 14.6%. Although for 2012/13 it had improved to 12.9%, that’s still higher than the 2010/11 rate Greening had complained about…

Right result, wrong reason – the real error rate!

But the ‘headline’ MOT error rate only measures whether Testers get the right answer – ‘pass or fail’. It doesn’t show how many Tests with correct ‘fail’ outcomes were still not done properly because the VE disagreed with the Tester’s findings. Maybe the Tester failed a number plate, but missed a ‘dangerous to drive’ defect. Or a failure with one defect ‘picked up’ by the Tester is judged by the VE to have further defects. In both cases, once the cited defect is fixed, there’s a car on the roads with a pass it shouldn’t have!

DVSA’s focus on whether or not Testers made the correct pass/fail MOT decision puts a false ‘gloss’ on MOT quality and can disguise dangers to road safety. ‘Defects disagreed’ by VEs were, as Greening noted for 2010/11, 27.7% of failures – so what about 2012/13? Well, it’s gone up: it’s now 28.5%.

Clearly, when it comes to deciding what is a ‘good’ MOT, DVSA’s focus on the correct pass/fail result is misleading and understates the real error rate. On the latest figures, almost a third of all MOTs aren’t done exactly correctly. But is it really that bad? In fact that’s not the whole story. Some errors were relatively trivial – as evidenced by the disciplinary outcomes from those ‘compliance’ re-examinations – so perhaps it’s not as bad as it seems.

Disciplinary discrepancy

When VEs conduct ‘Compliance’ checks on recently Tested vehicles, they’re still DVSA’s ‘policemen’, and take disciplinary action if there’s a problem.

So, how serious were Testers’ errors? Actually, there were fewer disciplinary outcomes than would be expected. The 2012/13 report says, “Not all test errors result in formal disciplinary action…” – the disciplinary points awarded often didn’t exceed the threshold for formal action. All told, in 2012/13 there were 214 disciplinary outcomes – 9.9% of vehicles re-checked. Of these, few led to formal disciplinary action; most resulted only in an advisory warning letter. The majority of issues identified by VEs where they disagreed with the Tester’s decision were, it seems, relatively trivial.

Dangerous to drive

In 2012/13, MOT Testers discovered approximately 750,000 vehicles (Class 4 & 7) with defects they believed rendered a vehicle ‘dangerous to drive’. That’s about 2.7% of all those vehicles MOT Tested. During DVSA’s Compliance surveys, Vehicle Examiners don’t look for ‘dangerous to drive’ defects, but do assess a similar criterion – whether or not a vehicle attracts an immediate ‘prohibition’ notice. For 2012/13, Vehicle Examiners found that 5% of the vehicles they re-checked attracted such a notice. Extrapolated to MOTs generally, that’s over half a million more than the number of dangerous to drive vehicles discovered by Testers.

Yet that was something of an improvement on 2011/12, where the compliance survey for that year showed the number of vehicles VEs found meriting an ‘immediate prohibition’ had more than doubled, from 2.5% in 2010/11 to 6.2% – an astonishing 1.736 million vehicles over the year, 4,756 every day, and over twice the number of vehicles tagged as ‘dangerous to drive’ by Testers in 2010/11. Evidently that difference has since reduced somewhat, but it does highlight a serious discrepancy between the defects Testers consider ‘dangerous to drive’ and the defects VEs believe merit an immediate Prohibition – with Testers being the more lenient!
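The extrapolation above is straightforward arithmetic, and can be sanity-checked. This is a rough sketch only, assuming the article’s figure of roughly 28 million Tests a year:

```python
# Sanity check of the 'immediate prohibition' extrapolation.
# Assumption: roughly 28 million MOT Tests carried out per year,
# and the 6.2% re-check prohibition rate reported for 2011/12.
tests_per_year = 28_000_000
prohibition_rate_2011_12 = 0.062

vehicles_per_year = tests_per_year * prohibition_rate_2011_12
vehicles_per_day = vehicles_per_year / 365

print(f"{vehicles_per_year:,.0f} vehicles per year")  # 1,736,000
print(f"{vehicles_per_day:,.0f} vehicles per day")    # 4,756
```

Which reproduces the 1.736 million per year and 4,756 per day quoted in the survey discussion.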

This is contentious. Motorists should have consistency between VE and Tester judgment when it comes to such dangerous defects. In fact, if the EU changes go through in their current form, there will be a specific definition of ‘dangerous to drive’ defects for Testers to follow.

Under–stated MOT failure rate

Over recent years, the national MOT failure rate has hovered around the 40% mark – surprisingly high for an annual vehicle testing regime, compared with other EU countries with a two-yearly MOT. But there’s a really interesting table in the 2011/12 report giving data from 2006/07 to 2011/12 on the VEs’ failure rate, as compared to that of Testers. As the vehicles randomly selected were considered a statistically significant sample, the VE pass/fail rate should mimic what the national failure rate ought to be.

The historical results show that over the period covered, Testers always passed more vehicles than they should. That means the MOT failure rate shown in DVSA’s published statistics is always an underestimate – but by how much?

In 2010/11, the VE failure rate was 53.2%, dropping to 49.4% in 2011/12, and to 48.5% for 2012/13 – which means that if every one of the 28 million MOT Tests carried out annually were done correctly, about half of the cars MOT Tested would fail. That would have been an interesting statistic with which to confront the DfT during the two-yearly MOT testing ‘crisis’!
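The scale of that understatement can be roughly quantified. A back-of-envelope sketch, assuming the ~40% published national failure rate and the ~28 million Tests a year quoted elsewhere in the article:

```python
# Rough estimate of how many failures the published statistics may miss.
# Assumptions: ~40% published national failure rate, ~28 million Tests/year,
# treating the 2012/13 VE re-check failure rate (48.5%) as the 'true' rate.
tests_per_year = 28_000_000
published_fail_rate = 0.40
ve_fail_rate_2012_13 = 0.485

missed_failures = tests_per_year * (ve_fail_rate_2012_13 - published_fail_rate)
print(f"~{missed_failures:,.0f} passes a year that arguably should have been failures")
# ~2,380,000
```

On those assumptions, well over two million vehicles a year would receive a pass they should not have – though this is an illustration of the gap, not a figure from the Compliance Survey itself.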

MOT quality – the Government’s responsibility…

Percentage of Tests with the wrong pass/fail result

Year    2006/7   2007/8   2008/9   2009/10  2010/11  2011/12  2012/13
Error   14.5%    15.68%   15.42%   17.7%    12.4%    14.6%    12.9%
The MOT pass/fail error rate has hardly changed since 2006/7 (the recent drop to 12.9% was noted by DVSA as not being ‘statistically significant’); maybe a re-think of ‘risk assessment’ is needed.

Strangely, in her 2012 statement, the then Secretary of State, Justine Greening, complained about the MOT error rate as if it were nothing to do with the Department for Transport – yet the quality of MOT Testing is entirely the Department for Transport’s responsibility: they control DVSA’s strategic direction. Of course, poor Testing reflects badly on the MOT Trade, but in the final analysis, how DVSA run the MOT Scheme, and the resources they put into controlling it, will have the greatest effect on MOT quality – how many checks they make, how many Vehicle Examiners they employ, the refresher training they require, and so on.

The method DVSA have adopted to maintain consistently good MOT quality is their ‘risk assessment’ system, with its red/amber/green ‘league table’ of VTSs. Unfortunately, this does not seem to be entirely effective – if it were, the headline (pass/fail) error rate would be reducing over time, whereas it has remained relatively unchanged, from 14.5% in 2006/7 to 12.9% in 2012/13. So why is that? Ultimately, it is about resources.

DVSA allocate about 126 ‘man-years’ of VEs’ time to MOT Testing each year, while the number of Testing Stations and Testers has significantly increased. These days most VE activity is directed to the ‘risk assessment’ process – visiting amber and red Testing Stations, or making those 1,800 ‘compliance’ visits to VTSs to randomly re-check a recently Tested vehicle. Arguably, just 126 VEs ensuring the quality of 28 million MOTs every year, carried out by over 50,000 Testers at about 22,000 Testing Stations, is simply not enough – so something has to give. And that ‘something’ is any kind of check on, or support to, the 12,000 or so ‘green’ Testing Stations, who are unlikely to see a DVSA Vehicle Examiner between their three-yearly ‘risk assessment’ visits.

There’s a simple truth here specifically stated in some of DVSA’s Compliance reports:

“…There was no significant difference between the risk rating and test error…”

This, in Top Gear parlance, is a ‘bombshell’. It unequivocally means that all those VE resources used by DVSA in their ‘risk assessment’ process have failed to properly identify and ‘grade’ Testing Stations into appropriate red, amber and green bands. The system just doesn’t seem to work!

Improving MOT quality

So what can be done to improve MOT quality on a national basis? Well, surely the first step is to decide what a ‘good’ MOT is. The huge contrast in the MOT Compliance data – over 25% of MOTs being done incorrectly, yet only 11% of such errors attracting disciplinary action (most being relatively minor) – suggests that such a ‘zero tolerance’ approach to measuring MOT Testing errors is not very helpful; aside, perhaps, from measuring it from a ‘customer service’, value-for-money perspective – but with some Dealership MOTs offered for free, that’s not too helpful either.

Yet DVSA’s (previously VOSA’s) ‘headline’ pass/fail error rate of between about 12.4% and 17.7% over the years hides Testers failing to get it right when it comes to issuing failure certificates. So the first thing that’s needed is, perhaps, a more sensible measure of MOT quality – one which relates Testers’ error rates to the threat to road safety posed by the Testing mistakes.

Passes that should have failed, for example, are only of key importance if the undetected defect was seriously ‘road safety sensitive’ – especially in the light of DVSA’s ‘pass and advise’ policy, which will inevitably lead to fewer MOT failures. The same question, of course, should apply to either citing an incorrect defect, or failing to spot a defect, on an MOT failure – was it safety sensitive?

Once MOT quality is better defined against road safety criteria, refresher training could be targeted to provide an appropriate improvement. Better training is the key – not just more of it, but a regimen specifically pitched at experienced Testers whose mistakes are due not to lack of knowledge, but simply to complacency developed over time. Such training should be keyed into the sort of errors picked up in the Compliance checks. And, of course, more VEs and resources should be aimed at improving quality.

MOT modernisation to 2015 and beyond…

Of course, much of this is already planned into the new ‘online’ MOT computer system for 2015 and beyond. Training will, however, be largely desk-based and online and, as we understand it, also linked into the risk assessment system. But before that happens, the ‘risk assessment’ scheme should either be abandoned because it doesn’t work, or be better designed and developed so that it does.

Performance reports

Performance reports may highlight an issue, but unless VOSA are prepared to assist conscientious AEs who ask for help, they might as well not be available!

Another issue is that of the NTs’ ‘Performance reports’, which AEs are supposed to scrutinise and act upon if they find anything untoward. Our editor does that, and found his Quality Controller has a significantly higher failure rate than the average. But is that a problem? Well, maybe not – DVSA’s ‘compliance reports’ show most Testers should fail more cars. Trying to sort it out, he asked for a VE to visit and assist. The answer? “No!” Another Tester does both cars and motorcycles – but in that Tester’s ‘Performance report’ the results are merged, so he’s got a lower than average failure rate (bike failure rates are lower than cars’). What’s he supposed to glean from that?

MOT Testing quality should, and could, be better. It’s the Government’s scheme and they control it – isn’t it about time they did something more about it?
