Dramatically improving test scores have been held up as proof of the leap in standards in our primary schools. But a new study tells a very different story. Nick Morrison reports.

BY any reckoning, it is mightily impressive. The statistics showed that primary school children had made clear and significant advances in both English and maths in the space of just a few years.

Here was unambiguous proof that all the policies and initiatives of successive governments, not to mention the extra resources put into education, were more effective than even their originators could ever have hoped.

In English, the proportion of 11-year-olds reaching level four rose from 48 per cent in 1995 to a remarkable 75 per cent five years later. Equally striking was the improvement in maths, from 44 per cent at level four in 1995 to 72 per cent in 2000.

This was the answer to all the doubters who questioned whether the money was being spent wisely, and in the Prime Minister's favourite arena as well, enabling the Government to claim at the last general election that there had been startling improvements in primary education.

According to one of Tony Blair's most senior advisers, this rise in standards demonstrated that large-scale reform could be achieved quickly; to others, it was evidence of the success of the National Literacy Strategy.

Or was it?

Not according to one of the country's leading experts on school testing.

By analysing a number of long-term studies, Professor Peter Tymms, of Durham University, has concluded that the rise in standards is far smaller than that claimed by the Government.

Furthermore, the rise in standards was a self-fulfilling prophecy: the agency that trumpeted the increase, the Qualifications and Curriculum Authority (QCA), was itself deciding by how much standards appeared to rise.

PROFESSOR Tymms, director of the Curriculum, Evaluation and Management Centre at Durham, says he started to question the accuracy of the Key Stage Two SATs results, taken at the end of primary school, when they differed markedly from the centre's own informal test results.

"We run unofficial monitoring projects for schools and we noticed that when these SATS results were rising, our results were not rising.

"We also noticed that some of the secondary schools who were getting new children in from their primary schools, were saying these kids were supposed to be getting better results, but we're not seeing that. These two things made us worried," he says.

The pattern of the increases in maths and English standards also aroused suspicion. The percentage of pupils achieving level four or above showed a steep rise every year from 1995 to 2000 in both subjects, with the exception of a drop in maths in 1998, attributed at the time to the introduction of an oral test.

After 2000, both maths and English scores levelled out. In nine years of tests, standards in the two subjects have tracked each other almost exactly: they rose at the same rate and they levelled out at the same time.

Conventional wisdom is that it is easier for a school to influence results in maths than in English, where standards depend much more on what happens in the home. Therefore, it might have been expected that test scores in maths would increase at a higher rate.

"It is surprising to see the maths and English lines running so parallel to one another. Surely one would be expected to rise more quickly than the other or one to flatten off earlier or for some other pattern to appear," Prof Tymms says.

Prof Tymms then gathered a range of independent studies which had also tested pupils' ability at age 11, to see whether they showed a corresponding increase in standards.

"They all tell the same story: practically no change in reading, contrary to what the official statutory test data say, and in maths standards have risen a little bit," he says.

HE says the independent tests show that, far from rising from 48 per cent in 1995 to 75 per cent in 2000, the proportion reaching level four in English rose to an estimated 58 per cent. In maths, instead of rising from 44 per cent to 72 per cent, it rose to an estimated 64 per cent.

These conclusions are backed up by the findings from a study commissioned by the QCA itself.

The QCA asked Alf Massey, of the University of Cambridge Local Examinations Syndicate, to look at the Key Stage tests, amid concern that standards were being watered down to produce higher marks.

His findings on the KS2 maths and English tests show that the rise in achievement in English was largely the result of a fall in the standard of the test.

Pupils in Northern Ireland who were given the KS2 tests for 1996 and 1999 were reported to have found the later test easier. Massey's conclusion was that the rise in standards in English was "illusory". This study was handed to the QCA in January 2002 but was not published for almost two years, until November last year.

Prof Tymms says the inescapable conclusion is that the tests have become easier. Because a new test was set every year, he says, the QCA was unable to maintain a baseline standard against which any change in achievement could be measured.

He rejects any suggestion of a conspiracy, but says that because it decides what mark counts as a pass, the QCA has become vulnerable to political pressure to show that the extra money and initiatives being put into education are having an effect.

But when it became clear that the dramatic rises were not borne out in reality, the tests were tightened, leading to the levelling off in results after 2000.

"The QCA has to decide what mark is going to count as a pass, and between 1995 and 2000 they were not setting that pass mark properly.

"They thought the standards were rising, but they were deciding how much they rose by. They should set the mark without knowing how many are going to get it," says Prof Tymms.

"I don't think it was deliberate; it happened because they didn't have good enough quality control. But it was predictable, and it has happened elsewhere."

The experience of school tests in Texas followed exactly the same pattern. New tests were introduced which students had to pass to graduate from high school. The tests were also linked to evaluations of teachers and principals.

The test scores rose dramatically in a trend dubbed the "Texas miracle", until a federal assessment body stepped in and found that the miracle was an illusion. The increase in standards had been much more modest, and no larger than the national average, with the exception of a small rise in maths.

PROF Tymms says the solution is to set up an independent body to monitor how standards change. This body would give a small sample of pupils the same test every year, so that they were always measured against the same standard.

The test should be carried out in secret, without teachers' involvement, so that pupils cannot be coached in test technique. It would therefore be a purer gauge of ability, rather than of how well children had been prepared.
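What might such an anchor test look like in practice? A minimal sketch, with hypothetical figures throughout: the identical, unpublished test is given to a small random sample of pupils each year, so any movement in the sample mean reflects a genuine change in standards rather than a change in the test.

```python
import random
import statistics

random.seed(2)

def anchor_test(cohort_mean, sample_size=250):
    """Give the identical, unpublished test to a fresh random sample."""
    return [random.gauss(cohort_mean, 10.0) for _ in range(sample_size)]

# Hypothetical underlying ability: a genuine but modest improvement.
cohort_means = {1995: 50.0, 1996: 50.4, 1997: 50.9,
                1998: 51.2, 1999: 51.6, 2000: 52.0}

baseline = None
for year, true_mean in cohort_means.items():
    observed = statistics.mean(anchor_test(true_mean))
    if baseline is None:
        baseline = observed   # 1995 sets the fixed point of comparison
    print(year,
          f"anchor-test mean: {observed:.1f}",
          f"change since 1995: {observed - baseline:+.1f}")
```

Because the paper itself never changes, easier questions cannot inflate the trend; the only uncertainty left is ordinary sampling error, which shrinks as the sample grows.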

Existing tests could still be used to compare schools, but not to support any claims of a rise or fall in standards.

"It is in everybody's interests to say standards are getting better, but you don't learn as a society. If you have put millions of pounds into education and it is not changing, we need to know that," says Prof Tymms.

"Labour has trumpeted the rise in standards, but to admit all the initiatives haven't worked would be like admitting there are no weapons of mass destruction.

"We don't have good scientific evidence for the impact of government initiatives that would help us to know what works. We need to know which of these policies work to know if we are doing the right thing. They put in policies and they don't work, so what do they do? Do they say we made a mistake? No, they say of course they work. They should come out and say they made a mistake."