For years we have been handed excuses by schools for failing to teach their children basic literacy: ‘We’ve got lots of boys in the class’. ‘Many of the children have summer birthdays’. ‘A majority of our children are from very poor backgrounds’. ‘Lots of our children don’t speak English as a first language’. Time and again the excuses have been trotted out until they have become clichés, as if by their very invocation the school or the class teacher is absolved of the responsibility for ensuring that all their children become literate.
Now, the teachers at St George’s Church of England Primary School have shown conclusively just how wrong the jeremiahs are, along with all those who have never understood how the sounds of the English language relate to the writing system!
Ten years ago, Sounds-Write published evidence of what could be achieved over the first three years of schooling in a school (St Thomas Aquinas) that by no means lies in some affluent, leafy suburb. Today, we are presenting you with further confirmation that high-quality phonics teaching works. What’s more, it works for all children, with the possible exception of a tiny percentage of children with speech and language difficulties.
St George’s is a school in one of the most deprived areas of London, where well over half the children are entitled to free school meals, and where the language spoken at home by many children is not English. The evidence provided by the school speaks for itself.
The table below presents the results of a spelling test taken by all children at St George’s Church of England Primary School at the beginning of Y2 in September 2015. Apart from one pupil who had joined the school that September, there were twenty-nine children in the class at the time of the test, all of whom had been taught using Sounds-Write in YR and Y1. The same twenty-nine children (100%) had also passed the Phonics Screening Check the previous summer. The class comprised twenty boys and nine girls.
It is immediately obvious that not a single one of the children from this very poor neighbourhood scored a spelling age below their chronological age (CA). In fact, apart from two pupils who were five months and eight months ahead of their CA respectively, all the remaining twenty-seven children were ahead by double figures – eighteen of them by more than eighteen months. Two, aged 6 years and 3 months and 6 years and 8 months respectively, hit the ceiling (11 years) on this test.
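For readers who want to check the arithmetic, the ‘months ahead’ figures are simply the difference between a pupil’s spelling age and chronological age, with both ages converted into months. The short sketch below illustrates the calculation; the ages used are illustrative examples only, not the actual St George’s test data.

```python
# Sketch: deriving a 'months ahead' figure from a chronological age (CA)
# and a spelling age, each expressed as (years, months).
# Example ages are illustrative, not the actual pupil data.

def to_months(years, months):
    """Convert an age given as years + months into total months."""
    return years * 12 + months

def months_ahead(spelling_age, chronological_age):
    """Difference between spelling age and CA, in months."""
    return to_months(*spelling_age) - to_months(*chronological_age)

# e.g. a pupil aged 6 years 3 months who hits the test ceiling of 11 years
print(months_ahead((11, 0), (6, 3)))  # 57 months ahead of CA
```

A pupil at the ceiling would, on this reckoning, be 57 months (4 years and 9 months) ahead of a CA of 6 years and 3 months.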
We used Dennis Young’s Parallel Spelling Test because, as we have long argued, spelling gives a more reliable indication of a pupil’s literacy than a reading test. Firstly, spelling is harder than reading: when reading, the word in front of you gives an immediate visual prompt and draws on recognition memory, whereas, when spelling, we have to access the sounds we hear and represent them using our recall memory. It is also pretty obvious that if you can spell a word, there is a very strong likelihood that you can read it.
The teachers at St George’s are able to achieve these fantastic results because they have been properly trained in linguistic phonics and because the leadership in the school make sure that no child is left behind and that all children are taught to read and write. Every school should aspire to that same goal.
5 thoughts on “‘The best that we can be’”
These are interesting data, John. Although this is a Spelling Test, the words could also serve as a Reading Test. It would be good to know how many of the words the kids could read but not spell, and what the pattern of grapheme-phoneme differences is.
Of the 29 kids, only 9 are girls. Any idea why the disproportion?
Raw scores range from 13 to 33. How do you account for the wide variability?
The raw scores (by eyeball) are uncorrelated with Chronological Age. What is the logic for "Spelling Age"?
Sorry for the delay in publishing your comment.
I've no idea why there is such disproportion between boys and girls in this class. Looking at the data we have on other classes, there seems to be much more balance. I'll ask and get back.
As to the variability, the test starts off by presenting a number of words composed of one-to-one correspondences but then moves quickly towards testing some consonant digraphs and then on to more than one way of spelling a sound. As you are aware, knowing which particular spelling to choose when there is more than one requires a deeper kind of memory. I would guess that the pupils at the lower end of the continuum, having had only a year of learning a limited number of spellings of each sound, would be struggling to remember which specific ones to use in the words chosen for the test. I would also expect that the success rate for reading the words would be much higher.
I'm about to publish the results from another class that has had two years of teaching the complexities of the code and you'll see an interesting difference, although it's clear that those kids who make a good start are definitely the rich who get richer!
I'll have to look again at the rationale for the test before answering your last question.
As always, thanks for your comment and questions.
What superb results, John! This shows what we can expect with explicit teaching of the code. Would that these results were evident in all our primary schools. We live in hope.
In further reply to your question about the imbalance between boys and girls, I asked the head teacher and she said that it was simply a blip, and that most of the other classes in the school are pretty evenly balanced between the sexes.