
Thursday, 14 April 2011

Understanding Attainment, Achievement and Commonly Used Statistics

Analysing pupil performance from the previous summer’s national tests and examinations is a key item on autumn term agendas. School governors need to understand their school’s performance data and what questions to ask at meetings.

A school’s performance data is contained in RAISEonline, which stands for Reporting and Analysis for Improvement through School Self-Evaluation. This is an online resource, and schools can provide access for governors to download the full report. However, the full report is not published until all the data has been validated, usually in the spring term of the following academic year. Nonetheless, schools already have details of the 2010 national tests and examinations, and headteachers should use this information to report the results to governors.

Schools also have access to Fischer Family Trust (FFT) data and reports, which are particularly helpful for target setting and self-evaluation.

RAISEonline
https://www.raiseonline.org/

Fischer Family Trust
http://www.fischertrust.org/

The headteacher’s termly report to governors should always have a section on standards and progress. This can be supplemented by reports from subject and key stage leaders.


Attainment: the standard of the pupils’ work shown by test and examination results. In other words, how many pupils have reached the expected level/grade at each Key Stage or end-of-year assessment?

Achievement: the progress and success of a pupil in their learning, development or training, i.e. the distance travelled between two points in time – this can be the start and end of a term or academic year, or between key stages.

Remember – it’s quite possible for a school to have good attainment results, i.e. at or above the threshold, but lower scores for achievement. In other words, are overall results at the end of a Key Stage concealing poor progress? Conversely, although a school’s attainment results may be average or below average, it’s quite possible that pupil progress is good or even outstanding.

Ofsted makes a judgement on both attainment and pupils’ learning and progress, and remember… pupils’ learning and progress is a limiting grade, i.e. it will affect the overall effectiveness grade.

In order to measure attainment and achievement, a number of statistics have been developed. Apart from percentages (%), these include average points score (APS), value added (VA) and contextual value added (CVA).

Average points score (APS) is based on the formula referred to above, which links national curriculum levels to points scores, and is used to measure a number of outcomes. These include the performance of individual pupils (or different groups of pupils), different subjects, or a key stage. You can also use APS to work out the average national curriculum level for each subject, or overall, at each key stage (see Appendix for worked examples).
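To make the arithmetic concrete, here is a minimal sketch of an APS calculation in Python. It assumes the commonly used conversion in which national curriculum level L is worth 6L + 3 points (so Level 4 = 27 points); the function names and the cohort data are purely illustrative, not taken from RAISEonline.

```python
from statistics import mean

def level_to_points(level):
    """Convert a national curriculum level to a points score (assumed 6L + 3)."""
    return 6 * level + 3

def points_to_level(points):
    """Convert an average points score back to an (average) level."""
    return (points - 3) / 6

# End-of-key-stage levels for a small, made-up group of pupils.
levels = [3, 4, 4, 4, 5, 5]

aps = mean(level_to_points(l) for l in levels)   # average points score
print(aps)                                       # 28.0
print(round(points_to_level(aps), 1))            # 4.2 – just above Level 4 on average
```

Converting the APS back to a level is what lets a report say, for example, that a year group is working at “an average of just above Level 4”.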

School A might show high percentages of pupils achieving Level 4 and above, while school B shows lower percentages. However, this may be a reflection, in part, of the school’s catchment area rather than the school’s effectiveness. Similarly, pupils at school B may have made more progress than other pupils who were performing at the same level at KS1, for example, and therefore have a higher value added “score” than school A. Value added measures are therefore designed to allow a fairer comparison between schools with different pupil intakes and starting points, and to give schools greater recognition for the work they do.

Value added is used to measure the progress made by an individual pupil compared with the average progress made by similar pupils nationally between two points in time, typically key stage assessments. In other words, it’s a measure of relative rather than absolute performance.

In primary schools, each pupil’s value added (VA) score is based on comparing their KS2 performance with the median – or middle – performance of other pupils with the same or similar results at KS1. The individual scores are averaged for the school to give a score that is represented as a number based on 100. This indicates the value the school has added, on average, for its pupils.

Scores above 100 represent schools where pupils on average made more progress than similar pupils nationally, while scores below 100 represent schools where pupils made less progress (see note on statistical significance).
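The sketch below illustrates the primary (KS1 to KS2) VA principle just described. The “national” data, the prior-attainment groupings and all names are invented for illustration; the published calculation uses much finer prior-attainment groups and scaled point scores.

```python
from statistics import median, mean

# Hypothetical national data: KS2 points achieved by pupils, grouped by
# their KS1 average points score (prior attainment).
national_ks2_by_ks1 = {
    15.0: [21, 21, 27, 27, 27],   # median 27
    17.0: [27, 27, 27, 33, 33],   # median 27
    21.0: [27, 33, 33, 33, 33],   # median 33
}

# One school's pupils: (KS1 prior attainment, KS2 points actually achieved).
school_pupils = [(15.0, 27), (17.0, 33), (21.0, 33)]

# Each pupil is compared with the median for similar pupils nationally,
# and the school average of those differences is centred on 100.
diffs = [ks2 - median(national_ks2_by_ks1[ks1]) for ks1, ks2 in school_pupils]
school_va = 100 + mean(diffs)
print(school_va)   # 102.0 – on average, more progress than similar pupils nationally
```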

In secondary schools, each pupil’s VA score is based on a comparison between their best eight results at GCSE – sometimes referred to as their capped points score – and the median or middle performance of other pupils with the same or similar results at KS2. The individual pupil scores are added together and averaged to produce the school-level VA measure, which is presented as a number based around 1000.

Measures above 1000 represent schools where pupils on average made more progress than similar pupils nationally, while measures below 1000 represent schools where pupils made less progress (see note on statistical significance).
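The starting point for the secondary measure is each pupil’s capped points score. The short sketch below shows the “best eight” capping step; the grade-to-points values and names are indicative only, not the official tariff.

```python
# Indicative grade-to-points values (illustrative, not the official tariff).
grade_points = {'A*': 58, 'A': 52, 'B': 46, 'C': 40, 'D': 34, 'E': 28, 'F': 22, 'G': 16}

def capped_points(grades, cap=8):
    """Sum a pupil's best `cap` GCSE results to give a capped points score."""
    points = sorted((grade_points[g] for g in grades), reverse=True)
    return sum(points[:cap])

pupil_grades = ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'C', 'D', 'E']
print(capped_points(pupil_grades))   # 362 – the weakest two results (D and E) are dropped

# As in the primary sketch, each capped score is then compared with the median
# for pupils with similar KS2 results, and the school average of the
# differences is presented as a number based around 1000.
```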

NB. The accuracy that can be attached to any school’s VA measure depends, amongst other things, on the number of pupils in the value added calculation. The smaller the number of pupils, the less confidence can be placed on the VA measure as an indicator of whether the effectiveness of a school is significantly above or below average – see note on statistical significance below.


VA measures take account of prior attainment, which is the biggest single factor affecting pupil results. However, contextual factors which are outside the school’s control, such as gender, mobility and levels of deprivation may have a further impact on pupil results, even after allowing for prior attainment.

In order to introduce more of a level playing field when comparing schools, and to take account of these additional factors, a more complex measure known as CVA has been developed. Like VA, this provides an indicator of relative rather than absolute performance and attempts to isolate the “school effect”, i.e. whether the school, with the pupils it has, is doing better than, worse than, or broadly the same as other schools with the pupils they have.

The additional factors which CVA takes into account include:

Prior attainment in English and maths

Ethnicity

Gender

Age within age group (month of birth)

Special educational needs

Eligibility for free school meals

First language other than English

Deprivation based on pupil postcode (using IDACI – that is the Income Deprivation Affecting Children Index)

Looked after children

Geographic mobility

Although CVA calculations use a more complex model than “simple” VA, the basic principle remains the same. Quite simply, this involves comparing the performance of an individual pupil with the performance of children with similar prior attainment and similar circumstances.

An individual’s CVA score will be the difference (positive or negative) between their actual performance and the performance predicted for them, taking into account the national data for all the factors in the model. These differences are then collated to provide a CVA score for the school. As with “simple” VA, this is based around a score of 100 for primary schools and 1000 for secondary schools.

A school’s CVA of more than 100/1000 means that, overall, the school has performed better than most schools with a similar mix of pupils and factors. A score under 100/1000 means the performance of this group of pupils is below average.
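As a rough illustration of the CVA principle – not the DfE’s actual model, which is a statistical model estimated from national data – the sketch below predicts each pupil’s score from prior attainment plus a few contextual factors, then centres the average difference on 1000 (the secondary base). Every coefficient and name here is invented.

```python
from statistics import mean

def predicted_score(pupil):
    """Hypothetical prediction from prior attainment plus contextual factors."""
    score = 180 + 6.0 * pupil['ks2_aps']   # prior attainment is the dominant term
    if pupil['fsm']:
        score -= 12                        # eligibility for free school meals
    if pupil['eal']:
        score += 4                         # first language other than English
    score -= 20 * pupil['idaci']           # deprivation (IDACI, on a 0-1 scale)
    return score

# Two made-up pupils with their actual (capped) scores.
pupils = [
    {'ks2_aps': 27, 'fsm': True,  'eal': False, 'idaci': 0.35, 'actual': 340},
    {'ks2_aps': 30, 'fsm': False, 'eal': True,  'idaci': 0.10, 'actual': 372},
]

# CVA = actual minus predicted, averaged across the school, centred on 1000.
diffs = [p['actual'] - predicted_score(p) for p in pupils]
school_cva = 1000 + mean(diffs)
print(school_cva)   # 1013.5 – better than predicted, given these pupils' circumstances
```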

Whilst CVA can provide powerful insights into the impact which schools have, the figures need interpreting with caution (see note on statistical significance).

NB: Contextual value added scores should not be used to set lower expectations for any pupil or group of pupils. The DfE advises that, when setting targets for future performance, schools should strive to set equally challenging aspirations for all pupils.

When using statistics, the term “statistical significance” is often used. In everyday language “significant” means important or meaningful; in statistics it’s the likelihood that a finding, result or relationship is caused by something other than just chance.

In other words, what statistical significance tries to show is how confident we can be that a result is reliable or true.

In RAISEonline most reports show attainment or progress scores for your school relative to the national average/mean. Significance tests have been performed on the data using a 95% confidence interval, and where the school value differs significantly from the corresponding national value, sig+ or sig- boxes are used: the sig+ boxes are usually coloured green, whilst the sig- boxes are coloured blue. Where a school figure is significantly above or below that of the previous year, an up (↑) or down (↓) arrow is displayed to the right of the figure.

Significance tests are heavily influenced by the number of pupils in a cohort, so large schools are more likely to see sig+ or sig- boxes than small schools, even when their differences from national averages are the same.
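The sketch below shows why cohort size matters. It applies a simple 95% confidence-interval test to a school mean against the national mean; RAISEonline’s own tests are more sophisticated and the data here is made up, but the effect is the same: an average difference that is not significant for ten pupils becomes significant for sixty.

```python
from statistics import mean, stdev
from math import sqrt

def differs_significantly(school_scores, national_mean, z=1.96):
    """True if the 95% confidence interval around the school mean excludes the national mean."""
    se = stdev(school_scores) / sqrt(len(school_scores))   # standard error of the school mean
    lower = mean(school_scores) - z * se
    upper = mean(school_scores) + z * se
    return not (lower <= national_mean <= upper)

national_mean = 27.0
small_cohort = [27, 29, 26, 30, 27, 28, 28, 26, 27, 28]   # 10 pupils, mean 27.6
large_cohort = small_cohort * 6                           # same scores repeated: 60 pupils, mean 27.6

print(differs_significantly(small_cohort, national_mean))  # False – wide interval, not significant
print(differs_significantly(large_cohort, national_mean))  # True – same mean, narrower interval
```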

DfE Guide to CVA
http://www.education.gov.uk/performancetables/schools_08/2007_2008_Guide_to_CVA.pdf

In the White Paper, The Importance of Teaching, published on 24 November 2010, the Secretary of State announced the introduction of the English Baccalaureate, and the Department for Education now uses a new English Baccalaureate indicator. The 2010 Tables, for the first time, show the proportion of pupils at school, local authority and national level achieving good GCSE grades (A*-C) in both English and maths. The intention is to include science in this ‘basics’ indicator from next year. Development of a School Report Card, proposed by the previous government, has been discontinued.


DfE Performance Tables
http://www.education.gov.uk/performancetables/
