Closer - The home of longitudinal research

BCS70 – Age 34 – Literacy and Numeracy Skills

The 1970 British Cohort Study (BCS70) assessed its cohort members (CMs) during the study’s age 34 sweep using the Literacy and Numeracy Skills measure.

Details on this measure and the data collected from the CMs are outlined in the table below.

Domain: Basic adult literacy and numeracy skills
Measures: The multiple-choice assessments measured adult literacy and numeracy using items from the Skills for Life Survey (2003), based on the National Standards for adult literacy and numeracy.
The adult literacy core curriculum covers 'Speaking and Listening', 'Reading' and 'Writing'. This assessment covered Reading and Writing (but not Speaking and Listening). In the reading domain the questions measured Reading Comprehension (RC), Grammar and Punctuation (GP) and Vocabulary, Word Recognition, Phonics (VWRP); in the writing domain they measured Writing Composition (WC), Grammar and Punctuation (GP) and Spelling and Handwriting (SH). As with the Skills for Life Survey, item selection was heavily concentrated on the many aspects of Reading Comprehension.
The numeracy assessment covered seven aspects of number skills from the numeracy curriculum, using items from the original Skills for Life Survey. The items covered: Basic Money (BM), Whole Numbers and Time (NT), Measures and Proportions (MP), Weights and Scales (WS), Length and Scaling (LS), Charts and Data (CD) and Money Calculations (MC).
Administrative method: CASI self-completion (where the cohort member was unable or reluctant to use the laptop, the interviewer assisted, and if necessary administered the self-completion as an interview).
Procedure: The assessment consisted of 20 questions assessing literacy skills and 17 questions assessing numeracy skills. Each question consisted of a visual image and some text. The question always appeared at the top of the screen, the image at the centre, and the (in most cases) four possible answers at the bottom of the screen. The cohort member read the question on the screen and entered his/her answer, and the next item then appeared automatically. In most interviews the multiple-choice assessment was completed as a CASI, but there was an option for the interviewer to enter the cohort member's responses if asked to do so.
Literacy: The literacy assessment consisted of two tiers (upper and lower). A total of 20 multiple-choice literacy questions were asked; the first 10 were screening questions (Entry Level 3) covering Reading Comprehension x 7, Spelling and Handwriting (SH) x 2 and Writing Composition (WC) x 1. Respondents failing to answer at least six of these questions correctly went on to answer ten Entry Level 2 questions on the lower tier (RC x 4; WC x 2; GP x 2; SH x 1; VWRP x 1). Respondents who answered between six and ten screening questions correctly proceeded to the upper tier and answered five Level 1 (RC x 3; GP x 1; SH x 1) and five Level 2 (RC x 2; GP x 2; WC x 1) questions.
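The two-tier routing rule above can be sketched as follows (a minimal illustration; the function name and structure are assumptions, not part of the BCS70 instrument):

```python
def route_literacy_tier(screening_correct: int) -> str:
    """Route a respondent after the 10 Entry Level 3 screening questions.

    Fewer than six correct -> lower tier (ten Entry Level 2 questions);
    six or more correct    -> upper tier (five Level 1 + five Level 2 questions).
    """
    if not 0 <= screening_correct <= 10:
        raise ValueError("screening score must be between 0 and 10")
    return "lower" if screening_correct < 6 else "upper"
```

For example, a respondent with five correct screening answers is routed to the lower tier, while one with six correct proceeds to the upper tier.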
Numeracy: 17 multiple-choice questions were administered in order of difficulty within each curriculum topic. The order was as follows: Whole Numbers and Time (NT) x 2, Measures and Proportions (MP) x 2, Weights and Scales (WS) x 3, Length and Scaling (LS) x 3, Charts and Data (CD) x 1, Money Calculations (MC) x 4, Basic Money (BM) x 2. The assessment started and finished on an Entry Level 3 question (Parsons, 2012).
Link to questionnaire: Not available; example items are given in Parsons (2012).
Scoring: Literacy: Each of the 20 questions answered (10 screening questions plus 10 tier questions) was scored 1 if correct and 0 if incorrect. However, to calculate an overall score covering all participants, those who answered the lower-tier (i.e. less difficult) section were assumed unable to answer the upper-tier questions and accordingly scored 0 on them, while those completing the upper tier were assumed able to answer the lower-tier questions and were credited with a score of 1 on each.
Numeracy: Scores ranged from 0 to 17; any correct answer was given a 1, any incorrect answer 0.
(See bcs70_2004_user_guide.pdf pp. 25-38 for details on scoring).
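The common-scale literacy scoring described above can be sketched as follows (an illustrative reconstruction consistent with the litmc30 ranges listed below, not the actual derivation code from the user guide; the function name and inputs are assumptions):

```python
def common_scale_literacy_score(screening: list, tier: str, tier_answers: list) -> int:
    """Combine screening and tier items onto one 0-30 scale (cf. litmc30).

    `screening` and `tier_answers` are lists of ten 0/1 item scores each.
    Lower-tier respondents are scored 0 on the 10 unseen upper-tier items;
    upper-tier respondents are credited 1 on each of the 10 unseen
    lower-tier items.
    """
    assert len(screening) == 10 and len(tier_answers) == 10
    score = sum(screening) + sum(tier_answers)
    if tier == "upper":
        score += 10  # assumed correct on all lower-tier items
    return score
```

Under this rule, lower-tier respondents (at most 5 screening items correct) can score 0-15, while upper-tier respondents (at least 6 screening items correct) score 16-30, matching the two litmc30 bands.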
Item-level variable(s): N/A
Total score/derived variable(s): Literacy:
litmc20 (raw score 0 - 17)
litmc30 (total raw score: lower tier 0 - 15 and upper tier 16 - 30)
litall27, litall37 (total raw multiple-choice and open-response score)
litlev, litlevg (banded by National Standards level)
Numeracy:
nummct (raw score 0 - 17)
numall (total raw multiple-choice and open-response score 0 - 23)
NUMLEV, numlevg (banded by National Standards level)
Descriptives:
litmc30 (raw score): range 0 - 30
nummct (raw score): range 0 - 17
Other sweep and/or cohort: None
Source: Williams, J., Clemens, S., Oleinikova, K., & Tarvin, K. (2003). The Skills for Life survey: A national needs and impact survey of literacy, numeracy and ICT skills. DfES Research Report 490.
Devised by the Centre for the Development and Evaluation of Lifelong Learning (CDELL) at the University of Nottingham. Carried out by BMRB on behalf of the Department for Education and Skills in 2002.
Technical resources: Parsons, S. (2012). User guide to accompany the 1970 British Cohort Study 2004 adult literacy and numeracy assessment data. CLS Working Paper. London: Centre for Longitudinal Studies.
For further details see Parsons, S., & Bynner, J. (2006). Measuring basic skills for longitudinal study. Literacy and Numeracy Studies.
Reference examples: de Coulon, A., Meschi, E., & Vignoles, A. (2011). Parents' skills and children's cognitive and non-cognitive outcomes. Education Economics, 19(5), 451-474.
Vignoles, A., De Coulon, A., & Marcenaro-Gutierrez, O. (2011). The value of basic skills in the British labour market. Oxford Economic Papers, 63(1), 27-48.

This page is part of CLOSER’s ‘A guide to the cognitive measures in five British birth cohort studies’.