
Flex Maths Focused Revision Booklets

One of the most powerful features of Flex is that you can produce personalised work for each student based on what they need to improve on, perfect for homework or intervention. Over 200 sets of questions for GCSE Maths have been created, meaning you can produce these booklets just by clicking the ‘download student next step work’ button in the QLA (or gap analysis), shown below.

Below are some examples of the booklets that can be produced.

Each worksheet has been split up into stages of difficulty, starting from the easiest type of question and building up to clones of exam questions, often including worked examples for the harder questions.

Students can even access their personalised work online, saving your valuable printing budget. You can also schedule a date from which the answers become available, allowing students to self-mark. Try Flex now and create personalised work for all your students in one click.


The 5 Minute QLA Plan


To make it as quick and simple as possible to get the most from a QLA, I’ve created the ‘5 Minute QLA Plan’. It’s a one page PDF that will guide you through the most important things you need to consider. Print it out and scribble all over it!


I’ve done my Question Level Analysis – Now What?

So you’ve got your QLA and have all the student marks in it (entered yourself or by your students). And while it looks great with all those colours, you may well be asking yourself “So what do I do now?!”.

In this article I’ll discuss how you can use a QLA and get the most important information from it quickly so you can make the biggest impact with your students.

Whole class analysis

What topics and skills were green (>70%) for most students? Well done! Your students are generally fine with these areas but will need some practice to make sure they don’t forget.

What topics and skills were amber (between 30% and 70%)? Your students need more practice in these areas. Consider delivering this extra practice using lesson starters or homework. As well as allowing you to efficiently cover lots of different areas, using lesson starters that are completely different to the main part of your lesson is an easy way to use spaced practice. You could even use a snazzy spreadsheet to plan when you’ll revisit each area.
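That planning spreadsheet could be sketched as a short script. This is purely illustrative: the topic names are made up, and the widening one-, two-, four- and eight-week gaps are just one plausible spacing, not a recommendation.

```python
from datetime import date, timedelta

def revisit_dates(start, gaps_in_weeks=(1, 2, 4, 8)):
    """Dates on which to revisit a topic, at widening intervals after `start`."""
    return [start + timedelta(weeks=w) for w in gaps_in_weeks]

# Amber topics flagged by the QLA (illustrative names)
amber_topics = ["Forming equations", "Ratio", "Standard form"]

plan = {topic: revisit_dates(date(2017, 1, 9)) for topic in amber_topics}
for topic, dates in plan.items():
    print(topic, "->", ", ".join(d.isoformat() for d in dates))
```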

When I get my list of green and amber topics from a QLA, I create (or reuse) several one-slide PowerPoint files, each with questions/tasks from a topic highlighted by the QLA (or sometimes a combination of topics). Then I simply use that as a lesson starter without changing my main lesson PowerPoint, allowing me to easily use different starters with different classes even if I’m using the same lesson.

When designing next step tasks, bear in mind that different students may have answered the same question incorrectly for different reasons. See my previous article for some important points about how to design these tasks.

What topics and skills were red (<30%) for most students? Can these be improved with questions/tasks like above or do they need re-teaching? Will you use flipped learning, or will you re-teach during a lesson? If you’ll be re-teaching, will it be for part of a lesson or a whole lesson? If a colleague has done the same QLA, did their students do any better in these areas? If so it may be worth discussing with them how they teach those areas to see if they do anything differently.
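For what it’s worth, the colour bands used above can be captured in a tiny helper. The article doesn’t say which band the exact boundaries of 30% and 70% fall into, so the comparisons below are my assumption.

```python
def rag(percentage):
    """Classify a topic percentage into the red/amber/green bands described above."""
    if percentage > 70:
        return "green"
    if percentage >= 30:  # boundary handling is an assumption
        return "amber"
    return "red"

print([rag(p) for p in (85, 50, 10)])  # ['green', 'amber', 'red']
```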

When re-teaching, the important thing is to do something different. Don’t go and re-teach the exact same lesson again. If it didn’t stick the first time, it probably won’t a second.

Should the lesson or scheme of work be changed? When you re-teach something differently, does it work well? If so, then consider adding it to the lesson or scheme of work.

Individual student analysis

Which students did really well and were green for lots of topics and skills? How will you recognise this? It’s so easy for these students to slip under our radar.

Which students have improved? Perhaps some were mostly red in all areas for a previous QLA but now they are getting more green and amber areas. Ace! How will you recognise and praise these students too?

Recognising and praising the improving as well as the high-achieving students will not only make them feel good but will also send out the right message to the whole class: that we value hard work.

Which students were red for lots of topics and skills? Are there any other factors beyond the classroom (such as ill health) that we need to consider?

When feeding back to these students, it’s easy for them to get the impression that they did really badly, potentially making them feel like there’s little point in trying in our subject. Did they get any green areas at all? If so, then make sure you feature these in your feedback, highlighting that they can improve in the other areas too. Share with them what topics most of the class did badly in so they recognise it wasn’t just them that found some areas difficult.

Do these students need anything extra that the other students don’t? Maybe some extra work in some areas?

Do you need to contact any parents? Either for really good performances/improvement or really poor performance/effort. This doesn’t have to be a phone call, you could send an email or ask your school admin team to send a text home.

How will you measure improvement in the identified areas? Will you re-do the whole assessment or just certain questions? Will you use the same assessment/questions or different ones? When will you do that? If you do it too close to the re-teaching then you may be measuring performance instead of learning. A minimum of a two-week gap is recommended to make sure you’re not measuring performance.

Introducing the 5 Minute QLA Plan

To make it as quick and simple as possible to get the most from a QLA, I’ve created a ‘5 Minute QLA Plan’, a one page PDF that will guide you through all the questions discussed above. Print it out and scribble all over it!

Click here to download now.


Question Level Analysis – The Good, the Bad and the Ugly

As teachers we want to make the biggest impact on our students. However we also know that just because we’ve taught something doesn’t necessarily mean that it’s been learnt and retained by our students!

For the last four years I’ve been using Question Level Analysis (QLA) of tests and assessments to try to better understand what my students have really learnt. This has been hugely powerful; it’s allowed me to see which topics I need to re-teach and which areas students need more practice on. This is even more important since I was awful at predicting how my students would perform; my predictions were wrong 59% of the time and I tended to over-predict their performance. After targeted re-teaching and further practice followed by re-testing, I’ve found that my students’ performance has increased by 20-25% on average, equivalent to an increase of 1-2 GCSE grades.

It’s also had a powerful effect on my students. For instance, a student who got a low overall number of marks (10 out of 60) gained more confidence when she saw that most of those marks had come from just two topic areas. She could see that she knew those really well and that she could answer the questions on them. She was no longer disheartened by her low overall score. This motivated her to go and work on some of the other topics she hadn’t performed so well on.

However, like anything in education, to get good results, QLA needs to be used in the right way.

Claw hammer

Source: Wikipedia.

For instance, a hammer is a great tool for hammering nails into wood. However, would I get the same result if I tried to hammer a nail with the claw part of a hammer? Probably not! Just because something is a good tool for a job doesn’t mean that it can’t be used in the wrong way, giving disappointing results. The same applies to QLA.

In this article I’m going to highlight what you can do to get the most from QLA (the good) by looking at what not to do (the bad and the ugly).

The Ugly…

An example of a really bad Question Level Analysis can be found in this blog article by Jasper Green. I’ve reproduced the image below from his article.

In his analysis Jasper says “Take this exam question, the topic is electrolysis. This student scored 1/3. Analysis of the paper by topic would suggest that this student needs to go away and learn electrolysis. But do they? If we look more closely at the individual elements of this question we can see that there are actually some other, much more fundamental aspects of chemistry that this student does not understand.”

The first mistake that Jasper has made here is to use the score for the whole question. Now this can only be done if all parts of the question are testing the same thing. If they aren’t then your analysis is going to be misleading.

The second mistake here is that Jasper has confused the context of the question with what it is actually testing. Although the context is electrolysis, the question is only testing a very small part of a student’s knowledge of electrolysis. In fact part (b)(i) is not testing electrolysis at all, and (b)(ii) is testing if the student knows a use of a product of electrolysis, not electrolysis per se. Only part (a) is directly testing electrolysis.

So we need to look at each question item rather than the whole question, and what each item is really testing, not the overall context of the question.

So what are these three items testing? I’d argue that part (a) is testing if the student can predict the products of electrolysis, (b)(i) is testing whether the student can write the formula of simple covalent compounds, and (b)(ii) is testing if the student knows a use of a product of electrolysis. We also need to acknowledge that (b)(ii) is a multiple choice item and so getting this correct doesn’t necessarily mean that the same student would be able to give a correct written response to a similar question.

If we mock up what the QLA would look like in a spreadsheet for Jasper’s analysis versus mine, we get a very different view (see side image).

Here the top image is the QLA Jasper would have seen given his analysis; not very useful. The bottom is the QLA we’d see by looking at each question item and what each item is actually testing, not the context.

This second QLA is really useful as now we can say that the student needs to practice predicting the products of electrolysis and writing the formula of simple covalent compounds.

Jasper makes a third mistake when analysing this student’s responses: making invalid conclusions.

In his analysis of the student’s response to part (a) Jasper writes “they don’t know that elements are always conserved in chemical reactions”.

Maybe, and maybe not. The student could have just guessed carbon dioxide. I’ve actually heard one student say in class “if in doubt, just put carbon dioxide as it comes up so much”. But we can say that the student needs to practice predicting the products of electrolysis, and if they don’t understand that elements are conserved during chemical reactions then working on this will help that too.

Now let’s look at Jasper’s analysis of part (b)(i): “The student states that the formula of a molecule of chlorine is Cl instead of Cl2. The student clearly does not understand the concept of diatomic molecules. Simply reviewing the paper in class and getting students to make corrections will only bring about progress if that exact question appears again. A much more effective approach would be to review diatomic molecules and covalent bonding.”

Again maybe the student doesn’t understand diatomic molecules. But maybe not. I don’t think you can say that from this one question item. Usually students who write Cl instead of Cl2 can draw a correct dot-and-cross diagram of a chlorine molecule, and students won’t usually think of this when they are writing out the formula. The student could understand diatomic molecules but have simply forgotten to write Cl2. So here all that’s needed is to check if they can accurately draw a correct dot-and-cross diagram of a chlorine molecule and be reminded that most non-metallic elements exist as diatomic molecules (Cl2, H2, N2 etc.).

Whenever we are thinking about why a student may have got an item wrong we have to be careful that we don’t read too much into it. One item doesn’t tell us everything about a student’s understanding of a particular topic, and we could easily come up with the wrong reason for why students get questions wrong. The QLA gives us a starting point; there will usually be a range of reasons why a student could have got a question item wrong. If we bear this in mind we will be able to design tasks that enable students to practice all the skills that are related to the question item and that can give us further feedback about students’ knowledge and skills.

The bad…

To illustrate the fourth mistake we can make with QLA, look at the following GCSE Mathematics question, taken from OCR. What is it testing?

Source: OCR J560-06 H SAM

I’d argue that this question is testing two things: forming equations from word descriptions and substituting one equation into another to solve for an unknown.

However if a student answers it completely incorrectly (or leaves it blank) we don’t know which part was the problem. Is it that they can’t write equations from word descriptions? Can they substitute one equation into another to solve for an unknown?

The problem is that this question is not diagnostic. And in fact most past exam questions are not. So using assessments composed only of past exam questions is flawed. They are designed to separate students, not diagnose difficulties.

That doesn’t mean that we shouldn’t use past exam questions in our assessments. We need to know how well students can answer these, as well as getting them used to the style. But if they are all we use, then the conclusions we can draw about what our students know and can do are limited.

So what can we do to improve our assessments and make them more diagnostic? A better way is to complement any past exam questions with questions that test each skill or item of knowledge individually. That way we can more accurately determine what it is that is causing our students difficulty.

So if we were using the above past exam question in an assessment, we could also include the following questions in different parts of the assessment:

  • Marty and George complete some homework. Marty takes 15 minutes longer to complete his homework than George. Write this as an equation.
  • Find the values of M, G and B if M = G + 15, B = 3G and M + B + G = 200.
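For reference, the second question solves by substitution: putting M = G + 15 and B = 3G into M + B + G = 200 gives 5G + 15 = 200, so G = 37, M = 52 and B = 111. A quick check in code:

```python
# Substitute M = G + 15 and B = 3G into M + B + G = 200:
# (G + 15) + 3G + G = 200  ->  5G + 15 = 200  ->  G = 37
G = (200 - 15) // 5
M = G + 15
B = 3 * G
assert M + B + G == 200
print(G, M, B)  # 37 52 111
```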

Each of these questions tests the same knowledge and skills as the above exam question, but because we test each part individually we can quickly diagnose what is causing our students difficulty.

The easiest way to do this is to have assessments in two parts. One part is composed of past exam questions whilst the other part is composed of diagnostic questions that test the same knowledge and skills as the exam questions but individually. These parts can be given to students either at the same time (effectively as one assessment) or at different times (one or a few lessons apart).

The good…

We’ve seen the common mistakes it’s easy to make with QLA. In order to get the most out of QLA we need to:

  • Analyse the marks for individual question items, not overall questions (if they have multiple parts).
  • Be specific with what the question item is actually assessing. This may not be the context of the question.
  • View the QLA as the starting point; there will usually be a range of reasons why a student could have got a question item wrong and it is very easy to jump to the wrong conclusion. If we bear this in mind we will be able to design tasks that enable students to practice all the skills that are related to the question item and that can give us further feedback about what our students know and can do.
  • Not use assessments composed only of past exam questions; include diagnostic questions that test the same knowledge and skills as the exam questions, but individually.

Making Assessment Work

This is a reblog of an article I wrote for the BESA website, published December 8th 2016.

Assessment has been broken for a long time. For too long it’s been primarily a reporting and accountability tool, when really it should drive everything that happens in the classroom. As a teacher I was concerned about what my students had really learnt and retained from my lessons. Students forget things, or don’t even learn them in the first place. So the role of assessment should be the diagnosis of each student’s individual learning gaps, allowing teachers to plan high quality lessons and activities to close those gaps.

Early in my teaching career I discovered Question Level Analysis (where you input the mark a student gains on each question item of a test or mock exam into a spreadsheet and calculate percentages for each sub-topic and skill). It was incredibly powerful, allowing me to see exactly what each student needed further help with. My teaching became truly responsive to the needs of my students. However, the workload of creating the spreadsheets and inputting the marks for each student on every question item was huge, and prevented me from doing this with all but a few of my classes.
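The spreadsheet calculation described here boils down to grouping item marks by the sub-topic each item actually tests and taking percentages. A minimal sketch, where the items, topics and marks are invented purely for illustration:

```python
# (marks scored, marks available) per question item for one student
item_marks = {"1a": (2, 2), "1b": (0, 1), "2": (1, 3), "3a": (0, 2)}
# the sub-topic each item actually tests
item_topic = {"1a": "Fractions", "1b": "Fractions", "2": "Algebra", "3a": "Algebra"}

# Aggregate to a percentage per sub-topic, as the QLA spreadsheet would.
totals = {}
for item, (scored, available) in item_marks.items():
    topic = item_topic[item]
    s, a = totals.get(topic, (0, 0))
    totals[topic] = (s + scored, a + available)

percentages = {t: round(100 * s / a) for t, (s, a) in totals.items()}
print(percentages)  # {'Fractions': 67, 'Algebra': 20}
```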

So I wanted to create an online system that would take the pain out of Question Level Analysis and allow it to be implemented simply in any classroom. That system is Flex Assessment, which will be launched at Bett 2017.

My goal is to bring the power of Question Level Analysis to as many classrooms as possible and to totally change the way assessment is used, whilst saving time and helping students to make better progress.

I’m looking forward to that journey starting at Bett 2017 and hope to see you on stand F452.