Curriculum and Assessment: a Symbiotic Relationship

Over the past two years curriculum and assessment have changed and developed more rapidly than I can remember; the emphasis on evidence-informed practice is refreshing and brings a clear rationale to the ‘why’ when individuals, faculties or whole schools introduce change. The drive to improve becomes purposeful and collaborative when we explore the evidence together.

Curriculum and assessment cannot exist on their own; they form a symbiotic relationship that underpins a clear Teaching and Learning strategy. This relies on both curriculum and assessment being well designed and integrated so that they complement each other. The T&L strategy then presents the potential for high-quality teaching and learning; bring in the craft skills of the class teacher to execute the T&L strategy and students will receive a high-quality education.

  • A high-quality curriculum provides the classroom teacher with a logical Learning Journey to take their students on, but without assessment it is just a sequence of lessons with no way of informing the teacher how to teach responsively to the needs of the class.
  • An assessment strategy provides the classroom teacher with a means of checking what students know, but without a strong curriculum the assessment is less informative and limits how responsive a teacher can be to the needs of the class.

Alongside lots of reading and researching (see the references at the end for many of these), the current recipe for constructing our curriculum and assessment strategy has included:

  • Sequencing the concepts of the science curriculum into a logical Learning Journey of ‘building blocks’, considering the Threshold Concepts that underpin progression.
  • Building a teaching order that ensures topics are regularly revisited and built upon; this involved a ‘spiral’ curriculum design that brings in spaced practice and helps teachers plan interleaving of concepts based on the needs of their class.
  • Constructing clear age-related expectations and success criteria that students can be measured against.
  • Introducing a whole-school assessment strategy that enables all teachers to recognise how they use formative assessment to act responsively in the classroom; to identify ‘hinge points’ in a topic for more in-depth checking of understanding; and to use long-term summative assessments to measure retention, link learning and find the starting point for the next time the topic is taught.
  • Developing pedagogy that will support delivery of the curriculum, much of it based upon Rosenshine’s Principles of Instruction.

The most useful tool in beginning to bring together our curriculum and assessment has been the RADAAR framework recently published by the EEF. RADAAR stands for Research, Anticipate, Diagnose, Address, Assess and Review. This model provides a great foundation for enabling teachers to think about and plan the sequence of learning and assessments for their class.

After first viewing this together as a faculty we noted how challenging this planning process is, given the amount of deep thinking required to be really cohesive, and that non-specialists often lack the depth of knowledge to complete it without a great deal of research first. This encouraged us to develop our own version of the RADAAR framework that complemented our curriculum and assessment strategy. The aim was to produce a planning tool that does the ‘leg work’ – the core thinking that goes into planning a topic – while leaving the class teacher the autonomy to design lessons to meet the needs of their class, ensuring suitable breadth and depth of knowledge, pace and appropriate activities to support learning.

The front page of the planning guide begins with the Age-Related Expectations and Success Criteria that underpin the topic.
Over the page we address all the components that help build Teaching and Learning for the topic.

The age-related expectations (AREs) and success criteria form the foundations of the topic. The AREs are what we measure students’ knowledge, understanding and skills against, while the success criteria identify the types of things students will be able to do in pursuit of that knowledge, understanding and skills.

Next we continue in the ‘Research and Anticipate’ phases as we consider:

  • Threshold Concepts – any specific concepts achieved in the topic that we consider to be ‘portals’ to future learning; they are transformative, irreversible, integrative and troublesome. In some cases the threshold itself may lie in the future, but students progress into a liminal state towards that threshold during this topic.
  • Misconceptions – what are students likely to arrive at the lesson believing that is, in fact, not true? These may come from general learning through home life and the media, or from some of the limitations of models used at primary level (another interesting discussion!). Knowing this enables us to pre-empt and dispel these misconceptions by paying particular attention to them and using teaching strategies to replace them with correct science.
  • Maths – What maths skills will be needed in this topic? How do our maths faculty teach it? This is to help us deliver consistently and enable students to transfer skills across the curriculum.
  • Scientific Literacy – What tier 3 language will students encounter in this topic? What is the etymology of these terms? Knowing this helps us to make sure we teach these terms explicitly.

Our next phase is our own addition (and suitably ruins the acronym RADAAR into the unpronounceable RALJDAAR…a dinosaur roar!). This focuses on the Learning Journey, and provides a key summary to support non-specialists in sequencing the ‘building blocks’ effectively. There are two sections:

  • Links to previous topics. These signpost the ‘baseline’ for students and the things to formatively assess pre-topic so that teaching begins at an appropriate level of challenge.
  • Learning Journey. A suggested order of concepts so that learning builds sequentially. We do not consider these orders to be gospel, merely a suggestion for those who are unfamiliar with the topic.
  • At this point it is also important to draw attention to the bottom-right corner box of Links to future topics. These are shared to enable staff to think about how they will frame concepts to promote access to more complex ideas later without introducing misconceptions that need undoing in the future.

Diagnose and Address:

  • The diagnose section identifies examples of resources that could be used to check for prior knowledge and misconceptions throughout the topic. The BEST resources for science have been invaluable here, along with concept cartoons too.
  • Unpick and rebuild identifies suggestions on how to then tackle misunderstandings and misconceptions with classes, such as useful models and the ‘response’ BEST resources.
  • Practicals and Demos really speaks for itself. A particularly useful section to support new staff and non-specialists in identifying practicals and demos that will support understanding the concepts of the topic.
  • Context and Hinterland. Examples and ideas on how to bring the abstract science into concrete contexts to help students visualise and apply their learning. The Hinterland also provides scope for students to explore how classroom learning links to other ideas and real life scenarios.

Finally, Assess and Review. There are two sections to this phase: assessment and feedback. This section outlines ideas for ‘hinge point’ checks to ensure students have the underlying knowledge to access the next phase of learning. It also identifies opportunities for assessing skills such as extended writing and working scientifically. Feedback (not marking!) is encouraged to ensure students engage in improving their understanding after these assessment points, and that time-consuming written comments are not left unaddressed because students do not know how to respond independently (see Feedback not Marking!).

The first planning guides are under trial this term, with a view to refining and improving them to support us fully from September. One currently on trial is below.

References

  • Misconceptions in Primary Science – Michael Allen
  • Making Every Science Lesson Count – Shaun Allison
  • The researchED Guide to Explicit and Direct Instruction – Adam Boxer
  • The Feedback Pendulum – Michael Chiles
  • The CRAFT of Assessment – Michael Chiles
  • The researchED Guide to Assessment – Sarah Donarski
  • Responsive Teaching – Harry Fletcher-Wood
  • Powerful Ideas of Science and How to Teach Them – Jasper Green
  • Retrieval Practice – Kate Jones
  • Teach Like a Champion – Doug Lemov
  • Gallimaufry to Coherence – Mary Myatt
  • High Challenge, Low Threat – Mary Myatt
  • Rosenshine’s Principles in Action – Tom Sherrington
  • Walk Thrus – Tom Sherrington and Oliver Caviglioli

Deeper Curriculum Thinking in Science

Over the last two years in science we have worked hard on how we sequence our curriculum. In doing so we introduced a ‘spiral curriculum’, which splits topics up to ensure deliberate spacing and returning to content to encourage retrieval and consolidation of each topic. Our next step was to consider the content in each topic to ensure a logical order to the Learning Journey; coupled with our pedagogical work on ‘Making the Learning Stick’, we have seen improvement in student attainment and enthusiasm for science.

Initially our work on ensuring a coherent Learning Journey was based mainly on previous experience and expert subject knowledge. As we began exploring evidence that detailed ideas for teaching sequences we felt confident in many of our assertions and made a few small adjustments as we learned of different rationales for sequencing learning.

Our next steps have been linked to our College Improvement Plan, and together we have begun to think more about how to build a more coherent curriculum between subjects. This has led us to begin thinking more deeply about our curriculum in science, and to consider a more robust rationale to justify our topic sequencing. A really effective guide for focusing our discussions in science has been the RADAAR framework developed by Niki Kaiser at the EEF. The elements of RADAAR have prompted us to explore a wide range of considerations in curriculum planning; some of these elements we had begun thinking about before and some were new. The RADAAR framework has helped us think about how we link all of these components together, including prior learning at KS2, potential misconceptions and retrieval practice.

This has also linked neatly with our whole-school focus on Threshold Concepts. We have used these to give more in-depth justifications for our topic and concept sequencing, and to think collaboratively about whether some of the barriers to student understanding are rooted in the order in which we teach.

To begin with, we shared a definition of a threshold concept based on the original research by Meyer and Land, who defined it as a “portal” or “conceptual gateway” that acts to build knowledge and understanding in a transformative, integrative and irreversible way; the concept is also likely to be troublesome to learn. Working within this remit we asked ourselves, “What are the threshold concepts in biology/chemistry/physics?”, ensuring we justified our conclusions about what made something a threshold concept rather than a key/core concept. We next identified how students work towards these (in a state of liminality), and when they might cross each threshold.

Some of our most interesting conclusions from this:

  • When does a key/core concept become a threshold concept? In discussion it was clear that some concepts can be more troublesome or transformative than others. We had to decide whether every concept, even one which is only slightly transformative, counts as a threshold concept, or at what point a concept becomes a true ‘portal’ that causes irreversible change in understanding.
  • The threshold concept might not be crossed during a student’s time at secondary school, but they may enter a liminal state as we introduce key concepts that are the building blocks of a threshold concept at A Level or beyond.
  • Threshold concepts focused on factual knowledge and understanding were easier to identify, but it was more challenging to identify threshold concepts for practical skills.

Our discussions surrounding threshold concepts have spanned an entire term, and have served to greatly increase our awareness and understanding of our curriculum. For instance, the biologists have questioned the best order for teaching about cells; the chemists have explored and discussed the plethora of links there are between topics to support schema building in chemistry; and the physicists have given consideration to how a topic such as waves at secondary level is perhaps fundamental knowledge that helps build towards threshold concepts met at A Level.

Throughout our discussions we have grappled with the interpretation of what a threshold concept is, and how to differentiate it from a key concept, especially when there appear to be ‘levels’ of concept within these definitions; this probably helps explain why there is no definitive list of threshold concepts for each subject area. The lists we have created are by no means exhaustive, and most definitely not a checklist to work through. We see them as a tool to help us in our next stages of developing ourselves professionally to have a deeper understanding of our curriculum and the links within and beyond science.

The next phase of our thinking is going to be creating a threshold concept map of the links between the three science specialisms to enable us to better sequence our curriculum and ensure we are building knowledge, and in turn, science schema in a coherent manner. We will then plan how we can record our thinking and curriculum understanding in a way that supports non-specialists and new staff in picking up and teaching less familiar science; the RADAAR framework is again proving invaluable in helping us design our planning tool that brings each strand of our thinking together: The Learning Journey, Threshold Concepts, Age-related expectations, Learning Goals and Success Criteria, assessment strategy, retrieval practice and subject knowledge and misconceptions.

References

EEF Improving Secondary Science Report https://educationendowmentfoundation.org.uk/tools/guidance-reports/improving-secondary-science/

EEF RADAAR Framework https://educationendowmentfoundation.org.uk/news/eef-blog-introducing-new-resources-for-sorting-out-scientific-misconceptions/

Meyer and Land, 2003, Threshold Concepts and Troublesome Knowledge: Linkages to Ways of Thinking and Practising within the Disciplines http://www.etl.tla.ed.ac.uk/docs/ETLreport4.pdf (accessed March 2021)

Cousin, 2006, An Introduction to Threshold Concepts https://www.ee.ucl.ac.uk/~mflanaga/Cousin%20Planet%2017.pdf (accessed March 2021)

Talanquer, 2015, Threshold Concepts in Chemistry: The Critical Role of Implicit Schemas https://pubs.acs.org/doi/pdf/10.1021/ed500679k (accessed March 2021)

Park, 2015, Impact of Teachers’ Overcoming Experience of Threshold Concepts in Chemistry on Pedagogical Content Knowledge (PCK) Development. https://www.koreascience.or.kr/article/JAKO201530848423866.pdf (accessed March 2021)

Chandler-Grevatt, 2015, Challenging Concepts in Chemistry https://edu.rsc.org/feature/challenging-concepts-in-chemistry/2000069.article (accessed March 2021)

Extensive index on Threshold Concepts: https://www.ee.ucl.ac.uk/~mflanaga/thresholds.html

Thinking Differently not Working More

Following on from considering the components that help structure a lesson for online learning that secures not just student engagement but progress in learning, it is important to think about how PPA is used to maximise impact. It is all too common to hear the phrase ‘online teaching is more tiring’. If we can develop an awareness of what is different about online preparation we can once again ensure we are working in a time-efficient manner. Online teaching does not require working more, but it does require thinking differently.

Planning the Lesson Content

Things that are the same / Things that are different:

  • Same: The Learning Journey – lessons need a clear sequence and each lesson needs the same ‘building blocks’ as it would in class to build knowledge, understanding and skills. Different: use of technology – the delivery methods for new knowledge, understanding and skills.
  • Same: modelling new concepts. Different: ‘transitions’ in the lesson.
  • Same: giving students thinking time. Different: non-verbal cues are not visible.
  • Same: the lessons require resourcing. Different: how resources will be provided to the students – no printing, they need to be provided electronically.
  • Same: methods of assessment are needed to check for student understanding and retention over time. Different: methods for implementing assessment need altering for an online environment.
  • Same: students require feedback. Different: methods of providing feedback.

The Learning Journey and Lesson Delivery

The main construct of a lesson is the same; the adjustment is in the process of preparing and thinking through the actions that will need to be taken to deliver the lesson.

When planning a lesson it is only natural to ‘play it through’, and think about what to say and how to say it. With online learning this has become even more important as delivery of an explanation will not be punctuated by non-verbal cues from the students. In a lesson, while explaining, a teacher will be measuring comprehension and confusion from students’ faces and also be aware of anyone who may be losing concentration; they will then adjust their explanation to meet the students’ needs, for example: re-framing the explanation, or pausing for thinking time and questions. It is this unplanned, responsive element that is lost online. Therefore, in planning for the lesson it is important to take even more care than normal to prepare for breaks in the explanation and consider how to scaffold students’ thinking and processing. This is a very slight nuance – naturally we would plan for this, but it is more important when online.

Some key ideas to consider when planning explanations (that we consider anyway even in F2F teaching):

  • Make the Learning Journey clear – again another thing that is normal for every lesson, but having real clarity throughout the whole lesson on the purpose of the learning is a great motivator for students. Be clear in how the learning links to prior knowledge and future learning.
  • Modelling Concepts – What will these models look like online? Can they be the same as in the classroom? In what ways can technology support the modelling process?
  • Thinking Time – Do not rush through. When pausing to give students time to process or complete an activity, use an actual clock to measure the time; a silent three minutes feels like an eternity without non-verbal cues.
  • Key terms – What subject specific terminology will be used? Does it need explaining before the full concept is addressed?
  • Length of explanation – if there are too many things to process all at once students will not be able to address any of them; break the explanation up into sections.

Resources

The lesson will still require resources – whether this is utilising presentation software or preparing for a ‘whiteboard and pen’ approach. Most of this can be prepared in the same way; however, extra care needs to be given to how the delivery will appear to students – for example, students will be following the mouse pointer/pen and not the visual of the teacher gesticulating and pointing directly at the board. Think of the ‘magic pencil’ that used to float across the screen on Words and Pictures in the mid-90s! Would this have worked if the pencil hadn’t been there and the letters just appeared?

Magic Pencil!

The Magic Pencil actually offers a second key thinking point: not only the ‘floating pencil’, but also the construction of the visual – it was cleverly prepared to minimise all aspects of extraneous cognitive load, with a plain background, clear lettering and plenty of empty space. Nothing glaringly distracting or irrelevant to the goal – it communicates exactly what the student needs.

Thinking further about resourcing the lesson, there is a bonus: no printing, no going into battle with the photocopier or clearing paper jams! Instead, uploading to an online platform becomes the new challenge, and the teacher mantra of ‘practice, practice, practice’ proves itself again; practice and you will get quicker, along with sussing out methods for speeding things up, for example ‘reuse post’ on Google Classroom. Designing a template to use when posting each assignment, then duplicating and editing it rather than writing from scratch each time, is also a great time saver.

Transitions

Another key difference to consider in the lesson is ‘transitions’. As in the classroom, this is the move from one phase of learning to another; however, it can look quite different in the online classroom. For example, a transition from listening to a teacher explanation to a written piece on a mini whiteboard.

  • In a lesson this is: Stop looking at the teacher/board -> pick up pen and whiteboard from desk -> begin task.
  • In an online lesson this is: Stop looking at the lesson screen -> change windows to the mini whiteboard -> select the relevant tool (text box, freeform, draw etc.) -> complete task.

These may seem fairly similar, but when you factor in familiarity with the equipment, the potential need to keep swapping screens to refer to instructions, and use of the chat box to type questions or unmuting to ask verbally, the activity takes on a whole new style that students need to learn to navigate (admittedly most learn this very quickly). On the flip side, other transitions, e.g. handing out scissors/glue, may be easier or simply different; use Google Slides to ‘drag and drop’ instead of ‘cut and stick’.

Use of Technology

The above example also links directly to another key difference: use of technology. This is perhaps the single thing which can make things time-consuming, and perhaps be considered the ‘tiring’ bit. Ultimately, there is a LOT of technology out there which could be used to support delivery of online lessons, and deciding what to use can seem a daunting prospect. The vital thing is to use technology that enhances the learning experience.

Consider this ‘would you rather’: if you were a student, would you rather:

  • Attend a lesson that used lots of different technology, but the teacher found it difficult to use and spent ages ‘making it work’, and you did not get to practice your learning properly.
  • Attend a lesson that did not use much technology, but the teacher used the technology effectively, communicated clearly what you needed to learn and gave you time to practice it.

Students are joining their online lessons to learn, just as they attend school to learn. For us to achieve this goal there are some aspects of technology that we must learn to use (e.g. the VLE and live meet software), and then there are things we can learn to use that will enhance the learning experience if used well (e.g. sharing Google Slides for collaborative work, visualisers, Google Forms, Whiteboard.fi). It is this ‘second tier’ of technology where we need to exercise control; trying to do everything will be less effective (and very time-consuming) compared to learning to do one or two things well. Also, do not compare yourself to other members of staff – we are all on a similar journey in learning to use different forms of technology, and we will all learn at different rates!

Assessment

Then finally, in the planning stage, the most challenging question – how to know whether students are understanding if their work cannot be seen in real time?! The actual ‘what’ to assess remains the same as it would in the classroom; the difference lies in how to deliver this online. Some ideas have been discussed already – whiteboard.fi and Google Forms. Google Slides also provides a creative way of assessing students: students can be assigned a slide to work on and the teacher can scroll through in real time, observing students’ work and giving live feedback. Perhaps the closest to circulating the classroom so far. The key thing is to make sure the content of the assessment is carefully constructed and the method of delivery is prepared thoroughly to ensure it can be implemented smoothly.

Preparing Feedback

The final phase of PPA is preparing feedback for students. This is perhaps the least different, and from my perspective being online is a brilliant driver of ‘feedback not marking’.

Firstly, students are (hopefully) uploading everything. This amounts to a huge workload if we try to read and respond to every. single. piece. individually. In fact, it is pretty much impossible, and we would not try to do this day to day in the classroom. However, if we consider why we are asking students to upload their work each lesson, it is because we need to see it to help us achieve the goals we want for them:

  1. We want to be able to help them further their learning.
  2. We want to make sure they have not got any misunderstandings or misconceptions.
  3. We want to know they have been thinking, and not just passively listening.

Therefore, we need to see their work so:

  1. We can adjust the Learning Journey of future lessons accordingly.
  2. We can identify specific areas of need for groups/individuals.
  3. We have a collective overview for planning longer term.

So, due to the volume of work uploaded we need to employ strategies that are informative and workload efficient.

  • Feedback not Marking. This is vital: marking takes hours; feedback does not.
  • Be Selective. In the classroom, we would not try to read and mark everything. Select a specific task to focus on that will diagnose the understanding of each student in the lesson.
  • Sample Work. In the same way we can be selective on marking a specific piece of work, it could also work to sample the class. Select a range of students from the class and read their work from the lesson, make a summary of common errors and areas for improvement; address them in the next lesson with the whole class.
  • Self-Assessment. Just as we would in the classroom, use self-assessment strategies so that students check their work and instantly make corrections to their understanding.
  • Self-Marking Software. An opportunity for using technology to save time! Use software such as Google Forms to create MCQs that self-mark and provide data overviews of class performance. Time invested to create the form will provide instant data to guide the Learning Journey.

Summary

Same actions/processes:

  • Planning the Learning Journey
  • Preparing lesson resources
  • Assessing student understanding
  • Principles of Direct Instruction
  • Wait time
  • Using previous lessons to inform future planning
  • Providing students with feedback

New actions/processes:

  • Using the VLE and live meet software
  • Using other software to enhance the learning experience
  • Teaching students to use new software

Actions/processes ceased:

  • Photocopying/printing
  • Handing out / collecting equipment and printouts
  • Circulating during independent work
  • Marking paper copies

Things to consider

  • Focus on understanding the technology side of things
  • Consider whether using lots of ‘whizzy’ methods of delivery is really necessary, or whether they are ‘time eaters’ and a simpler method can be used.
  • Use assessment methods that inform you of student understanding quickly.
  • Be kind to yourself – and think about how you are improving lesson-to-lesson!

Adjusting to Online Teaching

Following the first week back teaching remotely, it has been noticeable how full edu-social media has been with teachers sharing tips and ideas on how to use a range of different technology for ‘live teaching’ – it’s brilliant! Technical barriers aside, there have also been adjustments to make in the delivery and execution of the lesson itself.

  • The start of the lesson – students arriving on time, how to check prior knowledge of all students, how to engage all students with you and not home distractions.
  • New content – how to continue to maintain engagement, how to ensure students access, understand and progress with their learning.
  • Checking understanding – how to check understanding efficiently when you haven’t been able to circulate during the lesson, and without needing to read/mark every single page uploaded from home.

The start of the lesson (first 15 minutes): There is no universal bell ringing in every house to signal lesson starts and changeovers – the impact? Students arriving across a wider time frame. What would usually be five minutes to settle and complete a recall quiz at the start of the lesson to check understanding becomes a mixture, ranging from some students present and engaged for 10 minutes to some just arriving: “Sorry I was late, what are we doing?” The time frame for all students to arrive and be ready is closer to 10 minutes (hopefully less over time if students are challenged to improve their time-keeping and their competency with technology!)

To adjust effectively, consider the purpose of the starting phase of a lesson; it is to achieve two things: Settle and engage students in the lesson, and crucially to recall and check prior knowledge before delivering new content.

Both of these aims can be achieved using virtual mini whiteboards such as whiteboard.fi, where student responses can be seen in real time. The task that would be used face-to-face does not necessarily need to be adjusted – plan for the ‘5 minute recall’ as normal, but in addition be prepared with a ‘5 minute extension’ which can be delivered verbally to ‘fast finishers’. It may be helpful to use the extension to start students thinking about the new content which will be modelled in the next phase of learning.

After the 10-minute period that has allowed you to welcome students, ensure their engagement and complete the recall task, the next step is to provide whole-class feedback; be responsive and ready with a virtual whiteboard (e.g. whiteboard.fi or Google Jamboard).

Finally, introduce the Learning Journey to students – make sure they know how the previous knowledge that they have been recalling links to today, what the learning goal is today, and how this will link to future learning.

New Content (30-35 minutes): The next challenge exists both in the classroom and remotely – how best to impart new knowledge and understanding to students. However, remotely there are some additional challenges: the loss of non-verbal communication (the ‘confused’ look), the loss of the ability to circulate and see students’ work ‘live’, and the extra difficulty of ensuring engagement is maintained (are they checking their phone?!)

This section of the lesson now requires additional care in planning to overcome the extra challenges. The sequence of Direct Instruction > Guided Practice > Independent Practice is essential in succeeding here. Some quick and helpful references to these ideas: The researchED Guide to Explicit & Direct Instruction by Adam Boxer and Tom Bennett, and ‘Walk Thrus’ by Tom Sherrington and Oliver Caviglioli are both excellent guides and easily adapted to the virtual classroom.

Be prepared at this point to also build in time for students – it takes time to write the date and title, to present work neatly, and to think, synthesise and write an answer (remember the ‘wait time’ evidence, Stahl 1994, which shows it is easy to rush verbal questioning in class; letting students work independently also takes time, and time definitely seems to go slowly when you are waiting!) Ask students to tell you when they are done, or use the ‘raise hand’ function on Google Meet, so that you are not guessing work rate; this also allows you to move individuals on to extend their understanding.

Checking understanding (10-15 minutes): It is likely from the last section of the lesson that you are feeling a bit ‘blind’ to how your class have been progressing – the usual ability to circulate and observe progress is lost. It is now important to conclude the lesson in a way that ensures you gain this oversight to check understanding and help you set a baseline for next lesson – but also be time efficient too!

A useful way to do this is ask your students to upload their work to Google Classroom – through uploading photographs of handwritten work or electronic copies they have worked on. When setting an assignment, attaching a Google doc/slides and creating a ‘copy for each student’, so they can work directly on it, removes the need for uploads; students simply open and edit the file, and turn in the assignment.

This is also where tools such as Google Forms can be your best friend – an exit ticket constructed of five careful multiple choice questions that self-marks will allow you to check understanding quickly and inform planning. Perhaps add one free-response question for marking (don’t add lots; keep the workload manageable!) Alternatively, make use of the question and private message functions on Google Classroom to set a question that students respond to and that you can provide feedback on. Remember feedback does not mean a personal comment to each student – it could be reading responses and presenting whole-class feedback next lesson! Feedback – Not Marking! (I’m looking forward to reading Michael Chiles’ new book, The Feedback Pendulum; it’s on the way!)

Conclusions

Overall, when applying this model of delivery, for the first 30 or so minutes the teacher is very involved and active with the students – welcoming, ensuring engagement, providing feedback, sharing new knowledge and guiding practice. After this point, students enter independent practice (including submission of work). From experience it is helpful to stay on the Google Meet in the second half of the lesson, as a quick question can be easily answered or explained verbally to move the student on instantly; messages that require typed responses become far less frequent and time is gained outside the timetabled lesson.

  • Timing an online lesson is different from an f2f lesson
  • Skillful Direct Instruction and Guided Practice is essential
  • Independent practice may need additional scaffolding
  • Give time for students to upload their work
  • Use efficient feedback strategies

This summary only introduces some examples of technology that can be used to support lesson delivery; there are loads of others too – check out Tom Sherrington’s newest post, where he has crowd-sourced resources to support efficient feedback on students’ writing, and generally browse Twitter for some great ideas!

Feedback NOT Marking

During the first phase of introducing our new approach to assessment we have focused on how we build in opportunities within the Learning Journey to directly assess students against our Age-Related Expectations. In doing so staff were encouraged to explore ways of providing students with feedback that was timely and helped move students forward throughout the topic.

Having now spent just over half a term exploring this, it was interesting how our first review of Progress Books demonstrated a vast array of approaches to the challenge of effective feedback. It demonstrated clearly the pros and cons of various methods. Below I explore some of these different ideas and link to a range of other research that evaluates these methods too.

Feedback or Marking?

The first thing I noted was how differently staff interpret the need to respond to student work; ultimately it comes down to whether staff are providing feedback or marking work. David Didau writes about this in his blog, Marking and Feedback are not the same. Ultimately, we want to move students forward in their learning, and crucially it is feedback that achieves this; marking is merely the act of “checking, correcting and giving a mark to students’ work.” That is not to say that feedback will not include checking and correcting student work; it is how the checking and correcting are delivered that is important.

Method 1: Individual Written Feedback

Pros: Personalised; allows basic SPAG to be addressed.

Cons: Time consuming; difficult to make comments useful and concise; challenging to support all students to simultaneously respond.

It is only worth writing individual feedback if it is well structured, going to be acted upon and will make a difference to the student in the long term. Daisy Christodoulou reflects on this when she contrasts written comments with whole-class feedback in Whole-class feedback: saviour or fad? Ultimately the issue is how time-consuming it is to write these comments, and that students need to be trained how to interpret and respond to them; even then, if the comments are not composed well, or require students to learn something they do not know, the written comment will have no impact at all.

Method 2: Feedback Slips

Pros: Faster and less repetitive than written comments; element of some personalisation; captures most common errors and misconceptions.

Cons: Rely on students being able to respond independently; may not address individual gaps if uncommon error/misconception; difficult to address SPAG.

These are a kind of mid-way point between individual written comments and whole-class feedback. They address the issue of workload by removing the written element, and they stand out to students through highlighted sections. However, they still present challenges, such as how to support all students to respond to their comments independently and secure progress. This can, in part, be addressed by taking time to construct careful, well-structured scaffolds to guide students to respond. Feedback slips like this may be effective in addressing errors that students have made and have the knowledge to correct, but they do not allow misconceptions or a lack of knowledge to be addressed; a misconception needs teaching to undo the wrong knowledge and rebuild with correct knowledge, while a lack of knowledge needs teaching from the beginning.

Method 3: Whole-Class Feedback

Pros: Fast, efficient; does not require time-consuming comment writing; can directly address misconceptions through re-teaching; ensures students move forward in their learning as the teacher leads/scaffolds the feedback.

Cons: Requires careful design to ensure all students benefit.

When thinking about WCF, I actually struggled to think of many cons, and similarly there is very little negative research about it! Really, WCF is something we do often, and responsively, in the classroom as we adapt the Learning Journey to the developing needs of the class within the lesson. It is perhaps the more deliberate practice of doing so for a specific piece of work or concept in a subsequent lesson that teachers feel less proficient with – and I think this is because, as has often been the case, many schools have marking policies that insist on written comments in students’ books with a set regularity; even where this has been changed to focus on feedback, old habits die hard.

Once a teacher embraces the idea of WCF, the concern for the frequency specified in a school policy evaporates; the teacher does not need to check the regularity of feedback because a feedback and response routine is embedded into the Learning Journey for each of their classes, ensuring it is constructed to meet the needs of the class. The skill is in how feedback and response are executed to secure progress.

A useful starting point for developing a feedback and response routine is a WCF framework; there are many examples of these already discussed in detail online, for example Mark Enser, Making a Fuss of Feedback, and Nimish Lad, Thoughts on Feedback. The clear benefit of these is that they provide a quick summary to complete as students’ work is checked and assessed, which can then inform planning for the next lesson and the feedback to be shared.

As proficiency in WCF develops, it is important to be clear that it is not ‘filling in a form’; the framework is just a scaffold to support developing proficiency in analysing students’ work to inform the Learning Journey. Eventually the framework may not be needed, as the teacher may move directly to building the lesson sequence to address the learning needs of the class as they assess students’ work.

Crucially, WCF allows the teacher to deliver feedback in a wide variety of ways that enable them to be certain of a change or extension in students’ knowledge and understanding. It allows students to be guided through understanding feedback and responding to it, developing their understanding in a way that lets them know they are improving.

Conclusions

Whole-class feedback clearly has the greatest potential as the most effective and efficient method for moving students’ learning forward; however, there may be instances where carefully crafted written comments and other feedback will support the progress of students too.

It is clear that written comments are workload intensive and often have limited impact, as students need to understand how to interpret and then act upon their comments accordingly. If the comment requires addressing a misconception or knowledge gap, the chances of a student successfully undoing and rebuilding their schema independently are slim. The student requires support and modelling to deconstruct and rebuild their understanding, which can be provided by WCF.

A clear example of this would be in using a hinge-point question or analysing responses to MCQs. If the MCQ has been constructed well then all of the answers will be plausible and the wrong answers will indicate the root of a misunderstanding or misconception that students hold. If the teacher is aware of this then they can either prepare to respond instantly with re-teaching, modelling and non-examples to build the correct knowledge; or, if taking in the assessment at the end of a lesson, reflect on the incorrect responses and prepare how to address these in the following lesson, providing an opportunity for students to thoroughly consolidate the correct learning.

In contrast, WCF may be less effective in addressing SPAG, unless the error is common to the majority of the class or the grammatical error is evidence of a misconception. Written marking for SPAG, with an opportunity to correct the error, would be effective in guiding a student to correct a spelling, e.g. photosynthesis, and instructing them to independently use Look, Cover, Write, Check to learn it. However, if a grammatical error exposes a misconception, for example if a number of students write “sodium chloride is an ionic bond”, there is clearly a wider issue of what an ionic bond is compared to the actual compound, and how to write about them with scientific accuracy. This second issue would be addressed far more effectively through WCF, as the concept needs explaining clearly and modelling to change student understanding.

Overall, the evidence of the benefits of whole-class feedback clearly outweighs the argument for individual written comments. WCF has greater power in moving students’ understanding forward while also having the added bonus of being workload friendly – who wouldn’t want a system that is effective and efficient! The emphasis must be on feedback not marking!

References – The resources mentioned above, and more, which have helped shape my thinking.

What does the research say about marking and feedback? Harry Fletcher-Wood

Whole-class feedback: saviour or fad? Daisy Christodoulou

Marking and Feedback are not the same. David Didau

Is marking the enemy of feedback? Michael Tidd

Marking vs Feedback: The Journey. Lauren Murphy and Sunita Vyas

Whole-class feedback: fad or workload saviour? Adam Riches

Thoughts on Feedback. Nimish Lad

Making a Fuss of Feedback. Mark Enser

Five Ways of Giving Effective Feedback as Actions. Tom Sherrington

What to do after a mock exam: into the classroom with whole class feedback. Adam Boxer

Marking is a Hornet. Joe Kirby

The CRAFT of Assessment. Michael Chiles (book)

Implementing a New Assessment Model

My last (very long) blog on Assessment introduced the ideas I’ve been exploring over the past year to introduce a new approach to assessment that complements our improved curriculum.

This past half term we have focused on introducing our assessment strategy with KS3, Years 7-9. We had a vision for the half term of working to build a ‘narrative’ of student learning throughout a topic. Reflecting on this past half term, it has been really interesting to see how this is progressing; as a team we have definitely taken strides forward in achieving our goal of a more holistic and comprehensive assessment system. When looking in progress books there is a visible build-up of work demonstrating student understanding, rather than the previous collation of test papers. When we reach our first long-term assessment point this coming term it will be interesting to see how well our ‘Making the Learning Stick’ strategies have worked.

What has been really reassuring is that the things I anticipated would be our next steps have been exactly as expected; fortunately there have been no surprises!

Our next steps in the coming half term will be focusing on the following:

Tier 1 Assessment – The in-class diagnostic assessments we use to teach responsively. Teachers have complete autonomy over deciding the design of these, for example mini whiteboards, verbal questioning and ‘hinge’ questions. We will also be exploring the BEST resources, which are designed to diagnose preconceptions and provide response activities to support overcoming them.

Tier 2 Assessment – The in-class assessments that are used consistently across the year group to directly assess Age-Related Expectations. These will be our main focus to develop:

  • Consistency in narratives from class to class. This half term has been very much an experiment, and a number of the resources used have been developed as we delivered the topics.
  • Holistic conclusions on student progress in a topic. Having experimented with how a narrative of student learning forms we now have evidence that we can draw conclusions from. A key focus will be on how we standardise our conclusions and ensure consistency.
  • Assessments that cover the breadth of learning in a topic. To create an effective narrative that we can make evidence-informed judgements from we need to be thorough in mapping the assessments to the Age-Related Expectations.
  • Providing feedback in a timely and workload-friendly manner. This enables misconceptions to be addressed and student understanding in a topic to be enhanced more effectively. An obvious conclusion, but shifting from an end-of-topic-only assessment with feedback all in one go, to a model of ‘bite-sized’ feedback and response, takes time to adjust to and build as a habit.
  • Interpreting MCQs. Developing strategies to interpret the wrong answers to MCQs and use these to address misconceptions and misunderstandings.

How we are addressing these things this half term:

  • We have shared model examples from the first half term of what a narrative might look like, with the aim of refining how assessment for the topic is presented.
  • Implementing a feedback sheet to support making holistic conclusions and providing feedback that is workload friendly.
  • Continuing to review and develop resources to support assessment of the topics being taught.
  • BEST resources have been mapped to the topics being taught and have Teacher Guides to assist trialling and implementation with classes.
  • Shared explanations on how MCQs have been constructed and examples on how to interpret class outcomes.

Subject Knowledge and Misconceptions in Science – Part 2

Part 1 – considering specialist v. non-specialist teaching in science can be found HERE.

Having had my ambitions to explore subject knowledge and misconceptions across the KS2/3 transition with our feeder schools derailed by the pandemic, I settled for a more in depth focus over the last seven months on the KS3 curriculum. As a faculty we have moved forward together to improve our curriculum and ensure it lays the foundations for developing deeper understanding at GCSE.

February to July – Curriculum Review and Improvement

The first phase, between February and July, focused on reviewing the current KS3 curriculum in subject specialist teams. The following key areas were focused on:

  • Students need to ‘know stuff’ to be able to understand concepts – so, what do students need to know, and begin to understand, at KS3 to build a detailed, interlinked science schema during KS4?
  • In tandem with knowledge students need to develop skills, both Working Scientifically and mathematically – What do students need to ‘be able to’ do from a skills perspective?
  • Finally, how should the identified knowledge, understanding and skills be sequenced to create a coherent learning journey and help students make links to previous learning and between specialisms?

Once the three strands of the science curriculum had been clarified and sequenced they were framed in a set of Age-Related Expectations and success criteria for each topic. The Age-Related Expectations focus on what students need to ‘know’, ‘understand’ and ‘be able to do’. The success criteria are the granular details that enable a student to achieve the AREs – the ‘building blocks’ that will construct a schema.

In constructing AREs and success criteria to frame the curriculum it was essential to make sure that these scaffolds were built in such a way that they would guide non-specialist staff to avoid common pitfalls and misconceptions. Thus, it was crucial that they achieved two key things: 1. A coherent sequence for the learning journey in terms of topic order, while maintaining sufficient teacher autonomy to sequence the lessons of a topic to suit their class, and 2. Correct and consistent use of science specific language at all times.

September – Implementing the Curriculum

Preparing so thoroughly for September has had a two-fold impact. First, it has set us up with a coherent and logically sequenced curriculum which we are now delivering across Years 7-9. Second, and more importantly, there is a real sense of ownership and responsibility throughout the team to deliver our curriculum to a high standard across all specialisms.

Since September it is now common for a member of staff, when teaching out of specialism, to seek out a specialist to help them sequence and understand the learning journey; to check ‘why a specific order’ and ‘how to explain and model a concept’.

Having no prescriptive lesson order has created a fine balancing act between instilling confidence in staff to design their own sequence while meeting the needs of non-specialists who feel they ‘don’t know where to start’ on an unfamiliar topic. The appreciation of ‘one size does not fit all’ in terms of sequencing is improving, and although it feels like taking away the comfort blanket for the non-specialist, avoiding giving a prescriptive list is driving rich T&L discussion and careful thinking about the learning journey more than ever before. This in turn, is helping us to be better prepared to address misconceptions that we may not anticipate and making sure that we think about the ‘big picture’ of where the learning is coming from and where it is leading to.

November – Moving Forward

The next steps in the journey to improve teaching around misconceptions link closely to our whole school focus this year on Access and Challenge. The EEF Improving Secondary Science Report has been invaluable in helping to frame this, thinking specifically of Strand 1, Preconceptions:

  • Developing greater fluency in understanding students’ misconceptions in non-specialist subjects – especially in how to identify preconceptions and why students have these ideas. What is the baseline of each student? How are we going to then construct an accessible learning journey that builds from the baseline and dispels misconceptions?
  • Developing a repertoire of techniques to actively address misconceptions – how can cognitive conflict and discussion be used to tackle misconceptions and challenge students to think deeply to further their knowledge and understanding?
  • Ensuring we provide time for addressing misconceptions and actually changing student thinking – a quick correction will not undo a hardwired misconception. If misconceptions go unaddressed, they form barriers to students accessing more challenging learning.

Assessment

Following a year of carefully sequencing our curriculum and focusing on improving teaching techniques to deliver content effectively, my aim for the coming year is to review how our assessment model can be improved to complement our curriculum content. Assessment has already begun to evolve as a result of our improved curriculum delivery, but it now needs to change rapidly to secure real success from the techniques we have employed.

This blog focuses on my thinking and planning for how assessment will look in science this coming academic year.

The current assessment model that we have used faculty-wide in science has become unfit for purpose. In fact, when asking myself the crucial question, “What is the purpose of each assessment?”, I found my answer really unsatisfactory. The bottom line: because our end-of-topic assessments fell with enough regularity to meet the whole-school assessment policy, we were trying to use a single assessment for multiple purposes. It was supposed to work summatively, to provide an overall grade for the student in that topic, which we would then combine with their other topics to give an average grade for science. It was also supposed to work formatively, to show the teacher where each student had knowledge gaps. We even produced a personalised feedback sheet for each student, with improvement tasks so that they could work independently, and spent lesson time doing this. I’ve learnt so much about direct instruction vs. discovery learning in the past couple of years that to think we assumed students could learn the thing they found the most difficult independently feels totally illogical now!

This is not to say it is wrong to have an assessment from which both formative and summative conclusions can be drawn; however, we were using an assessment at the end of a topic to try and do ‘everything’. When I evaluate the questions we used, some were well designed, and with confidence I can state whether a student has understood the knowledge or the concept. However, for some questions, which I had previously considered suitable, I realised it was not possible to tell whether the student understood the concept, as it may be that their ability to read and decode the question was preventing them from answering correctly. Secondly, trying to assess understanding of practical skills – what if the student doesn’t have the knowledge that underpins the experiment being used in the assessment? Dylan Wiliam’s opening chapter in The researchED Guide to Assessment delves into this more deeply in considering the validity and reliability of assessments.

So, our model just hasn’t worked well enough; it did not effectively inform the learning journey and secure better progress for all students. To achieve this, it is vital the classroom teacher embeds and acts upon the outcomes of various short assessments throughout a topic, drawing formative inferences that immediately alter the lesson or future lessons. Our model also contained no longer-term recall beyond a single topic; I could not say whether students had retained knowledge over a term, or a year.

Hence the need for change, and on reflection some small changes to assessment have been part of our curriculum evolution over the past few years, for example: introducing a specified practical in each KS3 topic and giving specific feedback to develop skills; and the use of retrieval practice strategies to make knowledge stick – see my blog on retrieval practice. However, with a clear learning journey now sequenced in our curriculum and an appreciation of securing the building blocks before progressing, it is only right that we update our assessment approach to complement our learning strategy.

So, my reflection on improving assessment has taken me on a journey over the past few months in seeking out lots of research and views on assessment. The rest of this blog presents some, but not all, of my thoughts (there are too many) on the different sources I have visited and concludes with an outline of the approach to assessment we have adopted for this coming year.

References

The ResearchED Guide to Assessment, Sarah Donarski

This book really has been an incredible source in encouraging me to think more widely about the assessment challenges I was facing and often enabled me to come up with solutions. Below are just a few of the key ideas I took away from my reading:

  • Dylan Wiliam’s chapter helped my thinking: it is not the assessment itself that should be called summative or formative, but the inferences drawn from it. Although I had been doing this without really realising it, naming the distinction brought greater clarity to my considerations of the purpose of each assessment.
  • Claire Hill introduced me to ways of utilising MCQs, and also to asking students to give a ‘confidence score’, which can encourage ‘hypercorrection’ when students respond to feedback. The key is how this is implemented to maximise impact (a sketch of how this might look in practice follows this list).
  • Ruth Powley encouraged me to reflect on my own in-class practice – do I probe higher attaining students more thoroughly than lower attaining students? Do I give all students the same ‘wait time’ to answer a question? Do I give the same amount of detail in my feedback?
  • Rich Davies provided a fascinating insight into how ARK academies have implemented assessment, and has given me lots to think about in terms of how some of the techniques could be used effectively on a smaller scale (some couldn’t, due to sample size).
  • Kris Boulton introduced me to the idea of the difficulty model and the quality model of assessment. The science exams are a good example of the difficulty model, except for the 6 mark questions, which also look for quality of answer. This shapes how I will decide on the questions I set – which type of outcome do I want to achieve?
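
Purely as an illustration of Claire Hill’s confidence-score idea, here is a minimal sketch in Python (the question IDs, data and threshold are all hypothetical, not taken from her chapter) of how MCQ responses with confidence ratings might be summarised to highlight the high-confidence errors most worth targeting in feedback:

```python
# Illustrative sketch only: question IDs, data and threshold are hypothetical.
# Each response records whether the answer was correct and a self-reported
# confidence score from 1 (guess) to 5 (certain).

responses = [
    {"question": "Q1", "correct": True,  "confidence": 5},
    {"question": "Q2", "correct": False, "confidence": 5},  # confident but wrong
    {"question": "Q3", "correct": False, "confidence": 1},  # unsure and wrong
    {"question": "Q4", "correct": True,  "confidence": 2},  # right but unsure
]

def feedback_priorities(responses, high_confidence=4):
    """Group responses to guide feedback.

    High-confidence errors are the prime candidates for 'hypercorrection':
    students who were certain and wrong tend to remember the correction well.
    Low-confidence correct answers suggest fragile knowledge worth reinforcing.
    """
    confident_errors = [r["question"] for r in responses
                        if not r["correct"] and r["confidence"] >= high_confidence]
    fragile_correct = [r["question"] for r in responses
                       if r["correct"] and r["confidence"] < high_confidence]
    return {"address_first": confident_errors, "reinforce": fragile_correct}

print(feedback_priorities(responses))
# {'address_first': ['Q2'], 'reinforce': ['Q4']}
```

Any real implementation would depend on how responses are actually collected in class; the grouping above simply shows where hypercorrection is most likely to pay off.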

There were many more ‘take-aways’ from every single chapter of this gem of a book, but the above gives a flavour of how it has greatly informed my thinking.

Responsive Teaching: Cognitive Science and Formative Assessment in Practice, Harry Fletcher-Wood

This was a book I read very early on in my curriculum planning; it opens with chapters about planning a unit and planning a lesson. It provided me with some initial ideas that I was then able to take and use to shape a coherent curriculum sequence.

As the title suggests, the ideas in this book fed into my planning on what assessment looked like day-to-day in the classroom. It presented ideas on how to use responses from students to immediately alter and influence the learning journey within a lesson or in preparation for the next lesson. The checklists at the end of each chapter in this book served me well in beginning to think about my own personal practice before exploring assessment further.

The CRAFT of Assessment, Michael Chiles

A quick read (a good thing!), crammed full of useful tips and ideas for implementing assessment. In this book Michael Chiles has concisely constructed the CRAFT method for thinking about the process of structuring assessment, from the delivery of knowledge through to the feedback process that moves students forwards.

From this book I took away a wide range of ideas to explore further, including techniques for encouraging students to retrieve knowledge and reflect on what they know, and time-efficient methods of feedback to move students forward in their learning. This book works really well to inform the individual classroom teacher on how to truly utilise assessment as a tool for progress.

Making Good Progress, Daisy Christodoulou

Each of the books I have read to inform my planning for assessment supports and reinforces the others on what works well to create an assessment structure that moves students forwards in their learning. At the heart of all of these resources is the idea that assessment is part of the curriculum and not something we ‘do to students’.

Daisy Christodoulou’s Making Good Progress is no exception; it prompted me to again think carefully about the purpose of assessments and how to use them to achieve the best outcomes for students. In particular, it made me think about how and when assessments are used, their frequency, and the breadth of content assessed at different points.

Where Next? Assessment in the Principles of Instruction, Dr Helen Skelton

When I came across this blog in my wider reading, it was incredible how much Helen Skelton’s experience resonated with my own train of thought. As she described her experience of assessment it became a checklist of my experience too:

  • End of topic assessments to generate report data – check.
  • Formative assessment focused on recently learned content – check.
  • Formative assessment with little thought given to its purpose and how it would actually be used formatively – check.
  • Rosenshine’s Principles of Instruction and Tom Sherrington’s Rosenshine’s Principles in Action stimulating my reflection – check.

My subsequent actions in the classroom also mirrored the beginning of Helen’s journey:

  • Introducing low stakes retrieval quizzes at the start of every lesson to make factual knowledge stick.
  • Learning how to build these questions more effectively to suit my classes, using tasks that assess prior learning essential to the focus of the new lesson – giving me a baseline of which ‘building blocks’ to re-teach before launching into new content.
  • Improving how I question students so that I can glean answers from all students and not just an individual via cold calling. I rediscovered my love for mini whiteboards in the past year!
  • Ensuring my sequence of learning was in a logical order and broken down into sufficiently small ‘building blocks’.
  • Using opportunities for student practice that allow students to repeat the process and that become progressively more challenging.

The real nugget I took from Helen’s blog, and also her ResearchED Norwich talk, was how she has developed her use of exam style questions. I am looking forward to trialling exam questions in a greater variety of ways to help me diagnose students’ knowledge and skills more effectively.

Science Assessment Model

As a school, our new assessment model takes into consideration all of this thinking about the types of assessment, and constructs a three ‘tier’ framework for embedding assessment in the curriculum (summarised as a short sketch after the list below):

  • Tier 1 – the day-to-day, in-class assessment by the teacher, measuring the progress of students and adapting the learning journey accordingly.
  • Tier 2 – the key points throughout a sequence of learning where it is pivotal to check whether students have understood a concept. The assessment is used formatively and feedback to students is used to address gaps and move them forward.
  • Tier 3 – summative assessments that measure knowledge of students over a term and, as the year progresses, recall from previous terms. Data will produce scores which can be used to compare each individual in the year group. Linked to this, core subjects will also use GL Assessments to allow us to align our expectations with those of a national cohort.
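
To keep the three tiers in view while planning, here is a minimal sketch in Python summarising them as a simple lookup (the field names and wording are my own shorthand, not part of any formal school policy):

```python
# Illustrative shorthand only: the field names are my own and the descriptions
# paraphrase the bullets above; this is not a formal policy document.

ASSESSMENT_TIERS = {
    1: {"scope": "day-to-day, within an individual lesson",
        "who": "class teacher",
        "inference": "formative: adapt this lesson and the next"},
    2: {"scope": "planned checkpoints throughout a topic",
        "who": "class teacher at hinge points",
        "inference": "formative: feedback to address gaps before moving on"},
    3: {"scope": "end of each term, cumulative across the year",
        "who": "faculty-wide, alongside GL Assessments",
        "inference": "summative: retention over time and cohort comparison"},
}

for tier, detail in ASSESSMENT_TIERS.items():
    print(f"Tier {tier}: {detail['scope']} -> {detail['inference']}")
```
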
Tier 1 – Assessment in an Individual Lesson

Tier 1 considers how assessment is employed at a granular level: it checks student progress against the success criteria of a lesson and informs the learning journey within that lesson and into the next. It enables a teacher to act responsively to the needs of their class.

This is how Tier 1 assessment might look in a lesson:

  • Start of the lesson: use of retrieval practice techniques to ascertain prior understanding essential for the lesson I am planning to deliver.
    • This is something we introduced as a faculty to all of our lessons: a starter quiz involving a range of recall. I have been working to develop carefully crafted sets of questions that draw on recall from the previous sequence of learning, and from links to other learning, to check retention and confirm that the class are ready to commence the new learning.
    • Example: When teaching water purification techniques I may begin a lesson with the following set of questions (a sketch of how I might tag these questions to the building blocks they check follows this list):
      1. Write the definition of osmosis – I ask this question to check that they know what osmosis is, so that they are ready to learn about reverse osmosis. It also works nicely as a cross-topic link with biology and some long term recall.
      2. What is the difference between a mixture and a compound? – This question checks whether students have the prior knowledge needed for me to encourage them to treat the impure water as a mixture.
      3. Draw a dot and cross diagram of a chlorine molecule – I use this question as an opportunity to assess long term recall of paper 1 knowledge near the end of Year 11.
      4. What is the difference between aerobic and anaerobic respiration? – Another cross-topic link to biology. If students can answer this question, they will be able to appreciate and understand what is happening during both aerobic and anaerobic digestion in sewage treatment.
      5. Name two examples of salts. – I use this question to assess long term recall of another paper 1 chemistry topic, and it makes a small link to dissolved salts in potable water and the process of desalination of salt water.
  • Retrieval Practice throughout the lesson: In my planning I will identify key points within the lesson where I can pose ‘hinge questions’ or provide students with an opportunity to recall their learning from segments of the lesson.
  • End of Lesson Learning Check – I will ensure that by the end of the lesson I understand clearly where students are in their learning so that my next lesson will start and continue coherently. Some techniques I may use here include: a mini whiteboard quiz or a printed exit ticket that I collect for marking.
  • Finally, Extended Learning (homework). In some situations I may use homework as an opportunity to gather assessment from students – with the caveat that they may have Google as their assistant! However, if I set homework such as ‘plot a graph’, knowing that I am going to ask students to do this again next lesson, then a) they cannot simply get the answer from Google, and b) I can very quickly identify at the start of the next lesson what scaffolding I will need to provide to help students improve their graph plotting skills.
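
To make the idea of carefully crafted starter questions more concrete, here is a minimal sketch in Python of the water purification starter above, with each question tagged to the building block it checks (the tagging scheme is my own shorthand, purely illustrative):

```python
# Illustrative only: the "checks" and "link" tags are my own shorthand for the
# purposes each question serves, as described in the example above.

starter_quiz = [
    {"q": "Write the definition of osmosis.",
     "checks": "readiness for reverse osmosis",
     "link": "cross-topic (biology) + long-term recall"},
    {"q": "What is the difference between a mixture and a compound?",
     "checks": "impure water can be treated as a mixture",
     "link": "prior knowledge needed this lesson"},
    {"q": "Draw a dot and cross diagram of a chlorine molecule.",
     "checks": "retention of paper 1 bonding",
     "link": "long-term recall"},
    {"q": "What is the difference between aerobic and anaerobic respiration?",
     "checks": "readiness for aerobic/anaerobic digestion in sewage treatment",
     "link": "cross-topic (biology)"},
    {"q": "Name two examples of salts.",
     "checks": "link to dissolved salts, potable water and desalination",
     "link": "long-term recall"},
]

# A quick self-check while planning: does the quiz mix different recall types?
link_types = {item["link"] for item in starter_quiz}
print(f"{len(starter_quiz)} questions covering: {sorted(link_types)}")
```
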
Tier 2 – Assessment Throughout a Topic

Possibly the most challenging Tier to implement; these are the assessments that form the narrative of progress and learning for each student. The information gathered from these will inform the feedback cycle the teacher provides for the class and individuals.

Across a topic this would include:

  • Checking Prior Knowledge
    • To identify the baseline starting point of each student in the class.
    • This could be completed at the end of the final lesson of the previous topic, at the beginning of the first lesson of the new topic, or potentially as an extended learning task at home. I favour using all three of these opportunities: the end of the previous topic to scope out an overview of their understanding; at home to delve a bit deeper (although this risks Google providing the answers); and diagnostic assessment at the beginning of the first lesson, focused on whether students have the building blocks to access the learning I have planned, enabling me to act responsively to their needs from the outset.
  • Carefully planned checkpoints.
    • Throughout the topic I will need to know whether students have understood a particular concept in order to move on – threshold concepts, ‘hinge questions’.
    • I will be using our faculty ‘Age Related Expectations’ (AREs) to help me identify these points and then construct a suitable assessment to check understanding. As a team we will develop these so they are consistent across classes, allowing us to compare groups and individuals and measure progress.
    • I intend to use a range of strategies, for example: MCQs to check knowledge; extended writing to check understanding; and practical write-ups to check Working Scientifically and Maths Skills.
  • Exam Practice
    • Threaded throughout the topics I teach, I will also be looking to include opportunities for students to experience exam questions. I will need to select these carefully to ensure they cover knowledge I know students already have, so that the focus can be on ‘how’ to answer the question rather than on recalling the knowledge first.
  • End of Topic MCQ
    • A summative assessment to check specific knowledge, understanding and skills in this single topic. The outcomes of the MCQ will be used by the class teacher to inform topics for interleaving and recall later in the term (see the sketch after this list).
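
As a sketch of how the end-of-topic MCQ outcomes might feed forward into interleaving, here is a minimal Python example (the sub-topics, scores and threshold are hypothetical, purely to illustrate the mechanism):

```python
# Hypothetical data and threshold, purely to illustrate how end-of-topic MCQ
# results could flag sub-topics for interleaving into later starter quizzes.

mcq_results = {            # proportion of the class answering correctly
    "separating mixtures": 0.86,
    "covalent bonding": 0.55,
    "balancing equations": 0.62,
    "conservation of mass": 0.91,
}

def topics_to_interleave(results, secure_threshold=0.75):
    """Return sub-topics scoring below the threshold, weakest first.

    These are the candidates to revisit in retrieval starters and to recheck
    in the end-of-term (Tier 3) assessment.
    """
    weak = [(topic, score) for topic, score in results.items()
            if score < secure_threshold]
    return sorted(weak, key=lambda pair: pair[1])

for topic, score in topics_to_interleave(mcq_results):
    print(f"Revisit '{topic}' ({score:.0%} secure)")
```
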
Tier 3 – Assessment Across a School Term

In line with our school assessment policy, at the end of each term students will complete an assessment of their learning from the term, and as the year progresses, longer term recall too.

These assessments will need careful planning and construction to ensure we can draw valid inferences from them. They will include:

  • A section of short answer questions to assess knowledge and understanding of concepts. These questions will delve more deeply than MCQs, using exam question banks to help with design and then adapting them to ensure accessibility.
  • A section on Working Scientifically skills. The knowledge required by these questions must already be secure for students, so that we can effectively assess their skills with other barriers minimised.
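
Purely as an illustration of how each term’s paper could sample more of the earlier terms’ content as the year progresses, here is a minimal sketch in Python (the weightings and mark totals are hypothetical, not our actual blueprint):

```python
# Hypothetical weightings, purely to illustrate how each term's paper could
# sample more of the earlier terms' content as the year progresses.

TERM_BLUEPRINTS = {
    "Autumn": {"this term": 1.00},
    "Spring": {"this term": 0.70, "Autumn": 0.30},
    "Summer": {"this term": 0.60, "Spring": 0.25, "Autumn": 0.15},
}

def marks_per_section(term, total_marks=50):
    """Split a paper's marks across current and earlier terms' content."""
    blueprint = TERM_BLUEPRINTS[term]
    return {section: round(total_marks * weight)
            for section, weight in blueprint.items()}

print(marks_per_section("Summer"))
# {'this term': 30, 'Spring': 12, 'Autumn': 8}
```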

It has been an exciting start to the year introducing this idea of an assessment narrative that builds throughout a topic – so far I can already see how I am better informed about the progress of my students and that I am acting even more responsively to their needs. As a team we have been embracing the production of the assessment materials and working together to grasp the logistics of making assessment a tool to drive progress rather than solely a process to measure outcomes.

Subject Knowledge and Misconceptions in Science – part 1.

How many science teachers teach only their specialism? In my experience, and as identified by the DfE’s 2016 report on the impact of specialist and non-specialist teaching, the general answer is that the hours taught by ‘specialists’ increase the further through the key stages students progress.

Beyond this the DfE report is difficult to use from a science perspective as ‘general science’ classifications do not allow for a clear interpretation of the biology/chemistry/physics split and therefore do not allow us to understand whether the high proportion of ‘specialists’ in science are teaching within their specific field or not.

The evidence from the DfE suggests that the level of ‘specialist’ teaching has no impact on outcomes in science; however, this is likely to be influenced by the inability to measure the number of teachers delivering outside their scientific field. Is it possible that there is considerable non-specialist teaching which is therefore causing this outcome?

Anecdotally, many teachers talk about teaching all sciences at KS3 and then becoming more specialist at KS4/5; and they often talk of how being a non-specialist makes developing student knowledge, understanding and skills in science considerably more challenging when they themselves lack confidence in the subject material. Niki Kaiser has been conducting her own research along similar themes, exploring the ‘confidence factor’ between specialists and non-specialists. Initially it looks like this factor does exist and also has an impact on the fundamental subject knowledge (threshold concepts) of non-specialists versus specialists. The final outcomes of this study are something I look forward to reading in more depth.

My hypothesis when thinking about these ideas is that one of the key factors influencing student progression in science is the subject knowledge and confidence of primary and secondary school teachers teaching outside of their specialism. If this aspect of early science education could be addressed to create more ‘specialists’, perhaps through targeted CPD, then we may be able to overcome the misconceptions and knowledge gaps that seem to persist among students well into GCSE and A Level studies.

For sure, this is not a new thought, and it has clearly been considered by others (you only have to do a simple internet search, or ask another professional, and anecdotal evidence and detailed studies are plentiful), but there remains a theme of very little consideration for science teaching before secondary school; and even in secondary school it is KS4 that most often has the greatest focus. We as teachers spend much of our time at KS4 trying to undo, reconstruct or build again from the beginning concepts that could easily have been secured earlier, had the initial teaching not introduced such a wide variety of misconceptions.

The exciting recent focus on curriculum design brings much of this thinking to the fore, and the role of KS3 in preparing students for GCSE has become a more common talking point – whether the discussion is about ‘a 5 year curriculum’, ‘mapping the curriculum’, or understanding the ‘learning journey’. All of these ideas emphasise the importance of KS3 and, as a result, the importance of teacher subject knowledge and the opportunity to specialise.

Over the coming weeks and months I aim to explore these ideas further, especially which misconceptions exist in primary and KS3 science, and how specialist versus non-specialist teaching influences the prevalence of these misconceptions in students’ understanding of science.

Learning Goals and Success Criteria

For every lesson I have taught since I became a teacher, I have created learning goals/objectives/intentions (whichever phrase you choose to use). It is drilled into every teacher from the moment they begin training that these are the foundation of each lesson and the starting point for planning. Knowing that these are so important, why is it that so often you can walk into a classroom, ask a student ‘What are you learning today?’, and receive a response focused on the task rather than the learning? Somewhere between the teacher’s planning and the student in the classroom, the understanding of the purpose of the lesson is lost.

Reflecting on this, there are a number of points between the inception of planning the lesson and the learning occurring in the classroom where the learning goals can fall by the wayside.

Before the beginning – does the teacher understand what learning goals are?

For something so important, I do not feel the knowledge and skills required to write effective learning goals are really taught – I do not recall doing much at all on this during my PGCE, and it is certainly something I discuss regularly with ITT and NQT teachers.

It is vital that in writing learning goals, they are exactly that – learning goals. It is incredibly easy to slip into a task focused approach and to think of learning goals as a tick list. To construct effective learning goals it is important to think more widely about the topic rather than looking at the lesson in isolation.

At the start – deciding on the learning goals.

To begin with it is vital to consider the ‘big picture’ and the context of what you want your students to learn. If the topic is about enzymes, what do you ultimately want them to know and understand about enzymes? What skills do you want them to develop along the way?

At this stage, do not become specific; maintain a broad focus and use phrases such as ‘to know’ and ‘to understand’. It may also be appropriate to use certain command words which, when combined with the context, create a complex combination and sequence of learning, for example: ‘to compare’; ‘to analyse’; ‘to explain’.

It is also important at this point to consider how this learning goal links to previous learning and future learning. Where does this phase of learning fit with the ‘big picture’ of the whole learning journey?

Planning the lesson – the learning that is going to occur.

Once the learning goal has clarity, it is time to consider the learning journey students will take to reach their end goal. Essentially, what are the building blocks of knowledge and skills that students must acquire to be successful? These steps are the success criteria.

Success criteria should still not be task focused; they will utilise a wide range of verbs to articulate the knowledge and skills students will gain during this phase of their learning journey.

When preparing success criteria it is also important to consider the previous knowledge students are going to draw on and link to their new learning. For example, if the learning goal is focused on chromatography, students are going to need to have knowledge of pure and impure substances; this is likely to be something they have learned about previously, perhaps when studying mixtures at KS3.

Linked to this part of the planning stage and the success criteria are the tasks that you are going to use to secure progress and ensure that students are learning. However, it is important to ensure that the tasks fit the success criteria and not vice versa.

The delivery of the lesson – communicating to students.

As a teacher, having constructed a clear learning journey and planned a superb lesson where you know that students have every opportunity of success, there is nothing more demoralising than hearing students asking ‘what are we doing?’ or ‘why are we doing this?’

It is vital that the learning goals and success criteria are communicated to the students, and communicated in such a way that students have absolute clarity about the purpose of their lesson. There will always be the challenge of bringing the disengaged student on the learning journey, but giving a clear learning goal, contextualised within the learning journey, brings purpose to the lesson. Add in a series of criteria that students understand, can self-evaluate against, and can clearly achieve, and you have the opportunity to really engage students in their own learning. For this recipe to succeed the teacher must also deliver with conviction and enthusiasm for their subject, showing real belief and excitement at the learning journey about to take place!

The student – the metacognitive abilities of the student.

This leads me to what I see as potentially the final hurdle in ensuring learning goals and success criteria are effective. Even after the learning goals and success criteria have been shared, students may still struggle to articulate the learning journey and their learning. This is a fair assumption: it is not easy to sit in a lesson focusing hard on learning something new while also being expected to know how this new information fits with the knowledge you already have.

This is where I have personally been experimenting the most: how can I make the abstract nature of the learning journey more concrete and tangible for students, to reduce the cognitive load required of them and enable them to more rapidly build the complex and interlinked schema I want them to achieve?

As I train students to develop the ability to make links in their schema independently, I need to show them where the links are; this has led me to use a very simple tool each lesson to share this with students – a ‘smart art’ graphic in PowerPoint.

The smart art that I use is the arrow with three separate boxes on it. The topic for the current learning goes in the centre box. I then reflect on my planning and complete the first box with the links to previous learning (using retrieval practice strategies to ensure students have this expected baseline of understanding, and reteaching any gaps as necessary). Into the final box I add where this learning is leading; this is usually the most challenging box to write, and can vary from a scientific concept that will be learned in the future to a real world context.

The key to succeeding with these has been how they are executed in the lesson: being ready to elaborate on what each box means, often by questioning students to contribute the prior knowledge section, then taking their responses and linking the new and future learning to them. Done carefully, students will feel like they are already a significant way along their Learning Journey to their ultimate goal.
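
For illustration, here is a minimal sketch in Python of the three-box structure described above (the topic and box contents are made-up examples rather than a real lesson plan):

```python
# Illustrative sketch of the three-box 'learning journey' arrow described above.
# The topic and box contents are examples; in practice they come from my
# planning for the specific lesson.

def learning_journey_boxes(prior, current, future):
    """Return the three boxes of the arrow: where we have been, where we are,
    and where this learning leads (a concept to come, or a real-world context)."""
    return {"previous learning": prior,
            "today's learning": current,
            "where this leads": future}

boxes = learning_journey_boxes(
    prior="Pure and impure substances; mixtures at KS3",
    current="Chromatography and interpreting chromatograms",
    future="Analysing unknown substances in Chemical Analysis",
)

for label, content in boxes.items():
    print(f"{label}: {content}")
```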

Some examples of the smart art I have used include:

  • The first lessons of Atomic Structure in Physics.
  • Changes of State in the Particle Model of Matter topic.
  • Infectious Diseases at KS3.
  • Chromatography in the Chemical Analysis topic.

What I have found in using these images when introducing learning goals and success criteria is that I am quickly able to instil confidence in the students. Knowing that what they are going to learn today is linked to something they already know, and using the start of the lesson to retrieve that knowledge, engages them in sharing information and makes the new knowledge less daunting.

I have also noticed increased engagement from students. For example, the class studying atomic structure and isotopes started thinking about where their learning was going to lead, asking questions about radioactivity and what they would find out in future lessons.

Students have also been much clearer in articulating their learning journey: they refer to the diagram as a prompt, but they are now able to say with greater certainty what they are learning, how it links to previous knowledge, and the purpose of the lesson and their learning. It has also had a great impact on my own planning and teaching: I have greater clarity about the learning journey, I think more widely about the sequence of learning that is going to take place, and I am able to act more responsively as I formatively assess student knowledge and secure understanding before progressing.
