By Anish Ramachandran | Chief Executive Officer (Hon.)
Can tests be fun? (Spoiler: Yes!) Here’s our experience assessing 6,000+ students in Anekal.
When we started our English teaching programme in Anekal in 2010, our entire philosophy was centred around making learning fun and exciting. We wanted to ensure that classrooms were places of joy.
It's been more than 13 years since we started our programmes with that thought. While we have always conducted assessments, over the course of the last year we almost completely revamped our approach to base it on established assessment standards and principles. So when we decided to launch our student assessment programme this year, we were pretty nervous.
This was a litmus test of our hypothesis! We would finally be able to see whether our teaching methods were effective, how effective they were, and where we were falling short. (Did we mention how nervous we were?)
The assessments we wanted to carry out were pretty ambitious. OBLF works in 90 schools in Anekal taluk, with more than 6,000 students. We wanted to comprehensively check the listening, speaking, reading, writing, vocabulary and grammar levels of every student in these schools.
And unlike other sampling tests, these were hour-long individual assessments and covered each of these language skills.
We realised quite early that if we had any hope of pulling this off, we would need to call in some reinforcements.
Enter, the teachers.
We engaged 64 teachers to carry out these assessments with us. This meant getting the teachers together and carrying out a 10-hour intensive training session. Then we were set to go.
But imagine our surprise when we understood that the teachers themselves were afraid of tests! Baffling as this sounds, it makes sense when you understand the background.
The culture of tests, and a fear of failure
Here’s something that is important to understand: Even in a relatively developed, semi-rural area like Anekal, school teachers find English language teaching daunting. Many of them are first-time teachers, and English is still a foreign language to them.
Second, each of these teachers has their plate full! They visit at least 3 to 4 schools daily. In the public education system, they shoulder multiple responsibilities, from looking after student enrolment and attendance to making sure that children don't drop out, and much more.
In this context, the teachers see poor learning levels as a personal failure, as a reflection of their work performance. Simply put, they were afraid of the consequences.
Our first step, then, was to reassure the teachers.
We explained why we were carrying out the assessment. That it was about improving the programme as a whole. No fingers would be pointed, or blame assigned. This was a constructive exercise, with the final goal of improving both teaching and learning.
Another problem that we found and tackled during the initial process was the question of bias. We stepped in here to make sure the biases didn’t dilute the assessment results.
In most cases, the teachers knew the students quite well. So they tended to interpret or discount the results! (Here are some scenarios we saw: “This student is usually very active, why isn't she answering now?” Or in the case of a poor performance, “She's going through so much at home. Maybe that’s affecting her performance.”)
While this personal touch is invaluable, we intervened. We pointed out and acknowledged the biases we observed, and worked with the teachers to keep them from colouring the outcome of the assessments!
The second (pleasant) surprise
While the teachers were apprehensive about the tests, the students were positively excited. That was because the 'OBLF tests' were so different from the examinations they were used to!
With the ‘OBLF tests’, there was no atmosphere of fear or pressure.
Unlike regular exams, where the cable TV at home is disconnected, mobile phones are stowed away and 'special' study arrangements are made, our assessments were carried out on a regular school day.
The students were not explicitly graded or punished. In fact, many of them looked at the test as a form of game or play — a reassuring sign if there ever was one!
So that's how our team fanned out across the schools throughout the course of February and administered the tests.
Even as the results have started trickling in (they aren't fully in yet), we are noticing a few positive trends.
1. There has been an overall shift from a baseline score of 42% to an endline score of 58% across the student population under OBLF’s Elevate programme. Students have shown a minimum of 15% growth in every learning skill — listening, speaking, reading and writing. A skill-wise break-up of endline scores shows students scoring 72% in listening, 65% in reading, 45% in writing and 50% in speaking.
2. The data has also shown a clear correlation between regular attendance and improved performance.
3. The students under our SOLVEX OBLF tablet-based learning programme showed greater improvement across the four skills than the group that was not part of this programme. This further cemented our guiding principle that supplementary forms of learning and exposure through technology, complementing conventional in-class teaching, result in accelerated learning among students.
4. Among these students, we are seeing a 20% improvement in listening skills, a 12% improvement in reading, a 15% improvement in writing and a 17% improvement in speaking.
What's next?
Once the data from the tests are fully processed, the students will be assigned to teachers and resources appropriate to their learning level. We will also tweak the curriculum for various cohorts.
So that's the story of how we spent our February working to make tests lively and interactive!
The aim is to help both the students and the teachers — and we learnt to have some fun along the way!
In retrospect, this month made all of us smile, as we realised we might just pass the litmus test by a good margin! We have a long way to go in our journey of change, but these small victories along the way are well worth celebrating!