I taught my students how to cheat using ChatGPT and they hated it

If you’re on any teaching-related social media, you’ve been inundated by posts related to ChatGPT. This week, I decided to jump on the bandwagon and try a few lessons using the new tool.

What is ChatGPT?

ChatGPT is an artificial intelligence chatbot. The premise is that you can ask ChatGPT a question, and it will answer it. It was created by the OpenAI initiative, and a short run-down of its design can be found here. The key takeaway for teachers is:

ChatGPT was built using text found on the internet. That means it’s essentially as smart as your average internet poster.

For teaching, this raises several things you should be aware of:

  • It will give the most likely internet answer for a question. That means that any past years' exam questions will almost certainly be answered correctly, because students will likely have asked how to answer them on internet forums and somebody will have posted the mark schemes.
  • It will also magnify common misconceptions. Because common misconceptions are, by definition, common, they will be repeated often and weighted more highly in the AI's attempts at answering questions.

It doesn’t know when it’s wrong

Take a look at the exchange below:

Can you spot the problem? Neither can ChatGPT!

This is a function of how the AI works. It doesn't actually have a human sense of understanding. Instead, it's writing a sentence and then working out what, statistically speaking, would most likely be said next. That means it can't answer these types of questions, because while the words are extremely formulaic, the numbers themselves are highly specific to each question. As a result, the AI happily accepts that it's wrong and moves on to the next most likely answer, which is still wrong.
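
To make this concrete, here's a toy sketch (my own illustration, not how ChatGPT is actually built) of what "most likely next word" means, using simple word-pair counts over a made-up corpus:

```python
from collections import Counter, defaultdict

# A tiny corpus standing in for "text found on the internet".
corpus = (
    "the capital of france is paris . "
    "the capital of france is paris . "
    "the capital of france is lyon ."
).split()

# Count which word follows which word (a bigram model).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def most_likely_next(word):
    """Return the word that most commonly follows `word` in the corpus."""
    return following[word].most_common(1)[0][0]

# "paris" wins simply because it was said more often -- nothing here
# checks whether the continuation is actually true.
print(most_likely_next("is"))
```

Real language models are vastly more sophisticated, but the core point stands: the answer is the most common continuation, not a verified fact.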

Using this in lessons

My lesson idea was to try to harness this to sharpen students' ability to spot incorrect answers. I took a high-risk approach and suggested that they start cheating. There have been widespread reports of schools blocking ChatGPT to prevent cheating, and of accounts on TikTok promoting it as a way of getting easy answers to homework questions. I decided to tell my students they would be using ChatGPT's answers on a worksheet, but with one caveat – they must check the answers were correct!

I used this Google Slides presentation, alongside this worksheet.

The core task was to ask ChatGPT about a type of energy resource, and then rate the quality of its answer. Here’s my first attempt:

I asked ChatGPT:

Please explain simply how a coal power station works

The ChatGPT answer is below. The following codes have been used:

  • Wrong information
  • Correct information
  • Not completely wrong, but not fully explained or correct to IGCSE standard.

ChatGPT answer
A coal-fired power station is a type of power station that generates electricity by burning coal to produce steam, which is then used to generate electricity. Here's a simple explanation of how a coal power station works:
  1. Coal is delivered to the power station and is stored in a coal yard.
  2. The coal is then transported to a coal crusher, where it is crushed into a fine powder.
  3. The coal powder is fed into a boiler, where it is burned to heat water and produce steam.
  4. The steam is sent through a network of pipes to a steam turbine.
  5. The steam turbine is connected to a generator, which is a machine that converts mechanical energy into electrical energy. As the steam passes through the turbine, it causes the turbine blades to spin, which in turn generates electricity.
  6. The electricity is then sent to a transformer, where it is converted to a high voltage and sent through power lines to homes and businesses.
I hope this helps! Let me know if you have any questions.
5 isn’t too clear. It should say that the steam turns a turbine, and then the turbine turns the generator.
6 and 2 aren’t needed for IGCSE.

As we can see, it wasn’t too bad an answer. However, things got a lot worse when I asked for the advantages and disadvantages, from an IGCSE Physics perspective:

The energy change involved
I asked ChatGPT:

Thanks! What are the energy transfers at each stage?

The ChatGPT answer is below. The following codes have been used:

  • Wrong information
  • Correct information
  • Not completely wrong, but not fully explained or correct to IGCSE standard.

ChatGPT answer
In a coal-fired power station, energy is transferred from the coal to the steam to the mechanical energy of the spinning turbine blades, and finally to the electrical energy of the generator.
The first sentence mixes up stores of energy with types of energy; it starts by talking about the energy in the coal, but then says 'mechanical energy', which isn't an IGCSE energy type.
In the coal yard, the chemical energy stored in the coal is transferred to the surroundings as heat energy when the coal is burned.
This energy change takes place in the boiler; in the coal yard, the coal isn’t burned so the energy is still chemical energy in the coal.
In the coal crusher, the mechanical energy of the crusher's moving parts is used to crush the coal, transferring some of the energy to the coal in the form of kinetic energy.
This would only be true if the coal continued to move after being crushed. 
In the boiler, the chemical energy in the coal is transferred to the water, heating it and turning it into steam. This energy transfer is due to the combustion of the coal, which releases heat energy.
The energy change is chemical into heat energy. This answer again mixes up energy types with the physical locations where the energy is present.
In the steam turbine, the energy of the steam is transferred to the turbine blades, causing them to spin and creating mechanical energy.
Firstly, energy can never be created, so 'creating mechanical energy' is completely wrong. Secondly, at IGCSE we say kinetic energy. The energy change is heat into kinetic as the hot steam moves and spins the turbines.
In the generator, the mechanical energy of the spinning turbine blades is transferred to the generator's rotating coil, producing a fluctuating magnetic field. This magnetic field then produces an electric current in the generator's stationary coil, transferring the energy to the electrical energy of the current.
It should still read kinetic energy, but that kinetic energy is transferred into electrical energy. We don't need to know how it happens until Y11.
In the transformer, the electrical energy of the current is transferred to the transformer's coil, causing a change in the voltage of the electricity. This energy transfer is due to the electromagnetic induction occurring in the transformer.
This isn’t required until Y11.

You can read more on the link above, but essentially most of what the AI said wouldn’t be correct in an exam. What this shows is that while ChatGPT could be a good starting point, there’s a lot of work that students would need to do in order to get their answers into an acceptable form for their exams.

The students' reactions

It was fascinating to watch students as they went through the exercise. At first, they were amazed by it, asking all sorts of questions, from subject-specific ones like 'describe the causes of the Second World War' to the esoteric like 'should I eat avocado?'. However, as they started the work, they quickly became disillusioned. Some of the comments I heard were:

  • What’s the point? Why don’t we just use the textbook?
  • Urgh, it takes so long to check everything, why can’t they just use the core knowledge document?
  • I keep thinking it's fine and then somebody points out another mistake.

Now, I'm lucky that my class is extremely high-attaining and has strong subject knowledge. The average grade is a strong A*, with no students working below a B. I'm fairly confident that most students could get themselves to a grade B using the tool, but to gain anything higher they would have to spend more time checking each answer against their textbook than if they just went to the textbook in the first place.

In summary, ChatGPT is an interesting tool and provides a nice set of starting answers, but it's not going to revolutionise education. Because it doesn't understand what it's saying, none of its answers will be nuanced or capture the subtle differences in language that make good feedback important. I can see a role for it for teachers with absurdly high class sizes, where individual feedback isn't possible, but I don't think it's going to replace 1:1 conversations any time soon. Additionally, it can get a student currently working at around a grade E to a B, but it can't really help the understanding of any students at a B or above, since its answers just aren't specific enough.


An apology to Capita: why we slurped directly from the database

Health warning: this post is pretty technical!

One of the key goals for our Data Dashboard was to have up-to-date information. Initially, we were dealing with just GL standardised data (CAT4 and PASS). However, as time went by we started to want to use our internal reporting data, logged concerns, and merits / behaviour data.

When importing SIMS data into PowerBI, most schools seem to create .csv files from SIMS reports and add them to the PowerBI model. However, that approach had the big drawback of being a manual process. We could have automated it using Command Reporter, a tool for SIMS which can run reports as a Windows scheduled task, but that still wouldn't have worked for us, as SIMS reports don't let you export marksheet data. Since a key requirement from our staff was to see report data as it's entered, we needed another approach.

Direct database access

SIMS uses a Microsoft SQL Server database to store all its data. To put it bluntly… it's ugly as sin. I'm no stranger to ugly data structures; our own database structure isn't quite this bad, but then it's also not over 20 years old! So there's no shame on Capita for having such a complicated table structure, but you should be aware of what you're getting yourself into if you go down the direct database access route.

In the end, we decided that getting immediate data was worth the pain of having to access the SIMS Microsoft SQL Server directly. We were fortunate, however, that there were a few things in our favour:

  1. We have a full-time on-site developer, who is knowledgeable about SQL and database design principles. She wrote a series of custom SQL views for us, so that we never had write access to the database and could even use caching to reduce performance problems. This was a mammoth task for her, but once done it meant we could run nightly snapshots of SIMS to update our dashboard.
  2. We have a strong IT support team who can monitor server health and let us know if our queries start to degrade the performance of SIMS for the rest of our staff.
  3. We have self-hosted SIMS which means we get access to the servers in the first place.
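
To illustrate the read-only view idea, here's a minimal sketch using SQLite and made-up table names – the real SIMS database is Microsoft SQL Server with a vastly more complex schema, so treat this purely as the shape of the approach:

```python
import sqlite3

# Illustration only: `raw_students` and `v_students` are hypothetical names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_students (id INTEGER, surname TEXT, yr INTEGER)")
conn.executemany(
    "INSERT INTO raw_students VALUES (?, ?, ?)",
    [(1, "Smith", 10), (2, "Jones", 11)],
)

# The view exposes only what the dashboard needs; the reporting account is
# granted SELECT on views like this and never touches the tables directly.
conn.execute(
    "CREATE VIEW v_students AS "
    "SELECT id, surname, yr FROM raw_students WHERE yr >= 10"
)

# The nightly snapshot is then a plain read of the view.
snapshot = conn.execute("SELECT surname FROM v_students ORDER BY id").fetchall()
print(snapshot)
```

Because the dashboard only ever reads from views, the underlying tables stay safe, and the views can be tuned or cached without touching the reports that consume them.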

If you’re not in the same position as us, don’t worry; there are still ways you can have nearly-live data. The easiest way is to use Command Reporter for most student data, and then ask your data manager to make a marksheet with as much of the report data as possible on it.

To make the most of this, you can create a shared drive and have the .csv files regularly update. PowerBI is able to pull these updated files through on a refresh schedule, keeping you up to date.
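
As a sketch of what that refresh is doing, here's a minimal Python illustration (filenames and columns are made up) of combining several exported .csv files into one set of rows – the same job PowerBI performs when it pulls the shared folder:

```python
import csv
import io

# Two in-memory files stand in for .csv exports on the shared drive.
exports = {
    "behaviour.csv": "student,points\nSmith,2\nJones,1\n",
    "merits.csv": "student,points\nSmith,5\n",
}

rows = []
for filename, contents in exports.items():
    for row in csv.DictReader(io.StringIO(contents)):
        row["source"] = filename  # remember which export each row came from
        rows.append(row)

# Three combined rows, ready to be picked up on the next scheduled refresh.
print(len(rows))
```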


Visualising SIMS data: Treemaps, Hierarchies and Behaviour Data

This is the second post in my series about how we created our Data Dashboard to bring together all the information we have on our students. In the first post I discussed how we visualised simple report data to get an overview of how teachers were reporting attitude and organisation to parents. Today, I’ll be talking about how we report behaviour, taking us from this:

to this:

About behaviour

Capita's SIMS allows for some pretty powerful behaviour reporting. At our school, we use both SIMS and EduLink One to enter data about students.

We are fortunate that the general standard of behaviour at our school is extremely high, so we rarely have instances of low-level disruption or misbehaviour in our lessons. As a result, we don't make much use of the behaviour management systems in SIMS for managing misbehaviour, since such instances are so rare that they can be dealt with immediately. Instead, our behaviour reporting system is mostly concerned with the Social and Emotional aspects of learning, and with academic interventions.

Behaviour in

Behaviour incidents can be created in SIMS or EduLink quite easily:

Once created, an email and SIMS desktop alert are sent to linked teachers, including the form tutor and Head of Year:
This is decent for monitoring, but doesn't allow us to do any deeper analytics of trends in incidents.

Our existing analytics

Our existing approach was to use SIMS Reports to produce a weekly analysis of all behaviour incidents for the year to date, looking something like this:

These analytics were used by Heads of Year and Heads of Subject to look for trends in their respective areas. These reports were much quicker than looking through every email and student profile on SIMS, but still took quite a lot of manual checking and cross-referencing. Finding the comments for each report was also difficult, as one would have to cross-reference the subject and date of each incident to find out what actually happened.

Using PowerBI

We extracted our behaviour logs directly into PowerBI, giving us this table:

Now, the key things that middle leaders wanted to know each week were:

  • Which year groups, form groups and students had the most behaviour incidents?
  • Which subjects had the most behaviour incidents?
  • Are behaviour incidents increasing or decreasing?

For showing proportions, heatmaps are often the most intuitive form of display. To allow our middle leaders to see how each group was performing, we created a hierarchy that went Year –> Tutor group –> Student. This lets users click on a year group to expand it and see the tutor groups within it.
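
Underneath, the drill-down is really just counting incidents at each level of that hierarchy. A minimal sketch with hypothetical data:

```python
from collections import Counter

# Hypothetical incident log: (year, tutor group, student).
incidents = [
    (7, "7A", "Smith"),
    (7, "7A", "Smith"),
    (7, "7B", "Jones"),
    (8, "8A", "Patel"),
]

# Totals at each level of the Year -> Tutor group -> Student hierarchy,
# mirroring what expanding the visual shows.
by_year = Counter(year for year, _, _ in incidents)
by_tutor = Counter((year, group) for year, group, _ in incidents)
by_student = Counter(incidents)

print(by_year[7])           # all Year 7 incidents
print(by_tutor[(7, "7A")])  # tutor group 7A's share of them
```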

We also wanted to be able to see the incidents immediately, so we added a table with all the comments for each report. We also included slicers to allow us to filter by whether incidents were resolved or unresolved, or referred to different middle leaders:

We now have a much more intuitive, user-driven report. Clicking on any part of the heatmap will filter the table of incidents. This puts much more emphasis on the comments written by staff, and in turn helps staff to see the point of giving more detail in their incident reports. It also allows for easy follow-up, since the name of the reporting staff is always visible.

This view also bridges the gap between heads of year and heads of faculty. Before, they were using different reports in different exported spreadsheets. Now, they’re all literally on the same page, so can reference issues between themselves much more simply.


Visualising SIMS data: How are reports changing over time?

This post is the first in a series of entries I plan to make showing how we designed a bespoke data dashboard for our school. I’ll post links here to show how each aspect of our dashboard was created, leaving us with something that looks like this:

The Problem

At our school, we have a fairly simple way of reporting attitudinal and organisational behaviour to parents:

Number | Attitudinal behaviour | Organisational behaviour
1 | Below expectations | Below expectations
2 | Meeting expectations | Meeting expectations
3 | Exceeding expectations | Exceeding expectations
AB and OB descriptors

Now, I know that from a data analysis point of view, this will initially look horrifying. Clearly a 1–4 scale is not suitable for any kind of analysis. However, I've actually come to quite like this system, as it is extremely simple for parents to understand. At a data analysis level, we have plenty of other indicators to tell us when a child is slipping in terms of their organisation or attitude, mainly through our Behaviour Management system. So really, if we see a '1' in a report, we can use that as a warning sign that further investigation and intervention is needed.

These scores are reported on a termly basis. By the end of the year, a student's simplified report card might look something like this:

Subject | Term | Attitudinal Behaviour | Organisational Behaviour
Maths | 1.1 | Meeting expectations | Exceeding expectations
Maths | 1.2 | Meeting expectations | Below expectations
Maths | 2.1 | Below expectations | Below expectations
Maths | 2.2 | Below expectations | Exceeding expectations
Maths | 3 | Meeting expectations | Meeting expectations
English | 1.1 | Meeting expectations | Below expectations
English | 1.2 | Below expectations | Below expectations
English | 2.1 | Meeting expectations | Meeting expectations
English | 2.2 | Meeting expectations | Meeting expectations
English | 3 | Exceeding expectations | Exceeding expectations
End of year report for a student

This system is okay when looking at a single student, but becomes quite challenging for tutors and middle leaders trying to analyse cohort data. Typically, our SIMS manager will create a marksheet that has all the data from each subject, like this:

Now this is okay, but it lacks finesse. Firstly, we need to open separate marksheets for each half-term's report data. We also need to manually look around for each score. While we could prettify it a bit with conditional formatting, this really isn't optimal.

The PowerBI solution

The key requirements for a good visualisation of this data were:

  • Show at a glance whether something was a 1, 2, 3 or 4
  • Be able to show change over time
  • Summarise the different data for each subject.

In the end, I went for a 100% stacked bar graph:

To create these, I added the following to the 100% stacked bar graph shelves:

You might notice the tooltip points to a report page. This tooltip is a simple table showing the subjects and count of ABs, so we get dynamic tooltips showing which subjects gave each grade. We can also click a bar to get a list of which students were assigned that grade:

With this, we now have a much richer, filterable view of our data. A teacher can now filter our key indicators instantly, to see how aspects are changing by term, and link these back to students.
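
Behind the scenes, each segment of a 100% stacked bar is just that grade's share of the total reports for the term. A quick sketch with made-up scores:

```python
from collections import Counter

# Hypothetical attitudinal behaviour scores reported in a single term.
term_scores = ["Meeting", "Meeting", "Meeting", "Exceeding", "Below"]

counts = Counter(term_scores)
total = len(term_scores)

# Each segment of the bar is that grade's percentage of all reports,
# so the segments always sum to 100%.
shares = {grade: 100 * n / total for grade, n in counts.items()}
print(shares)
```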

As I said at the start of this post, a 1-4 rating system is really only useful as a traffic light. Staff can use these to flag students for investigation, before using the other analytics tools the dashboard provides to start to explain why those scores were given.

In the next post, I'll look at our behaviour reporting, and how we used hierarchies and treemaps to gain a more intuitive understanding of behaviour incidents in our school.

