The Real Fast Reports Blog

How Do We Reduce Risk?

At Real Fast Reports, we help teachers write better reports in less time using AI.

Teachers write reports in bullet point format. We use AI to turn the bullet points into prose. The bullet points go into a searchable "comment bank," which significantly speeds up report writing.

Here's an example:

Teacher input:
Science
Nathalie, Female:

  • works hard
  • participates well
  • did really well in the test on forces
  • needs to improve her graph drawing

Real Fast Reports output:
Nathalie works hard and participates well in Science. She did really well in the test on forces, demonstrating a good understanding of the subject matter. However, Nathalie needs to improve her graph drawing skills.

What are the risks?

As with any service that uses AI, we face risks relating to:

  • Accuracy
  • Bias
  • Data protection

We take these risks very seriously, and we believe we bear significant responsibility to mitigate them (rather than simply passing on these risks to teachers and schools).

How big are the risks?

Fortunately, the way we use AI somewhat limits the extent to which inaccuracy and bias can pose significant risks.

The factual content of our reports comes from teachers (in bullet point format). We use AI to turn the bullet points into prose. Therefore, the main function of the AI is to link comments together to form a fluent and cohesive report. There is relatively little "opportunity" for the AI to be inaccurate or biased.

However, just because these opportunities are small does not mean that they don't exist, so we still take steps to reduce these risks.

Regarding data protection, the risks are more significant. While the majority of reports do not contain highly sensitive personal information, this data must nonetheless be protected, stored, and processed with due care.

Here are some of the steps we take:

De-identification

A significant step we take is to de-identify student data before we send it to OpenAI (our AI provider). This helps reduce possible bias and increases the security of student data.

Take the example above of a science report about Nathalie. Before sending this data to OpenAI, we change the name of the student. Otherwise, it is possible that the AI could demonstrate bias (e.g., it may write differently if the student is called Ngozi instead of Nathalie).

We also sometimes change the gender of the student. This reduces the possibility of gender bias. So for this Science report about Nathalie, we may send the following data to OpenAI:

Science
Mark, male:

  • works hard
  • participates well
  • did really well in the test on forces
  • needs to improve his graph drawing

OpenAI then writes a Science report about Mark. When we receive the report, we transform it back to being about Nathalie again. This pseudonymization process significantly reduces the possibility of racial bias based on a student's name, and of gender bias.

Because we do not send the name of the school or the teacher to OpenAI, the possibility of re-identifying the original student from the data we send is remote. In this example, it would be impossible for anyone at OpenAI to identify the real "Nathalie" based on the comments received about the fictitious male student called Mark.
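
To make this more concrete, here is a minimal sketch of what such a pseudonymization step might look like in code. It is purely illustrative: the function names and the simple name swap are assumptions made for this post, and a production system also has to handle details (such as pronoun swaps when the gender changes) that are glossed over here.

```python
import re

def swap_name(text: str, old: str, new: str) -> str:
    """Replace a student's name wherever it appears as a whole word."""
    return re.sub(rf"\b{re.escape(old)}\b", new, text)

def pseudonymize(bullet_points: str, real_name: str, fake_name: str) -> str:
    """De-identify teacher input before it is sent to the AI provider.
    (A real implementation would also swap pronouns when the gender changes.)"""
    return swap_name(bullet_points, real_name, fake_name)

def reidentify(report: str, real_name: str, fake_name: str) -> str:
    """Map the AI-written report back to the real student."""
    return swap_name(report, fake_name, real_name)

# Example using the report from this article:
bullets = "Nathalie: works hard; did really well in the test on forces"
sent_to_openai = pseudonymize(bullets, "Nathalie", "Mark")
draft = "Mark works hard and did really well in the test on forces."
final_report = reidentify(draft, "Nathalie", "Mark")
print(final_report)  # "Nathalie works hard and did really well in the test on forces."
```

The key point is that OpenAI only ever sees the pseudonym; the mapping back to the real student never leaves our systems.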

Prompt engineering

Generative AIs are capable of writing creatively. ChatGPT can invent all sorts of weird and wonderful stories and poems. It is also capable of completely fabricating school reports. Here is an example:

Teacher input: Write me an end of term school science report about Jane.

ChatGPT output: Jane exhibited strong scientific understanding this term, excelling in photosynthesis, energy transformation, and biomes. Her inquisitive nature and participation enhanced classroom dynamics. Notably, her dedication to laboratory work was commendable. She is encouraged to pursue advanced studies.

The chance of this report being accurate based on such minimal input is close to zero. So how do we prevent the AI from inventing student information like this?

At Real Fast Reports we have experimented extensively with the instructions and parameters we send to the AI (known as "prompt engineering").

By testing hundreds of inputs and outputs, we ensure that OpenAI returns reports that are faithful to the teacher's input, rather than creative fabrications.
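
To give a flavour of what this looks like, here is a simplified sketch of a faithfulness-focused prompt, written against OpenAI's Python SDK. The instructions, model name, and parameters below are illustrative assumptions for this post, not the actual prompts and settings we use in production.

```python
from openai import OpenAI  # the official openai Python package (v1+)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You write end-of-term school reports in fluent prose. "
    "Use ONLY the facts given in the teacher's bullet points. "
    "Do not invent achievements, test results, or topics that are not mentioned. "
    "If a bullet point is ambiguous, stay general rather than guessing."
)

bullet_points = (
    "Science\n"
    "Mark, male:\n"
    "- works hard\n"
    "- participates well\n"
    "- did really well in the test on forces\n"
    "- needs to improve his graph drawing"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",   # placeholder model name for illustration
    temperature=0.4,       # a lower temperature discourages creative additions
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": bullet_points},
    ],
)

print(response.choices[0].message.content)
```

Constraining the instructions and keeping the temperature low is only a starting point; the real work is the repeated testing of inputs and outputs described above.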

Teacher feedback

One key way of detecting inaccuracy and bias is to collect feedback from teachers. Teachers can flag any report they are unhappy with, for any reason (inaccuracy, bias, or simply bad wording).

So far, no reports have been flagged for bias. We have received a number of accuracy-related flags; for example, when we transitioned from GPT-3 to GPT-3.5, the AI was sometimes over-optimistic about students' future potential. We acted on this feedback by improving our prompts.

As AI improves over the coming years, we hope that inaccuracy and bias will become less of a risk, but we remain vigilant to these dangers because they matter so much in school reports.

Encryption and storage

All student data we store is encrypted using the 256-bit Advanced Encryption Standard (AES-256). Data is transmitted securely between servers and clients using end-to-end encryption.
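
For the technically minded, here is a minimal sketch of AES-256 encryption using Python's widely used cryptography library. It illustrates the standard we refer to, not our actual implementation or key-management setup.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# A 256-bit key. In production, keys are managed by a key-management service
# and never stored alongside the data they protect.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

report = "Nathalie works hard and participates well in Science.".encode()

# AES-GCM requires a unique nonce for every encryption; 12 random bytes is standard.
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, report, None)

# Decryption needs the same key and nonce, and fails if the data was tampered with.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext.decode() == "Nathalie works hard and participates well in Science."
```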

Data is stored in multiple redundant locations in Ireland using AWS data centers with high levels of physical security. We maintain backups of data which allow point-in-time recovery.

Residual risk

It is never possible to completely eliminate risk. The risks of any tech solution like Real Fast Reports must be weighed against the benefits.

In our case, some residual risk must be passed on to the end user. For example, it is essential that teachers proofread all generated reports for accuracy before sending them to parents.

We are committed to saving teachers time and improving the quality of school reports. We are also advocates for the responsible use of AI, and we believe that our systems and processes strike the right balance of benefit and risk for teachers, parents, students, and schools.

How to Write School Reports

Writing school reports can be a daunting and time-consuming task. But great school reports put a smile on parents’ faces, support student learning and growth, and may be treasured for years to come.

What makes a report great?

From a parent’s perspective, the hallmark of a great report is its authenticity: it captures the essence of their child. This personal touch shows that you truly know and value their child…

Continue reading »

Can ChatGPT Write School Reports?

Yes it can, and I'll show you some examples. But before you rush to start using it...

Data protection - be aware

As of March 2023, by default OpenAI (the makers of ChatGPT) can keep the data you type into ChatGPT and, furthermore, can use it to train their models. Here's the source…

Continue reading »

Should Teachers Use AI to Write School Reports?

This is an interesting question with many facets to it.

This article concerns the general use of AI to write school reports. If you are thinking of using ChatGPT to write school reports, READ THIS FIRST.

Here are some advantages of using AI to write school reports…

Continue reading »

Empowering Self-Regulated Learning

One of the notable recommendations by the EEF is to cultivate 'self-regulation' among students. In my practice, I've effectively integrated student reflection into my preparation for report writing and parents' evenings, fostering an environment that encourages students to plan, monitor, and evaluate their own learning progress…

Continue reading »