Custom Label Validation
You can define a custom JavaScript script that continuously compares annotations to a schema or set of rules and informs annotators in real time of any mistakes in their annotations. You can also prevent annotators from submitting tasks when your validation script finds errors.
Most annotation projects have a schema, rule set, or taxonomy that annotators must follow, and a large portion of annotation errors comes from oversights or slips in adhering to it during labeling. For example, suppose annotators must segment a tumor and fill out a few related attributes whenever a tumor is found. A common error is forgetting to fill out all of the attributes when the tumor is present, and such errors are usually only revealed by post-processing scripts.
Prevent simple, recurrent errors from occurring by writing a set of tests that will be run regularly, informing annotators of any mistakes they're making.
Overview of custom label validation

Overview

By default, all projects have a custom check to warn annotators when they submit tasks without any annotations. You can enable/disable custom validation under Project Settings -> Label Validation.
Custom label validation in settings

Prevent submissions with errors

By default, annotators receive the error messages only as warnings and can still submit the task anyway. To prevent annotators from submitting while any errors are present, toggle the Prevent submission with errors switch.
Submission allowed
Submission with errors prevented

Custom Javascript function

You will write the custom validation as a JavaScript function. This function runs in each annotator's browser while they are annotating data. The function has the following definition:
function(labels: Label[]): string[] {
  // Your custom validation logic
}

labels: Label[]

The validation function has a single input: a list of labels containing minimal metadata about each label. Please see the definition of the Label object below:
interface Label {
  category: string[];
  attributes: LabelAttribute[];
  labelType: TaskType;
  numFramesLabeled?: number;
}

// Label attribute
interface LabelAttribute {
  name: string;
  value: boolean | number | string;
}

// Task Type
enum TaskType {
  ITEMS = 'ITEMS',
  CLASSIFY = 'CLASSIFY',
  BBOX = 'BBOX',
  POLYGON = 'POLYGON',
  POLYLINE = 'POLYLINE',
  POINT = 'POINT',
  ELLIPSE = 'ELLIPSE',
  SEGMENTATION = 'SEGMENTATION',
  MULTI = 'MULTI',
  MULTICLASSIFY = 'MULTICLASSIFY',
  LENGTH = 'LENGTH',
  ANGLE = 'ANGLE',
}
You can generate a sample Label[] object by going to the labeling tool -> opening the command bar (cmd/ctrl + k) -> Copy current label state to clipboard.
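For reference, a copied label state might look like the following. This is an illustrative sketch: the category names, attribute names, and values shown here are placeholders, and the exact shape of your data depends on your project's taxonomy and task type.

```typescript
// Hypothetical sample of the Label[] input your validation function receives.
// All category and attribute names below are made up for illustration.
const sampleLabels = [
  {
    category: ['edema'],
    attributes: [{ name: 'severity', value: 'high' }],
    labelType: 'SEGMENTATION',
  },
  {
    category: ['enhancing tumor'],
    attributes: [],
    labelType: 'SEGMENTATION',
    // Optional field; present only when labeling across frames/slices.
    numFramesLabeled: 12,
  },
];
```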

Returns string[]

Your custom validation script must return a list of warning/error messages describing the issues found. Return an empty array [] if no errors are found. These error message strings will be displayed to the annotators on the labeling tool.
To help you write a validation function with several checks, RedBrick AI has a custom-defined function assert that accepts a boolean statement and a corresponding error message. The two scripts below will produce the same result:
With assert()
Without assert()
function(labels: Label[]): string[] {
  assert(labels.length > 0, "You have not created any labels!")
  assert(labels.length < 5, "You have created too many labels!")
}
function(labels: Label[]): string[] {
  const errors: string[] = []
  if (labels.length === 0) {
    errors.push("You have not created any labels!")
  }
  if (labels.length >= 5) {
    errors.push("You have created too many labels!")
  }
  return errors
}

Validate your code

Before saving your script, validate that your code executes as expected. Click the Validate button at the bottom right of the Settings page and paste the JSON object copied from the labeling tool:

Displaying the validation on the labeling tool

Your custom validation script will run regularly while annotators work. If any warnings are found, an indicator will appear on the right side of the bottom bar. If you have enabled Prevent submission with errors, the indicator will be red.
Submission with errors is allowed
Submission with errors is prevented

Example Scripts

Check if exact categories are present

For this example, let's say we expect each task to contain the following segmentations: necrosis, enhancing tumor, non-enhancing tumor, and edema.
function(labels: Label[]): string[] {
  const expectedCategories = [
    'necrosis',
    'enhancing tumor',
    'non-enhancing tumor',
    'edema',
  ];
  // Iterate over all expected categories
  for (const category of expectedCategories) {
    // Check if the category is present in any label
    const categoryPresent = labels.some(
      (label) => label.category[0] === category
    );
    // assert with message
    assert(categoryPresent, `The ${category} category is missing!`);
  }
}

Validate only single instance of a category has been created

This script validates that only a single instance of a particular category has been created. If you're expecting semantic segmentation labels, this check can ensure annotators don't accidentally create multiple instance segmentations.
function(labels: Label[]): string[] {
  const semanticCategory = 'edema';
  const labelsFiltered = labels.filter(
    (label) => label.category[0] === semanticCategory
  );
  assert(
    labelsFiltered.length === 1,
    `Expected exactly 1 ${semanticCategory} annotation`
  );
}
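Check that conditional attributes are filled out

The overview mentioned a common slip: segmenting a tumor but forgetting to fill out its related attributes. A script along the following lines can catch that; the category name 'tumor' and the attribute names in requiredAttributes are placeholders that you would replace with the names from your own taxonomy:

```
function(labels: Label[]): string[] {
  // Placeholder attribute names - substitute your taxonomy's attributes
  const requiredAttributes = ['grade', 'location'];
  // Find all tumor segmentations in this task
  const tumorLabels = labels.filter((label) => label.category[0] === 'tumor');
  for (const label of tumorLabels) {
    for (const attributeName of requiredAttributes) {
      // The attribute must exist and must not be left empty
      const attribute = label.attributes.find((a) => a.name === attributeName);
      assert(
        attribute !== undefined && attribute.value !== '',
        `A tumor label is missing the ${attributeName} attribute!`
      );
    }
  }
}
```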