You can define a custom JavaScript function that continuously compares annotations to a schema or set of rules and informs annotators, in real time, of any mistakes in their annotations. You can also prevent annotators from submitting tasks when your validation script finds errors.
Most annotation projects have a schema/rule-set/taxonomy that annotators must follow, and a large portion of annotation errors stem from oversights or slips in adhering to that schema during labeling.
For example, suppose an annotator must segment a tumor and, if one is found, fill out a few related attributes. A common error is forgetting to fill out all of the attributes when a tumor is present, and such errors are usually only revealed by post-processing scripts.
Prevent simple, recurrent errors by writing a set of tests that run regularly and inform annotators of any mistakes they're making.
Overview
By default, all projects have a custom check to warn annotators when they submit tasks without any annotations. You can enable/disable custom validation under Project Settings -> Label Validation.
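In the validation-function format described below, this default check is conceptually similar to the following minimal sketch (the exact built-in implementation is not exposed):

function (task: Task, labels: Label[]): string[] {
  // Warn when a task is submitted without any annotations
  if (labels.length === 0) {
    return ["You have not created any labels!"];
  }
  return [];
}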
Prevent Submissions with Errors
By default, annotators receive the error messages only as a warning and can still submit the task. To prevent annotators from submitting while any errors are present, toggle the Prevent submission with errors switch.
Custom JavaScript Function
You will write the custom validation as a JavaScript function, which runs in each annotator's browser while they annotate data.
The JavaScript function has the following definition:
function (task: Task, labels: Label[]): string[] {
  // Your custom validation logic
  assert(false, "This assertion was false");
}
labels: Label[]
The validation function's primary input is labels, a list of objects containing minimal metadata about each label. Please see the definition of the Label object below.
You can generate a sample Label[] object by going to the labeling tool -> opening the command bar (cmd/ctrl + k) -> Copy current label state to clipboard.
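The copied label state is a JSON array of label objects. As an illustration only, an abbreviated sample showing the two fields used by the examples in this guide (category and seriesIndex) might look like this; the real objects contain additional metadata fields:

[
  {
    "category": ["necrosis"],
    "seriesIndex": 0
  },
  {
    "category": ["edema"],
    "seriesIndex": 1
  }
]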
Returns string[]
Your custom validation script must return a list of warning/error messages describing the issues found. Return an empty array [] if no errors are found. These error message strings will be displayed to the annotators on the labeling tool.
To help you write a validation function with several checks, RedBrick AI provides a custom assert function that accepts a boolean expression and a corresponding error message; when the expression is false, the message is added to the returned list of errors. The two scripts below produce the same result:
function (task: Task, labels: Label[]): string[] {
  assert(labels.length >= 1, "You have not created any labels!");
  assert(labels.length <= 5, "You have created too many labels!");
}
function (task: Task, labels: Label[]): string[] {
  const errors = [];
  if (labels.length < 1) {
    errors.push("You have not created any labels!");
  }
  if (labels.length > 5) {
    errors.push("You have created too many labels!");
  }
  return errors;
}
Validate Your Code
Before saving your script, validate that your code executes as expected. Click the Validate button at the bottom right of the Settings page and paste the JSON object copied from the labeling tool.
Displaying the Validation on the Labeling Tool
Your custom validation script will be run regularly. If any warnings are found, an indicator will appear on the right side of the bottom bar. If you have enabled Prevent submission with errors, the indicator will be red.
Example Scripts
Check if Exact Categories are Present
For this example, let's say we expect each task to contain the following segmentations: necrosis, enhancing tumor, non-enhancing tumor, and edema.
function (task: Task, labels: Label[]): string[] {
  const expectedCategories = [
    'necrosis',
    'enhancing tumor',
    'non-enhancing tumor',
    'edema',
  ];

  // Iterate over all expected categories
  for (const category of expectedCategories) {
    // Check if the category is present in any label
    const categoryPresent = labels.some(
      (label) => label.category[0] === category
    );

    // Assert with a descriptive message
    assert(categoryPresent, `The ${category} category is missing!`);
  }
}
Validate Only Single Instance of a Category has been Created
This script validates that only a single instance of a particular category has been created. If you're expecting semantic segmentation labels, this check can ensure annotators don't accidentally create multiple instance segmentations.
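A minimal sketch of such a check, assuming label.category[0] holds the category name as in the other examples, might look like this:

function (task: Task, labels: Label[]): string[] {
  // Count how many labels exist for each category
  const counts = {};
  for (const label of labels) {
    const category = label.category[0];
    counts[category] = (counts[category] || 0) + 1;
  }

  // Flag any category that has more than one instance
  for (const category of Object.keys(counts)) {
    assert(
      counts[category] <= 1,
      `Category '${category}' has ${counts[category]} instances - only one is allowed!`
    );
  }
}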
Verify that Specific Segmentation Type is Visible on Specific Series
The following script examines the Series name and verifies that a specific segmentation type only appears on a matching Series. In this example, the script ensures that labelers cannot place an "Infarct" segmentation on a Series whose name does not begin with "DWI_" (a common naming convention for DWI images).
More broadly speaking, this script is an example of the extensive functionality available when combining the label, task, and series objects.
function (task: Task, labels: Label[]): string[] {
  assert(
    labels.length > 0,
    "You haven't created any labels! Are you sure you want to submit?"
  );
  for (let label of labels) {
    if (label.category[0] === "Infarct") {
      assert(
        task.series[label.seriesIndex].name.startsWith("DWI_"),
        "Segmentation 'Infarct' is allowed only on 'DWI' images"
      );
    }
  }
}
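Conversely, if you want to require that every Series following the "DWI_" naming convention contains an "Infarct" segmentation, a complementary sketch using the same task and label fields might look like this:

function (task: Task, labels: Label[]): string[] {
  for (let i = 0; i < task.series.length; i++) {
    // Only check series that follow the DWI naming convention
    if (!task.series[i].name.startsWith("DWI_")) continue;

    // Look for an Infarct label on this series
    const hasInfarct = labels.some(
      (label) => label.seriesIndex === i && label.category[0] === "Infarct"
    );

    assert(
      hasInfarct,
      `Series '${task.series[i].name}' is missing an 'Infarct' segmentation!`
    );
  }
}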