How to use the iSee platform

Quick start videos:

About entering a new use case in the cockpit, what an intent and a persona are, and the strategy recommendation; an example on radiology diagnosis: Radiology use case demo video
Another example, on loan approval: iSee Demo – YouTube

About selecting the questions for the evaluation and using the dialog manager; an example on radiology diagnosis: Radiology use case demo video

iSee tutorial, covering how to: use the Explainers library to add a new explainer; translate your explanation needs into a use case; understand the recommended explanation strategy and change it to your own strategy, or plug in an alternative one recommended by the iSee engine; evaluate the result of the explanation strategy on data instances (sampled or user-chosen); and check the analytics board to see which explanation method is the most popular, the most trusted, the least useful, and so on. This tutorial has been recorded (3 hours); the supporting material can be shared on demand via our contact form.

iSee Cockpit URL (you will need a user account):

User guide of the Cockpit

Questionnaire: We would like to hear about your experience with the iSee tools. Please share your feedback by clicking and filling in this online questionnaire.

Contributing to iSee:

Use case onboarding guide (October 2023 version): iSee_use case guidevshort_oct23

External contributors (whether adding an explainer, designing a use case, or sharing evaluation outcomes) can use the iSee tools in two ways: use the cockpit and dialog manager hosted on our platform (all data are kept confidential and encrypted), or consume the API for external integration. The modus operandi of this integration API is outlined in Use case API integration D5_5 and in D2.1a API for external contributors.

How to contribute to the Explainers catalogue :

The Explainers library has an API that allows researchers to upload (add) and reuse (read) explanation methods. The library is at
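As a rough illustration of what an upload (add) call to such an API involves, the sketch below assembles the metadata an explainer entry might carry. The base URL, endpoint path, and payload fields here are purely hypothetical assumptions for illustration, not the real iSee API; consult the library documentation for the actual interface.

```python
import json

# Hypothetical base URL -- NOT the real Explainers library endpoint.
API_BASE = "https://explainer-library.example.org"

def build_upload_payload(name, data_type, description):
    """Assemble the metadata an upload (add) request might carry.
    All field names are illustrative assumptions."""
    return {
        "name": name,
        "data_type": data_type,        # e.g. "tabular", "image", "text"
        "description": description,
    }

payload = build_upload_payload(
    "MyLimeTabular", "tabular",
    "Local surrogate explanations for tabular classifiers")
# An upload would then POST this payload, e.g. with the requests library:
#   requests.post(f"{API_BASE}/explainers", json=payload)
print(json.dumps(payload, indent=2))
```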

Do you need an online “Explainability using iSee” course?

Please contact us using the contact form at Contact – iSee.

How to design explanation strategies and user-evaluation processes

In iSee we use the Behaviour Tree concept. Have a look at our Behaviour Tree Editor, personalised for explanation experience design.
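For intuition, a behaviour tree composes an explanation experience out of nodes that are "ticked" in order. The minimal plain-Python sketch below is an assumption-laden toy (the node names and `tick()` protocol are invented; the real editor works graphically on iSee's own node types) showing a sequence node running two explanation steps:

```python
# Minimal behaviour-tree sketch in plain Python. Node names and the
# tick() protocol are illustrative assumptions, not iSee's editor model.
class Action:
    """Leaf node: runs one explanation step and reports success."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self, ctx):
        return self.fn(ctx)

class Sequence:
    """Composite node: ticks children in order, fails on first failure."""
    def __init__(self, *children):
        self.children = children

    def tick(self, ctx):
        return all(child.tick(ctx) for child in self.children)

# A toy strategy: show a feature-importance explanation, then ask a
# follow-up evaluation question. Each action records itself in ctx.
strategy = Sequence(
    Action("show_feature_importance",
           lambda ctx: ctx.setdefault("steps", []).append("importance") is None),
    Action("ask_evaluation_question",
           lambda ctx: ctx.setdefault("steps", []).append("question") is None),
)

ctx = {}
print(strategy.tick(ctx), ctx["steps"])
# → True ['importance', 'question']
```

Branching behaviours (e.g. choosing a different explainer when the user is unsatisfied) are added the same way, with selector/fallback composites alongside sequences.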

FAQ (click here for the updated FAQ)

How to add a new explainer and validate that it has the right format?

The only way to add an explainer to the iSeeExplainerLibrary is to implement the skeleton py class as described in the README on GitHub, section “Adding New Explainers to the Catalogue”. Point 3 shows the skeleton.

Within that class you can call your PyPI library or another GitHub repository of your own work. If you need help, you can work with the iSee team: just implement your methods in an ipynb notebook or something similar, and we can help complete the full cycle.
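To fix ideas, here is a hedged sketch of the kind of wrapper class involved. The class name, constructor signature, and return format below are illustrative assumptions, not the actual skeleton; follow the iSeeExplainerLibrary README (section “Adding New Explainers to the Catalogue”, point 3) for the real interface.

```python
# Hypothetical sketch of an explainer wrapper class. Names and return
# format are assumptions; see the iSeeExplainerLibrary README for the
# actual skeleton you must implement.
class MyTabularExplainer:
    def __init__(self, model_path, params=None):
        # Load your model and explainer-specific configuration here.
        # Inside this class you are free to call your own PyPI package
        # or code from another GitHub repository.
        self.model_path = model_path
        self.params = params or {}

    def explain(self, instance):
        """Return an explanation for a single data instance."""
        # Stub: replace with a call to your explanation method, e.g. a
        # LIME/SHAP wrapper from your own library.
        return {
            "type": "featureImportance",
            "attributions": {feature: 0.0 for feature in instance},
        }

explainer = MyTabularExplainer("model.pkl")
explanation = explainer.explain({"age": 42, "income": 30000})
print(explanation["type"])
# → featureImportance
```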

Then, please create a pull request to the iSeeExplainerLibrary with your explainer. This usually involves editing two existing py files in the repository and adding one new py file.

To demonstrate reusability, you will need to test your explainer on multiple datasets of the applicable data type. We already do this for the existing explainers in iSee. For example, you can check that your explainer works on the use cases provided by the iSee team in the iSee platform. On our side, we will propose a standardised (though sufficiently complex) use case for each dataset type: image (BW/RGB), time series, tabular, and natural language text (NLP).
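A simple way to run such a reusability check yourself is to loop the same explainer over several datasets of the target data type and assert that each call returns a well-formed explanation. The datasets and the `explain()` stand-in below are toy assumptions for illustration:

```python
# Reusability smoke test: the same explainer must handle several
# datasets of the applicable data type (here: tabular). Datasets and
# the explain() stand-in are toy assumptions, not iSee use cases.
datasets = {
    "loan_approval": [{"income": 30000, "age": 42}],
    "radiology_tabular": [{"lesion_size": 1.2, "age": 61}],
}

def explain(instance):
    # Stand-in for your explainer's explain() call.
    return {"attributions": {feature: 0.0 for feature in instance}}

for name, samples in datasets.items():
    for instance in samples:
        result = explain(instance)
        # A well-formed result attributes a score to every input feature.
        assert set(result["attributions"]) == set(instance), name

print("explainer handled all tabular datasets")
# → explainer handled all tabular datasets
```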

What is an intent?

How to link a user evaluation question to an intent?

Why is the data type needed?

What is the format of an AI data sample that allows iSee to generate an evaluation seed case?

How to select a preferred explanation strategy?

How to build an explanation strategy?

How to evaluate my explainer in a (simple) explanation strategy?