The iSee tools are maturing, so why not use them? Be ready to apply them and get your AI model fully explained and certified as compliant with regulations!
In this tutorial, the iSee project team will share the project's latest developments with participants, in particular:
- How explanation experiences are represented in an ontology created for the project
- How explanation experiences are captured in a case-based database
- How use cases can be designed and represented in the system
- How explanations are progressively and interactively provided to end-users via a chatbot, and how they can be evaluated
- How end-user evaluations are measured and used to augment the explanation strategies case base
The tutorial will culminate with attendees adding a ‘mock’ case to the system and receiving a recommended explanation strategy suited to that case. Pending interest from the academic community, the tutorial may be expanded with an interactive component demonstrating how XAI algorithms are uploaded to iSee’s explainer API for broader industry impact.
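To give a flavour of the case-based step described above, the sketch below shows a minimal nearest-neighbour retrieval over a mock case base. The case attributes, strategy names, and similarity measure are purely illustrative assumptions for this sketch, not iSee's actual schema, ontology, or API.

```python
# Hypothetical sketch: recommend an explanation strategy for a new
# ('mock') case by retrieving the most similar stored case.
# Attribute names and strategies are invented for illustration.

def similarity(query, case):
    """Fraction of query attributes the stored case matches exactly."""
    matches = sum(1 for k, v in query.items() if case["features"].get(k) == v)
    return matches / len(query)

def recommend(query, case_base):
    """Return the explanation strategy of the most similar stored case."""
    best = max(case_base, key=lambda c: similarity(query, c))
    return best["strategy"]

case_base = [
    {"features": {"task": "image classification", "user": "clinician"},
     "strategy": "saliency map with textual summary"},
    {"features": {"task": "loan scoring", "user": "applicant"},
     "strategy": "counterfactual explanation"},
]

mock_case = {"task": "loan scoring", "user": "applicant"}
print(recommend(mock_case, case_base))  # counterfactual explanation
```

In the real system, the chatbot's end-user evaluations would feed back into the case base, refining which strategy is retrieved for future cases.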