Explainable AI using CBR Challenge 2023



The recently released AI Index Report 2023 – Artificial Intelligence Index (stanford.edu) has confirmed the increasing need for toolkits, frameworks, and platforms to evaluate AI models and their level of explainability. It has become clear that reaching trusted AI requires a multi-perspective co-creation approach in which end-users and designers share the knowledge and experience they have gained. Evaluating explanation methods for AI systems then proceeds in two steps: first, a technical evaluation by the ML expert to assess reliability, and then subjective/qualitative feedback from the end-user. Within the iSee project, we advocate that past experience of applying an explanation method to certain use case domains can be reused to generate a new explanation strategy (i.e. a combination of explanation methods). To support this we have launched the iSee platform, which allows entering use cases about the need for explaining an AI system, ingesting new explainers, assessing their reusability on the use cases, and evaluating the application of explanation methods through personalised questions.


After its successful first edition at XCBR’22, we are organising a second contest on explainability using the platform we have launched in the iSee project. This year, the challenge focuses on evaluating how well explanation methods (new or existing) adapt to use cases. To that end, participants are advised to use the iSee tools to:

  1. Ingest a new explainer into the iSee Explainer Library (we will provide you with a Google Colab Template)
  2. Evaluate the explanation generated by the new explainer (or by one already in iSee) on a use case (either an existing iSee use case or a new one that you enter), using the iSee Cockpit use case manager and the iSee Cockpit conversational evaluation tool.
  3. Share your remarks about experiencing the use of iSee.

The iSee Cockpit is our graphical tool for modelling a use case and user intents, checking the recommended explanation strategies for the case (which may or may not include the newly added explainer), and entering the questions that the iSee evaluation chatbot will ask to guide the end-user through evaluating the explanation experience. The iSee team will offer all registered XCBR challenge participants a free online tutorial.

At every step, participants will have the option to be coached by an iSee researcher, who will act as a mentor should they need one.


Important Dates

  1. Expression of Interest deadline: 20th May 2023 (email to hello@isee4xai.com)
  2. Online tutorial for participants: 22nd – 26th May (1-hour slot)
  3. Submission deadline (report and explainer): 3rd July 2023
  4. XCBR challenge day: 17th July 2023
  5. iSee tutorial session on site at ICCBR: 18th July 2023
  6. iSee in Action stall at the Industry Day: 18th July 2023

XCBR Challenge Call


To complement the iSee Tutorial at ICCBR 2023, the iSee team is organising a dedicated tutorial to walk participants through the iSee tools. The tutorial will be a mix of practical activities (video and online demo) in which participants can interact with iSee industry users and researchers. The following topics will be covered:

  1. How to add a new explainer to the iSee catalogue (and validate that it follows the format of reusable explainers expected by iSee)
  2. How to create a new use case with the iSee Cockpit or amend an existing one (and which key points to pay attention to when modelling the AI model and explanation intents)
  3. How to retrieve a list of recommended explanation strategies and evaluate your own explainer.

You are encouraged to have a look at our HOW TO GUIDELINES as well.


Committee

Anne Liret, BT, France
Bruno Fleisch, BT, France
Chamath Palihawadana, Robert Gordon University, UK
Marta Caro-Martinez, Complutense University of Madrid, Spain
Ikechukwu Nkisi-Orji, Robert Gordon University, UK
Jesus Miguel Darias-Goitia, Complutense University of Madrid, Spain
Preeja Pradeep, University College Cork, Ireland



Prizes

There will be a selection of prizes distributed to all participating teams, with an extra prize for the top winning contribution, among the following:

  • Amazon vouchers
  • iSee certified practitioner T-shirt
  • Local Scottish food or objects
  • iSee XCBR challenge Stickers
  • iSee Coffee mug
  • Cloud computing voucher