Explainable AI using CBR Challenge 2023


EXTENDED Submission deadline (explainer): 7th July 2023. Participants have until Friday 7th July to submit their Colab notebook with their explainer; the iSee team needs some time to add the explainer into the iSee library for you.

EXTENDED Submission deadline (report and iSee use case): 17th July 2023. Participants will have the opportunity to complete the set-up of their use case in the Cockpit and perform some evaluation of the explanations using the conversational tool. There will be time to complete your report or online questionnaire and submit it by 13:00.

SCHEDULE (full workshop schedule here)

On the day in Aberdeen, you will have the opportunity to finish your task, either in the iSee Cockpit (use case and evaluation) or by completing the questionnaire, during the first slot (11:30-12:00). Cockpit URL (use your user account): https://cockpit-dev.isee4xai.com/usecases
In the third slot (16:00-17:30) you will have the opportunity to share challenges and advice for enhancements or future applications (a 5-10 minute talk) and receive your prize. Everyone will win a participation prize!

11:30 – 12:00: Opening
12:00 – 13:00: Contest + questionnaire (Lab room)
16:00 – 17:30: Summary of the session, submission talks, award ceremony, Q&A

SUBMISSION GUIDELINES

You are asked to provide answers to the questionnaire we have compiled. You have the choice of writing a short report and submitting it by email to hello@isee4xai.com, or answering the online questionnaire directly at this link (online questionnaire). Your answers should give the jury the details needed to understand the following points:

1) the explainer, if you have proposed one: which data type it works on, which technologies/models it uses, and a brief description of its logic;
2) which intent/goal the explanation is expected to address (points that you would set up in the iSee Cockpit, for instance);
3) how the application of your explainer (explanation strategy) performs on the chosen use case;
4) how/whether you used the Q&A dialog manager to evaluate the explainer results.

QUESTIONNAIRE:

  1. EXPLAINER. Please tell us about the explainer if you have proposed one:
    • Explainer name, data type used, and which technologies/models does your explainer use?
    • Please add a brief description of the logic.
    • How have you tested your explainer? If on one particular problem, which one is it? Can we test it with the Colab notebook?
  2. SET UP A CASE OF EXPLANATION NEEDS:
    • Did you have a use case? If yes, which one?
    • Do you have any initial intents/goals for evaluating the result of the explanation that a user could expect? If yes, how did entering these into the iSee Cockpit go?
    • Have you entered new evaluation questions for your persona? If yes, which ones?
    • How much time have you spent entering the case in the Cockpit (AI model, persona, evaluation questions, link to explainer)? (half a day / 1 day / more / not done)
  3. EVALUATION OF EXPLANATION:
    • Was a strategy based on your explainer, or another existing one, recommended by the iSee Cockpit?
    • If not, did you manage to link the case to your explainer by adding it as a potential new explanation strategy (a hard task, which we can help with)? (yes/no)
    • When your explainer (or an existing one) was recommended by the iSee Cockpit, what was the highest similarity score?
    • Did you use the Q&A dialog manager to evaluate the explainer results? (yes/no)
    • How many conversations with the chatbot did you try? If none, please let us know why.
    • Were the chatbot responses meaningful to you? If not, please tell us why.
  4. USABILITY OF iSee:
    • Tell us how intuitive you found the user interface of the Cockpit.
    • Are the tooltips useful?
    • What would you advise changing in the user interface?

BACKGROUND

The recently released AI Index Report 2023 – Artificial Intelligence Index (stanford.edu) has confirmed the increasing need for toolkits, frameworks, and platforms to evaluate AI models and their level of explainability. It has become clear that reaching trusted AI requires a multi-perspective co-creation approach in which end-users and designers share the knowledge and experience they have gained. Evaluating explanation methods for AI systems would then proceed in two steps: first, a technical evaluation by the ML expert to assess reliability, and then subjective/qualitative feedback from the end-user. Within the iSee project, we advocate that past experience of using an explanation method on certain use case domains can be reused to generate a new explanation strategy (i.e. a combination of explanation methods). To support this we have launched the iSee platform, which allows entering use cases about the need for explaining an AI system, ingesting new explainers, assessing their reusability on the use cases, and evaluating the application of explanation methods through personalised questions.
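
To make the case-based reuse idea concrete, here is a minimal Python sketch of retrieving the explanation strategy of the most similar past use case. The attribute names and the similarity measure are hypothetical illustrations only; the actual iSee platform uses a much richer case representation and similarity model.

    from dataclasses import dataclass

    @dataclass
    class UseCase:
        # Hypothetical case representation: symbolic attributes describing
        # the AI model to be explained and the user's explanation intent.
        data_type: str      # e.g. "tabular", "image", "text"
        ai_task: str        # e.g. "classification", "regression"
        intent: str         # e.g. "transparency", "trust"
        strategy: list      # explanation methods applied in this past case

    def similarity(a: UseCase, b: UseCase) -> float:
        # Toy measure: the fraction of matching attributes.
        attrs = ("data_type", "ai_task", "intent")
        return sum(getattr(a, k) == getattr(b, k) for k in attrs) / len(attrs)

    def recommend(query: UseCase, case_base: list) -> tuple:
        # Reuse the strategy of the most similar past case, with its score.
        best = max(case_base, key=lambda c: similarity(query, c))
        return best.strategy, similarity(query, best)

    case_base = [
        UseCase("tabular", "classification", "transparency", ["LIME", "counterfactuals"]),
        UseCase("image", "classification", "trust", ["Grad-CAM"]),
    ]
    query = UseCase("tabular", "classification", "transparency", strategy=[])
    print(recommend(query, case_base))  # (['LIME', 'counterfactuals'], 1.0)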

CALL FOR PARTICIPATION

After its successful first edition at XCBR’22, we are organising a second contest on explainability using the platform we have launched in the iSee project. This year, the challenge focuses on evaluating how well explanation methods (new or existing) adapt to use cases, and to that aim, participants are advised to use the iSee tools to:

  1. Ingest a new explainer into the iSee Explainer Library (we will provide you with a Google Colab template; see the sketch after this list).
  2. Evaluate the explanation generated by the new explainer (or by one already in iSee) on a use case (either one existing in iSee or a new one that you can enter), using the iSee Cockpit use case manager and the iSee Cockpit conversational evaluation tool.
  3. Share your remarks about your experience of using iSee.
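
To give a feel for task 1, the sketch below shows a minimal tabular explainer that could live in a Colab notebook. The entry-point name and signature are hypothetical; the Google Colab template we provide defines the exact interface that the iSee Explainer Library expects. This example assumes the lime and scikit-learn packages.

    # Hedged sketch: the function name and its signature are illustrative,
    # not the interface required by the iSee Explainer Library.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from lime.lime_tabular import LimeTabularExplainer

    data = load_iris()
    model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

    lime_explainer = LimeTabularExplainer(
        data.data,
        feature_names=data.feature_names,
        class_names=list(data.target_names),
        mode="classification",
    )

    def explain_instance_weights(instance: np.ndarray) -> list:
        # Return (feature, weight) pairs explaining the model's prediction
        # for a single instance.
        explanation = lime_explainer.explain_instance(
            instance, model.predict_proba, num_features=4
        )
        return explanation.as_list()

    print(explain_instance_weights(data.data[0]))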

The iSee Cockpit is our graphical tool for modelling a use case and user intents, checking the recommended explanation strategies for the case (which may or may not include the newly added explainer), and entering the questions that the iSee evaluation chatbot will ask to guide the end-user in evaluating the explanation experience. The iSee team will propose that all registered participants of the XCBR challenge follow a free online tutorial.

At every step, participants will have the option of being coached by an iSee researcher, acting as a mentor, should they need help.

IMPORTANT DATES

  1. Expression of Interest Deadline: 20th May 2023 (email to hello@isee4xai.com)
  2. Online Tutorial for participants: 22nd – 26th May (1 hour slot)
  3. EXTENDED Submission deadline (explainer): 7th July 2023
  4. EXTENDED Submission deadline (report and iSee use case): 17th July 2023
  5. XCBR challenge day: 17th July 2023: Contest 11:30 – 13:30 / Workshop talks 14:00 – 15:30 / Awards 16:00 – 17:30
  6. iSee Tutorial session on site at ICCBR: 18th July 2023, 9:00 to 11:00

XCBR Challenge Call

iSee TUTORIAL

To complement the iSee Tutorial at ICCBR 2023, the iSee team is organising a dedicated tutorial to walk participants through the iSee tools. The tutorial will be a mix of practical activities (video and online demo) in which participants can interact with iSee industry users and researchers. The following topics will be explained:

  1. How to add a new explainer into the iSee catalogue (and validate that it follows the format of reusable explainers expected by iSee)
  2. How to create a new use case with the iSee Cockpit, or amend an existing one (and the key points to pay attention to when modelling the AI model and explanation intents)
  3. How to retrieve a list of recommended explanation strategies and evaluate your own explainer.

You are encouraged to have a look at our HOW TO GUIDELINES as well.

COMMITTEE

Anne Liret, BT, France
Bruno Fleisch, BT, France
Chamath Palihawadana, Robert Gordon University, UK
Marta Caro-Martinez, Complutense University of Madrid, Spain
Ikechukwu Nkisi-Orji, Robert Gordon University, UK
Jesus Miguel Darias-Goitia, Complutense University of Madrid, Spain
Preeja Pradeep, University College Cork, Ireland


PRIZES

There will be a selection of prizes distributed to all participating teams, with an extra prize for the top winning contribution, from among the following:

  • Amazon vouchers
  • iSee certified practitioner T-shirt
  • Local Scottish food or items
  • iSee XCBR challenge stickers
  • iSee coffee mug
  • Cloud computing voucher