Explainable AI using CBR Challenge 2023
Challenge 2023 schedule (NEW on 15-5-23! Timeline and submission) | CALL FOR PARTICIPATION | IMPORTANT DATES | TUTORIAL | PRIZES | COMMITTEE
The recently released AI Index Report 2023 – Artificial Intelligence Index (stanford.edu) has confirmed the increasing need for toolkits, frameworks, and platforms to evaluate AI models and their level of explainability. To reach the stage of trusted AI, it has become clear that a multi-perspective co-creation approach is needed, in which end users and designers share the knowledge and experience they have gained. Evaluating the explanation methods of AI systems would then proceed in two steps: first, a technical evaluation by the ML expert to assess reliability, and then subjective/qualitative feedback from the end user. Within the iSee project, we advocate that past experience of applying an explanation method in certain use-case domains can be reused to generate a new explanation strategy (i.e. a combination of explanation methods). To support this, we have launched the iSee platform, which allows users to enter use cases describing the need to explain an AI system, ingest new explainers, assess their reusability across use cases, and evaluate the application of explanation methods through personalised questions.
After its successful first edition at XCBR’22, we are organizing a second contest on explainability using the platform launched in the iSee project. This year, the challenge focuses on evaluating how well explanation methods (new or existing) adapt to use cases, and to that aim, participants are advised to use the iSee tools to:
The iSee cockpit is our graphical tool for modelling a use case and user intents, checking the recommended explanation strategies for the case (which may or may not include the newly added explainer), and entering the questions that the iSee evaluation chatbot will ask to guide the end user in evaluating the explanation experience. The iSee team will offer all registered participants of the XCBR challenge a free online tutorial.
At every step, participants will have the option of being coached by an iSee researcher acting as a mentor, should they need it.
XCBR Challenge Call
To complement the iSee Tutorial at ICCBR 2023, the iSee team organises a dedicated tutorial to walk participants through the iSee tools. The tutorial will be a mix of practical activities (video and online demo) in which participants can interact with iSee industry users and researchers. The following topics will be covered:
You are encouraged to have a look at our HOW TO GUIDELINES as well.
Anne Liret, BT, France
Bruno Fleisch, BT, France
Chamath Palihawadana, Robert Gordon University, UK
Marta Caro-Martinez, University Complutense of Madrid, Spain
Ikechukwu Nkisi-Orji, Robert Gordon University, UK
Jesus Miguel Darias-Goitia, University Complutense of Madrid, Spain
Preeja Pradeep, University College Cork, Ireland
There will be a selection of prizes distributed to all participating teams, with an extra prize for the top winning contribution, from among the following: