iSee: Intelligent Sharing of Explanation Experience by Users for Users

A right to obtain an explanation of the decision reached by a machine learning (ML) model is now enshrined in EU regulation. Different stakeholders (e.g. managers, developers, auditors) have different background knowledge, competencies and goals, and thus require different kinds of interpretations and explanations. Fortunately, there is a growing armoury of ways of interpreting ML models and explaining their predictions, recommendations and diagnoses. We refer to these collectively as explanation strategies. As these explanation strategies mature, practitioners will gain experience that helps them know which strategies to deploy in different circumstances. What is lacking, and what we propose addressing, is the science and technology for capturing, sharing and re-using explanation strategies based on similar user experiences, along with a much-needed route to explainable AI (XAI) compliance. Our vision is to improve every user’s experience of AI by harnessing experiences of best practice in XAI by users for users.

Work Packages

Work Package 1: Ontology and Data Modelling

This WP develops and maintains iSeeOnto, the iSee ontology. We define the iSeeOnto requirements, conceptualisation, implementation, case representation, evaluation and maintenance. iSeeOnto models explanation strategies and user experiences, providing a standard representation formalism across the iSee platform; it is evaluated and adapted throughout the project to reflect advances in the use cases.
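To make the idea of a case representation concrete, the sketch below models one explanation experience as a structured record. This is illustrative only: the field names are assumptions for the example, not the actual iSeeOnto schema, which is defined by the WP1 deliverables.

```python
from dataclasses import dataclass

# Hypothetical case structure; the real representation is defined by iSeeOnto.
@dataclass
class ExplanationCase:
    """One explanation experience, stored as a reusable case."""
    user_role: str        # e.g. "auditor", "developer", "manager"
    ai_task: str          # the AI system's task, e.g. "loan approval"
    intent: str           # what the user wanted explained
    strategy: list[str]   # ordered explanation techniques applied
    outcome_score: float  # how well the strategy satisfied the user

case = ExplanationCase(
    user_role="auditor",
    ai_task="loan approval",
    intent="why was this application rejected?",
    strategy=["counterfactual", "feature-importance"],
    outcome_score=0.8,
)
```

A shared, formal representation of records like this is what allows experiences captured by one user to be retrieved and reused by another.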

Work Package 2: iSee Platform

This WP develops successive versions of the iSee platform. It defines the architecture, technologies, testing plans and external APIs, and integrates key platform components to achieve a fully functional CBR engine able to retrieve, reuse, revise and retain explanation experiences.
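The retrieve/reuse/revise/retain steps above are the classic four-phase CBR cycle. The minimal sketch below runs one pass of that cycle over toy cases; the dictionary case format, the attribute-overlap similarity measure and the feedback handling are simplifying assumptions, not the iSee platform's actual design.

```python
# One pass of the 4R CBR cycle (retrieve, reuse, revise, retain)
# over toy explanation-experience cases.

def similarity(query: dict, case: dict) -> float:
    """Fraction of query attributes that the stored case's problem matches."""
    keys = query.keys()
    return sum(query[k] == case["problem"].get(k) for k in keys) / len(keys)

def cbr_cycle(query: dict, case_base: list, feedback_score: float) -> list:
    # Retrieve: the most similar prior explanation experience
    best = max(case_base, key=lambda c: similarity(query, c))
    # Reuse: adopt its explanation strategy for the new problem
    solution = list(best["strategy"])
    # Revise: attach user feedback; a real system would also adapt the steps
    new_case = {"problem": query, "strategy": solution, "score": feedback_score}
    # Retain: keep the revised experience for future retrieval
    case_base.append(new_case)
    return solution

case_base = [
    {"problem": {"role": "developer", "task": "vision"},
     "strategy": ["saliency-map"], "score": 0.9},
    {"problem": {"role": "auditor", "task": "credit"},
     "strategy": ["counterfactual"], "score": 0.7},
]
plan = cbr_cycle({"role": "auditor", "task": "credit"}, case_base, 0.8)
# plan is ["counterfactual"], and the case base has grown by one experience
```

Note how retention makes the system self-improving: each cycle adds a new experience that later retrievals can draw on.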

Work Package 3: Constructing Explanation Strategies

In WP3, we will demonstrate the construction of explanations based on interactions with designer-users and, relatedly, the combination of previous explanation experiences to form new composite strategies. WP3 is central and interacts with all other WPs.
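As a toy illustration of combining previous experiences into a composite strategy, the sketch below sequences the steps of two prior strategies while dropping duplicates. The actual constructive-adaptation methods are what WP3 develops; this de-duplicating concatenation is only an assumed stand-in.

```python
# Illustrative composition of two explanation strategies into one
# composite strategy, keeping the first occurrence of each step.

def compose(strategy_a: list, strategy_b: list) -> list:
    seen, composite = set(), []
    for step in strategy_a + strategy_b:
        if step not in seen:
            seen.add(step)
            composite.append(step)
    return composite

composite = compose(["feature-importance", "counterfactual"],
                    ["counterfactual", "example-based"])
# composite sequences the three distinct steps in order of first appearance
```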

Work Package 4: iSee Evaluation Cockpit

The Evaluation Cockpit provides an interactive environment for both designers and end-users to express intent and to create, evaluate and manage their portfolio of explanation evaluation projects. This WP develops a personalised interaction strategy for the cockpit; captures users' intent aligned with iSeeOnto concepts; identifies (mis)matched expectations to facilitate case retrieval; provides explanation analytics for use cases; and integrates with the iSee platform.

Work Package 5: Use Case Management

WP5 provides a participatory approach to integrating explainability into new and existing AI systems. We manage use case stakeholder activities (end-users, design users, framework contributors, policy makers); co-design the iSee platform user experience and evaluation measures; validate acceptability and suitability measures; and address data ownership, ethics and interoperability issues of the iSee cockpit (WP4) with AI systems.

Work Package 6: Impact Management and Open Science

WP6 organises a range of activities to raise awareness, attract external use cases, build consensus and create visible, measurable impact with all stakeholders. This includes: exploring XAI standards and certification requirements at the national and EU level, supported by validation against iSee industry use cases; managing the dissemination and exploitation strategy with oversight by the advisory committee; and developing a business and innovation plan with sight of a go-to-market strategy.

Work Package 7: Project Management

This WP manages and coordinates the project partners. It monitors progress and scheduling to ensure delivery of the scientific and technical objectives on time and on budget, with appropriate quality, ethical and intellectual property management and standards; maintains effective communication, documentation and archives of results and deliverables; prepares and submits periodic reports; monitors risks and prepares contingency plans; and realigns work packages and tasks where necessary.


Deliverables

  • D1.1: Ontology requirements specification document (RGU, R, CO)
  • D1.2: iSeeOnto conceptual model (UCM, R, CO)
  • D1.3: iSeeOnto v1 (UCM, SW, PU)
  • D1.4: Case representation (RGU, SW+R, CO)
  • D1.5: iSeeOnto revisions and final version (RGU, SW+R, PU)
  • D2.1a: Software Development Plan (SDP) (UCM, R, CO)
  • D2.1b: API for external contributions (UCM, SW, PU)
  • D2.2a: QA reports (UCM, R, PU)
  • D2.2b: Final iSee Platform (iSee v4) (BTF, SW, PU)
  • D2.3: Retrieval-only iSee CBR system (iSee v1) (UCM, SW, PU)
  • D2.4: Retrieval from Pre-defined Interactive Case Base (iSee v2) (UCC, SW, PU)
  • D2.5: Constructive Adaptation and Case Retention (iSee v3) (RGU, SW, PU)
  • D3.1: Similarity metrics and retrieval algorithms (UCC, R+SW, PU)
  • D3.2: Seed case base (BTF, R+SW, PU)
  • D3.3: Design of composite strategies (UCC, R+SW, CO)
  • D3.4: Composite strategies (UCM, R+SW, PU)
  • D3.5: Revision & Retention algorithms (RGU, R+SW, PU)
  • D4.1: Interaction engine (RGU, SW, PU)
  • D4.2: Evaluation module (RGU, SW, PU)
  • D4.3: Intent understanding module (BTF, SW, PU)
  • D4.4: Analytics dashboard (RGU, DEM, PU)
  • D4.5a: Cockpit integration interface (UCM, SW, PU)
  • D4.5b: Cockpit user manual (RGU, DEC, PU)
  • D4.5c: Knowledge discovery in the cockpit (RGU, R, PU)
  • D5.1a: Systematic literature review on evaluation instruments (RGU, R, PU)
  • D5.1b: Evaluation criteria (BTF, R, PU+CO)
  • D5.2: Cockpit UI and UX Design Guidelines (BTF, R, CO)
  • D5.3a: Use case specification & co-creation activity schedule (BTF, DEM, CO)
  • D5.3b: Recruitment Plan (BTF, DEM+R, CO)
  • D5.4: Use case Ethics and data compliance framework (RGU, DEM, CO)
  • D5.5: Use case integration API (UCM, SW, CO)
  • D6.1: iSee XAI website and community forum (BTF, OTHER, PU)
  • D6.2: Regulatory Framework Recommendation Proposal (RGU, R, CO)
  • D6.3a: Intellectual Knowledge Management (BTF, R, CO)
  • D6.3b: Dissemination and communication plan (BTF, R, PU)
  • D6.3c: Exploitation Plan (BTF, R, PU)
  • D6.3d: X-CBR Challenge Workshops (UCM, OTHER, PU)
  • D6.4: Go-to-market strategy for XAI certification (BTF, R, PU)
  • D7.1a: Project Handbook (UCM, R, CO)
  • D7.1b: Annual activity reports (UCM, R, CO)
  • D7.2: Quality plan (BTF, R, CO)
  • D7.4a: Consortium Agreement (BTF, R, CO)
  • D7.4b: Data Management Plan (DMP) (UCM, R, CO)