no code implementations • 16 May 2024 • Milan Bhan, Jean-Noel Vittaut, Nina Achache, Victor Legrand, Nicolas Chesneau, Annabelle Blangero, Juliette Murris, Marie-Jeanne Lesot
In this work, we propose to apply counterfactual generation methods from the eXplainable AI (XAI) field to target and mitigate textual toxicity.
no code implementations • 8 May 2024 • Audrey Poinsot, Alessandro Leite, Nicolas Chesneau, Michèle Sébag, Marc Schoenauer
This paper provides a comprehensive review of deep structural causal models (DSCMs), particularly focusing on their ability to answer counterfactual queries using observational data within known causal structures.
no code implementations • 18 Mar 2024 • Natalia De La Calzada, Théo Alves Da Costa, Annabelle Blangero, Nicolas Chesneau
This research paper investigates public views on climate change and biodiversity loss by analyzing questions asked to the ClimateQ&A platform.
no code implementations • 19 Feb 2024 • Milan Bhan, Jean-Noel Vittaut, Nicolas Chesneau, Marie-Jeanne Lesot
Incorporating natural language rationales into the prompt and using In-Context Learning (ICL) have led to significant improvements in Large Language Model (LLM) performance.
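The prompting setup the abstract refers to can be sketched as follows. This is a minimal illustration of pairing each in-context demonstration with a natural-language rationale; the helper function, demonstrations, and rationales are illustrative assumptions, not the paper's actual prompts.

```python
# Hedged sketch: an in-context learning (ICL) prompt where each
# demonstration carries a natural-language rationale before its answer.
# Function name, demos, and rationales are illustrative, not from the paper.

def build_icl_prompt(demos, query):
    """Assemble a few-shot prompt with a rationale per demonstration."""
    parts = []
    for question, rationale, answer in demos:
        parts.append(f"Q: {question}\nRationale: {rationale}\nA: {answer}")
    # End with the query and an open "Rationale:" cue for the model to complete.
    parts.append(f"Q: {query}\nRationale:")
    return "\n\n".join(parts)

demos = [
    ("Is 14 even?", "14 = 2 * 7, so it is divisible by 2.", "yes"),
    ("Is 9 even?", "9 = 2 * 4 + 1, leaving a remainder of 1.", "no"),
]
prompt = build_icl_prompt(demos, "Is 22 even?")
```

The model is then expected to continue the prompt with its own rationale and answer, mirroring the demonstrated format.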
no code implementations • 24 Apr 2023 • Milan Bhan, Jean-Noel Vittaut, Nicolas Chesneau, Marie-Jeanne Lesot
Counterfactual examples explain a prediction by highlighting minimal changes to an instance that flip the classifier's outcome.
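The idea of a counterfactual example can be illustrated with a toy sketch: search for the smallest change to one feature that flips a classifier's decision. The classifier, feature names, and search procedure below are illustrative assumptions, not the method proposed in the paper.

```python
# Hedged sketch of the counterfactual-example idea: find a minimal edit
# to an input that flips a toy classifier's prediction.

def classify(income, debt):
    """Toy loan classifier: approve when income comfortably exceeds debt."""
    return "approved" if income - debt >= 20 else "rejected"

def counterfactual_income(income, debt, step=1, max_steps=1000):
    """Smallest income increase that flips the original outcome."""
    original = classify(income, debt)
    for k in range(1, max_steps + 1):
        if classify(income + k * step, debt) != original:
            return income + k * step  # minimal flipping value found
    return None  # no counterfactual within the search budget

# An applicant rejected at income=50, debt=40 flips to approved at income=60.
flip = counterfactual_income(50, 40)
```

The returned value ("you would have been approved with an income of 60") is the explanation: it names the change that would have altered the decision.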
no code implementations • 27 Mar 2023 • Milan Bhan, Nina Achache, Victor Legrand, Annabelle Blangero, Nicolas Chesneau
A human-grounded experiment is conducted to evaluate and compare CLS-A to other interpretability methods.
no code implementations • 19 Jul 2017 • Nicolas Chesneau, Grégory Rogez, Karteek Alahari, Cordelia Schmid
In this paper, we propose a new framework for action localization that tracks people in videos and extracts full-body human tubes, i.e., spatio-temporal regions localizing actions, even in the case of occlusions or truncations.