Schedule 2024

9:15-9:30 – Welcome

9:30-10:30 – Keynote Speaker: Zied Bouraoui
Learning Semantic Concept Embedding from Language Models

10:30-11:00 – Coffee break

11:00-12:30 – Session 1 (20-minute presentation + 10-minute discussion)

  • Transformer-based hierarchical attention models for solving analogy puzzles between longer, lexically richer and semantically more diverse sentences
  • Learning Analogies between Classes to Create Counterfactual Explanations
  • Enhancing Analogical Reasoning in the Abstraction and Reasoning Corpus via Model-Based RL

12:30-14:30 – Lunch break

14:30-15:30 – Keynote: Emiliano Lorini
A Novel View of Analogical Proportion between Formulas

15:30-16:00 – Coffee break

16:00-17:30 – Session 2 (20-minute presentation + 10-minute discussion)

  • Towards a unified framework of numerical analogies: Open questions and perspectives
  • Testing proportional sentence analogies on SATS: from vector offsets to conditional generation
  • Probing Large Language Models to Perform Analogical Transformation