iarml - Workshop 2024

ACCEPTED PAPERS

  • Transformer-based hierarchical attention models for solving analogy puzzles between longer, lexically richer and semantically more diverse sentences (Benming Yan, Haotong Wang, Liyan Wang, Yifei Zhou, Yves Lepage)
  • Learning Analogies between Classes to Create Counterfactual Explanations (Xiaomeng Ye, David Leake, Yu Wang, Ziwei Zhao, David Crandall)
  • Enhancing Analogical Reasoning in the Abstraction and Reasoning Corpus via Model-Based RL (Jihwan Lee, Woochang Sim, Sejin Kim, Sundong Kim)
  • Towards a unified framework of numerical analogies: Open questions and perspectives (Yves Lepage, Miguel Couceiro)
  • Testing proportional sentence analogies on SATS: from vector offsets to conditional generation (Yves Blain-Montesano, Philippe Langlais)
  • Probing Large Language Models to Perform Analogical Transformation (François Olivier, Miguel Couceiro, Zied Bouraoui)


  • Key dates

    May 10, 2024:
    Paper Due Date
    June 4, 2024:
    Paper Notification
    June 25, 2024:
    Camera-ready
    August 3-5, 2024:
    Workshop IARML@IJCAI 2024
  • Past Event

    ARCHIVES – 2023
    Welcome – Schedule

    Accepted Papers

    Plenary Speakers

    Call & Submission

  • Supported by

IARML-Loria-MR@24