AIAI Seminar: Monday 23rd February 2026 by Antonio Vergari

Title: Probabilistic Neuro-symbolic AI from first principles: from weighted model counts to neurosymbolic diffusion

Abstract: This talk should interest anyone working in probabilistic ML who wonders how to incorporate constraints into neural networks in a principled way. Such constraints can help when predicting trajectories, generating text with LLMs, or predicting links in a knowledge graph. In the talk, I will start by dissecting the workhorse of probabilistic neuro-symbolic predictors: computing the probability of a logical formula, also known as computing its weighted model count. I will then discuss the promises and challenges of computing this quantity exactly, and investigate how to handle these computations when constraints are defined over continuous variables. Finally, I will touch on how to proceed when exact computation is not possible, e.g., for diffusion models over discrete structures.
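
For readers unfamiliar with the central quantity mentioned in the abstract, here is a minimal brute-force sketch (not from the talk itself) of weighted model counting in the simple case where each Boolean variable is an independent Bernoulli: the WMC of a formula is the total probability mass of its satisfying assignments. The function names and the enumeration strategy are illustrative assumptions; practical neuro-symbolic systems compute this via compiled circuits, not enumeration.

```python
from itertools import product

def wmc(formula, weights):
    """Weighted model count by brute-force enumeration.

    formula: function mapping an assignment dict {var: bool} to True/False
    weights: dict {var: probability that var is True}, variables independent
    Returns the probability that the formula is satisfied, i.e. the sum
    of the weights of all satisfying assignments (models).
    """
    variables = list(weights)
    total = 0.0
    for bits in product([False, True], repeat=len(variables)):
        model = dict(zip(variables, bits))
        if formula(model):
            w = 1.0
            for v, val in model.items():
                w *= weights[v] if val else 1.0 - weights[v]
            total += w
    return total

# Example: P(A or B) with P(A)=0.3, P(B)=0.5
# = 1 - P(not A) * P(not B) = 1 - 0.7 * 0.5 = 0.65
p = wmc(lambda m: m["A"] or m["B"], {"A": 0.3, "B": 0.5})
```

Enumeration is exponential in the number of variables, which is exactly why the exact-computation promises and challenges discussed in the talk matter.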