AIAI Seminar: Monday 9th February 2026 by Mark Chevallier

Speaker: Mark Chevallier
Title: GradSTL: Comprehensive Signal Temporal Logic for Neurosymbolic Reasoning and Learning
Date: Monday 9 February 2026, 14.00 - 15.00
Venue: Informatics Forum, G.03

Abstract: We present GradSTL, the first fully comprehensive implementation of signal temporal logic (STL) suitable for integration with neurosymbolic learning. In particular, GradSTL can successfully evaluate any STL constraint over any signal, regardless of how it is sampled. Our formally verified approach specifies smooth STL semantics over tensors, with formal proofs of soundness and of the correctness of its derivative function. Our implementation is generated automatically from this formalisation, without manual coding, guaranteeing correctness by construction. We show via a case study that, using our implementation, an optimiser learns to satisfy a pre-specified STL constraint. Our approach offers a highly rigorous foundation for integrating signal temporal logic with learning by gradient descent.

During the presentation, we will discuss why STL is important for learning functions of continuous time, as well as its limitations. We will also discuss the verification and code generation methods we use to ensure the correctness of the implementation.

This is joint work with Filip Smola, Richard Schmoetten and Jacques Fleuriot. It was presented in August 2025 at the 32nd International Symposium on Temporal Representation and Reasoning. The published paper is available at https://doi.org/10.4230/LIPIcs.TIME.2025.6
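To give a flavour of the gradient-descent side of this line of work, the sketch below shows a smooth (differentiable) robustness measure for the simple STL constraint "always (x > 0)" over a sampled signal, optimised with PyTorch. This is an illustrative sketch only, not GradSTL's generated code or API: the softmin approximation, the function names, the temperature and the optimiser settings are assumptions made here for illustration.

    # Illustrative sketch: differentiable robustness for "always (x > 0)"
    # optimised by gradient descent. Not GradSTL code; names are assumed.
    import torch

    def soft_min(values, temperature=10.0):
        # Smooth lower bound on min via -logsumexp(-t*v)/t, so gradients
        # flow through every sample rather than only the minimising one.
        return -torch.logsumexp(-temperature * values, dim=0) / temperature

    def robustness_always_positive(signal):
        # Quantitative semantics of "always (x > 0)": the (soft) minimum of
        # the signal; a positive value means the constraint is satisfied.
        return soft_min(signal)

    # A freely optimisable signal of 50 samples, initialised to violate the spec.
    signal = torch.nn.Parameter(torch.linspace(-1.0, 1.0, 50))
    optimiser = torch.optim.Adam([signal], lr=0.05)

    for step in range(200):
        optimiser.zero_grad()
        # Penalise only negative robustness, i.e. violation of the constraint.
        loss = torch.relu(-robustness_always_positive(signal))
        loss.backward()
        optimiser.step()

    print(float(robustness_always_positive(signal)))  # > 0 once the spec is met

The softmin here under-approximates the true minimum, so a positive soft robustness still implies the hard constraint holds; the talk addresses how GradSTL handles such semantics, and their derivatives, with formal proofs rather than ad hoc choices like the ones above.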