19 October 2020 - Ionela Mocanu, Chang Luo & Paulius Dilkas

 

Speaker 1: Ionela Mocanu

 

Title:  Polynomial-time Implicit Learnability in SMT

Abstract:

To deploy knowledge-based systems in the real world, the challenge of knowledge acquisition must be addressed. Knowledge engineering by hand is a daunting task, so machine learning has been widely proposed as an alternative. However, machine learning has difficulty acquiring rules that feature the kind of exceptions that are prevalent in real-world knowledge. Moreover, it is conjectured to be impossible to reliably learn representations featuring a desirable level of expressiveness. Works by Khardon and Roth and by Juba proposed solutions to such problems by learning to reason directly, bypassing the intractable step of producing an explicit representation of the learned knowledge. These works focused on Boolean, propositional logics. In this work, we consider such implicit learning to reason for arithmetic theories, including the logics handled by satisfiability modulo theories (SMT) solvers. We show that for standard fragments of linear arithmetic, we can learn to reason efficiently. These results are consequences of a more general finding: we show that there is an efficient reduction from the learning-to-reason problem for a logic to any sound and complete solver for that logic.
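
To give a flavour of the reduction, here is a minimal sketch (not the paper's algorithm) of answering a query implicitly from partial examples by calling an off-the-shelf SMT solver. The premise, the examples, the query and the crude "every example must agree" acceptance rule are all illustrative assumptions.

```python
# Toy sketch of implicit learning to reason over linear arithmetic,
# using the Z3 SMT solver as the black-box sound-and-complete reasoner
# that the reduction relies on. All inputs below are made up.
from z3 import Real, Solver, And, Not, unsat

x, y = Real('x'), Real('y')

premise = And(x + y <= 10, x >= 0)       # background knowledge we are given
examples = [And(x == 2, y <= 3),         # partial observations of the world
            And(x == 5, y <= 4)]
query = x + y <= 9                       # the query we want to certify

def entailed(premise, example, query):
    """Check premise AND example |= query by asking Z3 whether
    premise AND example AND NOT(query) is unsatisfiable."""
    s = Solver()
    s.add(premise, example, Not(query))
    return s.check() == unsat

# Accept the query only if every example supports it (a crude stand-in
# for the paper's acceptance criterion).
print(all(entailed(premise, ex, query) for ex in examples))
```

On these toy inputs the script prints True, since both partial examples already force x + y <= 9 under the premise; the point is only that no explicit representation of the learned knowledge is ever constructed, the solver is queried directly against the examples.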

Bio: Ionela is a PhD student interested in multi-agent systems and knowledge acquisition. The presentation will cover the main ideas of a paper that has been published at ECAI20.

 

Speaker 2: Chang Luo

 

Title: Pattern Recognition for Financial Sentiment Analysis

Abstract:

In this talk, I will introduce some of my ongoing research, which aims to decompose news and investor sentiment in finance using numerical and graph-representation-based techniques.

Bio: Chang is currently a second-year PhD student doing interdisciplinary research across the finance and computer science domains, with a particular interest in applying modern pattern recognition techniques to numerical analysis in finance.

 

Speaker 3: Paulius Dilkas

 

Title:  Generating Random Logic Programs Using Constraint Programming

Abstract:

Testing algorithms across a wide range of problem instances is crucial to ensure the validity of any claim about one algorithm's superiority over another. However, when it comes to inference algorithms for probabilistic logic programs, experimental evaluations are limited to only a few programs. Existing methods to generate random logic programs are limited to propositional programs and often impose stringent syntactic restrictions. We present a novel approach to generating random logic programs and random probabilistic logic programs using constraint programming, introducing a new constraint to control the independence structure of the underlying probability distribution. We also provide a combinatorial argument for the correctness of the model, show how the model scales with parameter values, and use the model to compare probabilistic inference algorithms across a range of synthetic problems. Our model allows inference algorithm developers to evaluate and compare the algorithms across a wide range of instances, providing a detailed picture of their (comparative) strengths and weaknesses. This talk describes a recent paper published at CP2020.
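
As a rough illustration of the underlying idea only (and very much not the model from the paper, which also handles probabilistic programs and constrains the independence structure of the induced distribution), the sketch below uses the OR-Tools CP-SAT solver to generate the structure of a tiny propositional program. The atoms, the body-size bounds and the head-not-in-body rule are made-up constraints for illustration.

```python
# Toy sketch: use constraint programming (OR-Tools CP-SAT) to pick which
# atoms head and occur in the body of each rule of a small program.
from ortools.sat.python import cp_model
import random

atoms = ['p', 'q', 'r', 's']
n_rules = 3
model = cp_model.CpModel()

# head[i][j] / body[i][j]: atom j is the head of / occurs in the body of rule i.
head = [[model.NewBoolVar(f'head_{i}_{a}') for a in atoms] for i in range(n_rules)]
body = [[model.NewBoolVar(f'body_{i}_{a}') for a in atoms] for i in range(n_rules)]

for i in range(n_rules):
    model.Add(sum(head[i]) == 1)        # exactly one head atom per rule
    model.Add(sum(body[i]) >= 1)        # a non-empty body...
    model.Add(sum(body[i]) <= 2)        # ...with at most two atoms
    for j in range(len(atoms)):
        # the head atom must not also appear in the body of its own rule
        model.AddBoolOr([head[i][j].Not(), body[i][j].Not()])

solver = cp_model.CpSolver()
solver.parameters.random_seed = random.randrange(1_000_000)  # randomise tie-breaking

status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i in range(n_rules):
        h = next(a for j, a in enumerate(atoms) if solver.Value(head[i][j]))
        b = [a for j, a in enumerate(atoms) if solver.Value(body[i][j])]
        print(f'{h} :- {", ".join(b)}.')
```

With only these constraints the solver will often return the same program every run, so to actually sample programs one would enumerate solutions or further randomise the search; the paper's model adds much richer structural constraints, including the new independence constraint described in the abstract.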

Bio: Paulius is a second-year PhD student working on inference algorithms for probabilistic relational and graphical models, including probabilistic logic programming and weighted model counting.