Dana Fisman ; Dolav Nitay ; Michal Ziv-Ukelson - Learning of Structurally Unambiguous Probabilistic Grammars

lmcs:9223 - Logical Methods in Computer Science, February 8, 2023, Volume 19, Issue 1 - https://doi.org/10.46298/lmcs-19(1:10)2023

    The problem of identifying a probabilistic context-free grammar has two aspects: the first is determining the grammar's topology (the rules of the grammar) and the second is estimating probabilistic weights for each rule. Given the hardness results for learning context-free grammars in general, and probabilistic grammars in particular, most of the literature has concentrated on the second problem. In this work we address the first problem. We restrict attention to structurally unambiguous weighted context-free grammars (SUWCFGs) and provide a query learning algorithm for structurally unambiguous probabilistic context-free grammars (SUPCFGs). We show that SUWCFGs can be represented using co-linear multiplicity tree automata (CMTAs), and provide a polynomial learning algorithm that learns CMTAs. We show that the learned CMTA can be converted into a probabilistic grammar, thus providing a complete algorithm for learning a structurally unambiguous probabilistic context-free grammar (both the grammar topology and the probabilistic weights) using structured membership queries and structured equivalence queries. A summarized version of this work was published at AAAI 21.


    Volume: 19, Issue 1
    Published on: February 8, 2023
    Accepted on: December 4, 2022
    Submitted on: March 18, 2022
    Keywords: Computer Science - Logic in Computer Science, Computer Science - Artificial Intelligence
