Yoon Kim, Chris Dyer, Alexander M. Rush
We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar. In contrast to traditional formulations which learn a single stochastic grammar, our grammar's rule probabilities are modulated by a per-sentence continuous latent variable, which induces marginal dependencies beyond the traditional context-free assumptions. Inference in this grammar is performed by collapsed variational inference, in which an amortized variational posterior is placed on the continuous variable, and the latent trees are marginalized out with dynamic programming. Experiments on English and Chinese show the effectiveness of our approach compared to recent state-of-the-art methods when evaluated on unsupervised parsing.
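For intuition, below is a minimal NumPy sketch (not the authors' released code) of the two pieces the abstract describes: rule probabilities modulated by a per-sentence latent variable z, and the inside (CKY) dynamic program that marginalizes out the latent trees. All sizes and parameter names (`NT`, `T`, `W_bin`, etc.) are illustrative assumptions; the paper parameterizes rules with neural networks over symbol embeddings and trains with an amortized variational posterior q(z|x) rather than the prior samples used here.

```python
import numpy as np

rng = np.random.default_rng(0)

NT, T, V, z_dim = 4, 3, 10, 8  # nonterminals, preterminals, vocab size, latent dim
S = NT + T                     # all symbols: 0..NT-1 nonterminal, NT..S-1 preterminal

# Hypothetical parameters: a single linear map of z stands in for the
# paper's neural parameterization of rule scores.
W_bin  = rng.normal(size=(NT, S * S, z_dim)) * 0.1
b_bin  = rng.normal(size=(NT, S * S))
W_emit = rng.normal(size=(T, V, z_dim)) * 0.1
b_emit = rng.normal(size=(T, V))
W_root = rng.normal(size=(NT, z_dim)) * 0.1
b_root = rng.normal(size=NT)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def rule_probs(z):
    """Per-sentence rule probabilities pi_z = softmax(f(z))."""
    binary = softmax(W_bin @ z + b_bin).reshape(NT, S, S)  # A -> B C
    emit   = softmax(W_emit @ z + b_emit)                  # preterminal -> word
    root   = softmax(W_root @ z + b_root)                  # S -> A
    return binary, emit, root

def log_inside(sent, z):
    """log p(sent | z): inside algorithm, summing over all binary trees.
    (Real implementations work in log space for numerical stability.)"""
    binary, emit, root = rule_probs(z)
    n = len(sent)
    beta = np.zeros((n, n, S))           # beta[i, j, A] = p(A derives words i..j)
    for i, w in enumerate(sent):
        beta[i, i, NT:] = emit[:, w]     # width-1 spans: lexical emissions
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width - 1
            for k in range(i, j):        # sum over split points and rules
                beta[i, j, :NT] += np.einsum('abc,b,c->a',
                                             binary, beta[i, k], beta[k + 1, j])
    return np.log(root @ beta[0, n - 1, :NT])

# Monte Carlo estimate of log p(sent) = log E_{z ~ N(0, I)}[p(sent | z)]
sent = rng.integers(V, size=5)
samples = [log_inside(sent, rng.normal(size=z_dim)) for _ in range(8)]
print(np.logaddexp.reduce(samples) - np.log(len(samples)))
```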
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Constituency Parsing | Penn Treebank | Max F1 (WSJ) | 60.1 | Compound PCFG |
| Constituency Parsing | Penn Treebank | Max F1 (WSJ10) | 68.8 | Compound PCFG |
| Constituency Parsing | Penn Treebank | Mean F1 (WSJ) | 55.2 | Compound PCFG |
| Constituency Parsing | Penn Treebank | Max F1 (WSJ) | 52.6 | Neural PCFG |
| Constituency Parsing | Penn Treebank | Mean F1 (WSJ) | 50.8 | Neural PCFG |
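The F1 values above are unlabeled bracketing F1 scores on sentence-level constituent spans; "Mean" and "Max" are taken over multiple independent training runs. A minimal sketch of the per-sentence score, under the common convention (assumed here) that trivial spans are dropped before scoring:

```python
def sentence_f1(pred_spans, gold_spans):
    """Unlabeled bracketing F1 for one sentence, over sets of (i, j) spans.
    Trivial spans (single words, whole sentence) are assumed already removed,
    as is conventional in this evaluation setting."""
    pred, gold = set(pred_spans), set(gold_spans)
    if not pred and not gold:
        return 1.0
    tp = len(pred & gold)        # spans predicted and also in the gold tree
    if tp == 0:
        return 0.0
    p, r = tp / len(pred), tp / len(gold)
    return 2 * p * r / (p + r)

# "Mean F1" averages corpus F1 over runs; "Max F1" reports the best single run.
print(sentence_f1({(0, 2), (3, 5)}, {(0, 2), (2, 5)}))  # 0.5
```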