Accepted Papers
- A Joint Learning Approach for Semi-supervised Neural Topic Modeling.
Jeffrey Chiu, Rajat Mittal, Neehal Tumma, Abhishek Sharma and Finale Doshi-Velez
- SlotGAN: Detecting Mentions in Text via Adversarial Distant Learning.
Daniel Daza, Michael Cochez and Paul Groth
- TempCaps: A Capsule Network-based Embedding Model for Temporal Knowledge Graph Completion.
Guirong Fu, Zhao Meng, Zhen Han, Zifeng Ding, Yunpu Ma, Matthias Schubert, Volker Tresp and Roger Wattenhofer
- Joint Entity and Relation Extraction Based on Table Labeling Using Convolutional Neural Networks.
Youmi Ma, Tatsuya Hiraoka and Naoaki Okazaki
- Multilingual Syntax-aware Language Modeling through Dependency Tree Conversion.
Shunsuke Kando, Hiroshi Noji and Yusuke Miyao
- Predicting Attention Sparsity in Transformers.
Marcos Vinicius Treviso, António Góis, Patrick Fernandes, Erick Rocha Fonseca and Andre Martins
- Neural String Edit Distance.
Jindřich Libovický and Alexander Fraser
Non-Archival Papers to be Presented at SPNLP 2022
- DomiKnowS: A Library for Integration of Symbolic Domain Knowledge in Deep Learning.
Hossein Rajaby Faghihi, Quan Guo, Andrzej Uszok, Aliakbar Nafar and Parisa Kordjamshidi
- Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors.
Wanyu Du, Jianqiao Zhao, Liwei Wang and Yangfeng Ji
- Conditioning Pretrained Language Models with Multi-Modal Information on Data-to-Text Generation.
Qianqian Qi, Zhenyun Deng, Yonghua Zhu, Lia Lee, Jiamou Liu and Michael J. Witbrock
- Language Modelling via Learning to Rank.
Arvid Frydenlund, Gagandeep Singh and Frank Rudzicz
- Query and Extract: Refining Event Extraction as Type-oriented Binary Decoding.
Sijia Wang, Mo Yu, Shiyu Chang, Lichao Sun and Lifu Huang
- Extracting Temporal Event Relation with Syntax-guided Graph Transformer.
Shuaicheng Zhang, Qiang Ning and Lifu Huang