Compositionality in Emerging Multi-agent Languages: Marrying Language Evolution and Natural Language
The mainstream approach in NLP is to train systems on large amounts of data. Such passive learning contrasts with the way humans acquire language: human language is learnt within communities, transmitted culturally, and changing dynamically. These evolutionary mechanisms have been studied extensively in the field of Language Evolution. Despite limited prior interaction between the fields, such mechanisms are now increasingly incorporated into NLP systems. The resulting models have the potential both to study the evolution of language in more naturalistic settings, using multi-agent simulations with state-of-the-art (deep) learning systems, and to improve NLP systems by letting language emerge organically. We examine how the findings from a model by Havrylov & Titov (2017) compare to those from traditional Language Evolution models, and we quantify the emerging compositionality using an existing Language Evolution method (Tamariz, 2011). This approach reveals novel insights into the generated data, the applied methodology, and the nature of compositionality.
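To give a concrete sense of what "quantifying compositionality" can mean in this setting, the sketch below implements topographic similarity, a measure commonly used in emergent-language work: the correlation between pairwise distances in meaning space and pairwise distances between the corresponding messages. This is an illustrative stand-in, not necessarily the Tamariz (2011) method used in the paper, and the toy lexicon is hypothetical.

```python
# Illustrative compositionality measure ("topographic similarity"):
# if similar meanings get similar messages, the correlation between
# meaning distances and message distances is high.
# NOTE: this is a generic sketch, not the specific Tamariz (2011) method.
from itertools import combinations


def levenshtein(a: str, b: str) -> int:
    """Edit distance between two message strings."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]


def hamming(a, b) -> int:
    """Distance between two meanings (tuples of attribute values)."""
    return sum(x != y for x, y in zip(a, b))


def pearson(xs, ys) -> float:
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def topographic_similarity(meanings, messages) -> float:
    pairs = list(combinations(range(len(meanings)), 2))
    meaning_d = [hamming(meanings[i], meanings[j]) for i, j in pairs]
    message_d = [levenshtein(messages[i], messages[j]) for i, j in pairs]
    return pearson(meaning_d, message_d)


# Hypothetical toy lexicon: each attribute maps to its own symbol,
# so the language is perfectly compositional.
meanings = [("red", "circle"), ("red", "square"),
            ("blue", "circle"), ("blue", "square")]
messages = ["ra", "rb", "ba", "bb"]
print(round(topographic_similarity(meanings, messages), 2))  # → 1.0
```

A holistic language, where messages are unrelated to meaning structure, would score near zero; degrees of partial compositionality fall in between.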
Kees Sommer, Jae Perris, Arianna Bisazza, and Tessa Verhoef, "Compositionality in Emerging Multi-agent Languages: Marrying Language Evolution and Natural Language". In: Proceedings of CogSci 2019: Creativity+Cognition+Computation (ISBN: 0-9911967-7-5), p. 3577, 2019.