
An iterated learning model of language change that mixes supervised and unsupervised learning

Fig 1

A guide to the notation. A: A meaning is an ordered sequence of n facts, and a signal is an ordered sequence of n words, with n = 6 in this example. The use of "word" is potentially confusing, since a word is sometimes thought of as a sequence of letters, but here it is an indivisible component of the signal. A word corresponds to a single bit, and a signal can be thought of as corresponding to a phrase; in the same way, a meaning can be thought of as corresponding to a state of the world, composed of a set of facts. B: An example n = 3 language. This is much smaller than the n values actually simulated here, but it is convenient for illustration. The example is fully expressive: every possible meaning maps to a different signal; and it is fully compositional: each fact fully determines a unique word. For illustrative convenience, the fact and word elements involved in the last of the three fact-word equivalences have been colored orange. C: The decoder map d maps the signal (1, 0, …, 1) to the meaning (0, 1, …, 1); the decoder is made up of two parts: the neural network, with one hidden layer the same size as the input and output, and the decision map δ, which maps probabilities to zero or one.

doi: https://doi.org/10.1371/journal.pcsy.0000030.g001
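The mappings in panels B and C are straightforward to state in code. Below is a minimal Python sketch, not the paper's implementation: the per-position fact-to-word rules for the n = 3 language are invented for illustration, and the decoder uses assumed sigmoid activations, random placeholder weights, and a 0.5 threshold for the decision map δ. Only the layer shapes (one hidden layer the same width as the input and output) come from the caption.

```python
import itertools
import numpy as np

n = 3

# Panel B: a fully compositional language, stated as one fixed bit rule per
# position. The specific rules here (identity, identity, flip) are an
# illustrative assumption, not the paper's example.
rules = [lambda f: f, lambda f: f, lambda f: 1 - f]

def encode(meaning):
    """Map a meaning (tuple of fact bits) to a signal (tuple of word bits)."""
    return tuple(rule(f) for rule, f in zip(rules, meaning))

# Full expressivity: every one of the 2**n meanings maps to a distinct signal.
meanings = list(itertools.product([0, 1], repeat=n))
assert len({encode(m) for m in meanings}) == 2 ** n  # injective, hence expressive

# Panel C: the decoder d = delta composed with a one-hidden-layer network.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.5, size=(n, n)), np.zeros(n)  # input -> hidden
W2, b2 = rng.normal(scale=0.5, size=(n, n)), np.zeros(n)  # hidden -> output

def decode(signal):
    """Map a signal to a meaning: forward pass, then the decision map delta."""
    s = np.asarray(signal, dtype=float)
    hidden = sigmoid(s @ W1 + b1)
    probs = sigmoid(hidden @ W2 + b2)           # one probability per fact
    return tuple(int(p >= 0.5) for p in probs)  # delta: round to 0 or 1

print(encode((0, 1, 1)))  # (0, 1, 0) under the illustrative rules
print(decode((1, 0, 1)))  # some meaning in {0, 1}^n; weights are untrained
```

With trained weights the network's output probabilities would concentrate near 0 or 1, so δ simply snaps each component to the nearest bit; with the random placeholder weights above, the decoded meaning is arbitrary but the pipeline is the same.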