Thus, the L1 norm enforces sparseness, and the corresponding neural representation is sparse and overcomplete. It was shown that a recurrent network of inhibitory neurons can implement sparse overcomplete representations (Rozell et al., 2008). To show this, the network dynamics are described as minimization of a cost function called the Lyapunov function, in analogy with Hopfield networks (Hertz et al., 1991 and Hopfield, 1982). Hopfield
networks have attractor states that store memories of activation patterns. In contrast to Hopfield networks, in purely inhibitory networks the recurrent weights enter the Lyapunov function with a minus sign, which abolishes the attractor memory states and makes the network purely sensory (Rozell et al., 2008). Minimization of the Lyapunov function in realistic recurrent
networks with inhibition was suggested as a means to implement the parsimony constraint (L1) mentioned above. To implement sparse overcomplete representations with realistic networks of neurons, two requirements have to be met (Rozell et al., 2008). First, the feedforward weights between the input layer of the network and the inhibitory neurons have to contain the dictionary elements (Figure 8A). This ensures that inhibitory neurons representing a particular dictionary element will be driven strongly when it is present in the input, due to a high overlap between the stimulus and the feedforward weights. Second, the recurrent inhibitory weight between any pair of neurons has to be proportional to the overlap between their dictionary elements (Figure 8A). This feature implies that similarly tuned inhibitory neurons compete more strongly. Therefore, the two types of network weights, feedforward and recurrent, have to closely match each other: the recurrent weights are constructed as overlaps of the feedforward weights. Here, we suggest that the olfactory bulb network architecture based on dendrodendritic synapses can ensure that the feedforward and recurrent connectivity are closely matched. In the architecture
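The two requirements, and the resulting descent of the Lyapunov function, can be sketched in NumPy along the lines of the locally competitive dynamics of Rozell et al. (2008). The dictionary, dimensions, threshold, and time constant below are hypothetical, chosen only to illustrate the structure of the network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overcomplete dictionary: 4 elements for a 3-dimensional input.
# Columns of Phi are the dictionary elements, i.e., the feedforward weights
# of the inhibitory units (requirement 1).
Phi = rng.normal(size=(3, 4))
Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm dictionary elements

# Recurrent inhibition proportional to the overlap between dictionary
# elements (requirement 2); no self-inhibition on the diagonal.
G = Phi.T @ Phi - np.eye(4)

def soft_threshold(u, lam):
    """Shrinkage nonlinearity corresponding to the L1 (sparseness) cost."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(s, lam=0.1, tau=10.0, dt=0.1, steps=2000):
    """Descend the Lyapunov function
    E = 0.5*||s - Phi a||^2 + lam*||a||_1 via mutual inhibition."""
    b = Phi.T @ s                     # feedforward drive: overlap with stimulus
    u = np.zeros(Phi.shape[1])        # internal states of the inhibitory units
    for _ in range(steps):
        a = soft_threshold(u, lam)
        u += (dt / tau) * (b - u - G @ a)   # competition through inhibition
    return soft_threshold(u, lam)

# A stimulus matching dictionary element 1 should activate mainly unit 1.
s = 1.5 * Phi[:, 1]
a = lca(s)
```

Because the recurrent weights are the overlaps of the feedforward weights, units whose dictionary elements resemble the winning element are suppressed, leaving a sparse code that still reconstructs the stimulus.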
based on dendrodendritic synapses, both the feedforward weights received by the GCs and their recurrent connections depend on the same set of synapses. Similar architectures have been proposed for analysis-synthesis networks (Mumford, 1994 and Olshausen and Field, 1997). The GCs of the olfactory bulb receive excitatory inputs from the MCs through dendrodendritic synapses (Shepherd et al., 2004). These synapses encode patterns that can strongly drive individual GCs. The effective connectivity between GCs is inhibitory (GC-to-MC and MC-to-GC synapses are inhibitory and excitatory, respectively). To calculate the strength of mutual inhibition, one has to sum over the intermediate synapses, which amounts to evaluating a convolution, or overlap, between GC input weights (Figure 8B).
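The sum over intermediate synapses can be made concrete with a small sketch. Assuming a single hypothetical weight matrix `W` for the dendrodendritic synapses (the same synapses carry MC-to-GC excitation and GC-to-MC inhibition), the effective GC-to-GC inhibition is the overlap of the two cells' input weight vectors; the sizes below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

n_mc, n_gc = 5, 3
# Hypothetical dendrodendritic synaptic weights between mitral cells (MCs,
# rows) and granule cells (GCs, columns); one matrix describes both the
# excitatory MC->GC and inhibitory GC->MC directions.
W = rng.uniform(size=(n_mc, n_gc))

# Effective inhibition between every pair of GCs: sum over the intermediate
# MC synapses, i.e., the Gram matrix of the GC input weight vectors.
L = W.T @ W

# Entry (i, j) is exactly the overlap of the input weights of GC i and GC j.
i, j = 0, 2
overlap = W[:, i] @ W[:, j]
```

This is the sense in which the dendrodendritic architecture automatically matches recurrent inhibition to the overlap of feedforward weights: both are read off the same matrix `W`.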