What does indexing on a fully connected layer do?
I have been trying to understand this bit of code. I don't quite understand what the slicing operator does to the output of a fully connected layer. Here's the code for context.
def generate_condition(self, embedding):
    conditions = fc(embedding, self.embedd_dim * 2, 'gen_cond/fc', activation_fn=tf.nn.leaky_relu)
    mean = conditions[:, :self.embedd_dim]
    log_sigma = conditions[:, self.embedd_dim:]
    return [mean, log_sigma]
where fc returns:
return tf.contrib.layers.fully_connected(inputs, num_out, activation_fn=activation_fn, weights_initializer=w_init, reuse=reuse, scope=name)
Solution 1:[1]
It simply splits your array along the feature axis. This layer is supposed to produce two things at once: mu and sigma. One way to achieve this is to have a shared layer followed by two separate heads, one for mu and one for sigma. An equivalent way to implement exactly the same thing is to have a single head whose size is the sum of the sizes of mu and sigma, and then split its output back into two tensors. In your case, the first self.embedd_dim features/neurons are interpreted as mean, and the remaining ones as log_sigma.
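The splitting can be illustrated with plain NumPy, since the slice syntax acts on the layer's output tensor exactly as it would on any 2-D array. This is a minimal sketch with made-up numbers (embedd_dim=3, batch size 4), not the original model:

```python
import numpy as np

# Hypothetical batch of 4 examples; the FC layer produced
# 2 * embedd_dim = 6 output features per example.
embedd_dim = 3
conditions = np.arange(24, dtype=np.float32).reshape(4, 6)  # shape (batch, 2*embedd_dim)

# Column slicing splits the single head into two tensors:
mean = conditions[:, :embedd_dim]       # first embedd_dim columns, shape (4, 3)
log_sigma = conditions[:, embedd_dim:]  # remaining columns, shape (4, 3)

# Concatenating the halves recovers the original output,
# confirming the slice is a lossless split, not a transformation.
recovered = np.concatenate([mean, log_sigma], axis=1)
```

Each example's row is simply cut in two, so gradients flow back through both halves into the same weight matrix of the shared layer.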
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | lejlot |
