Comp304 Practice Quiz



Welcome to the Comp304 Practice Quiz! Test your knowledge on the fascinating world of Self-Organizing Maps (SOMs) and their applications in neural networks. This quiz is designed for students and enthusiasts alike, helping you to solidify your understanding and explore various concepts.

Key Features:

  • 12 challenging questions
  • Covering a range of topics from SOM training to node classification
  • Instant feedback on your answers
12 Questions · 3 Minutes · Created by LearningMap101
What is the capital city of KZN (KwaZulu-Natal)?
Durban
Pietermaritzburg
Richards Bay
Newcastle
KwaNongoma
Which of the following is true regarding training a SOM?
During training, the SOM will classify data according to similar properties through adjusting weights linearly using a Gaussian function
A SOM can use a "supervised" learning approach, applying backpropagation to adjust the weights of every node in the network
In vector quantization, the training set is supervised through a method called learning vector quantization (LVQ) for fine-tuning the SOM
A SOM performs dimensionality reduction and cluster labelling, and also reorganizes instances
A SOM cannot be trained, since it is not a neural network
None of the above
A supervised version of a SOM (if it exists) requires that the vectors used for training have an unknown classification
True
False
Cannot say
Kohonen neural networks and SOMs are not the same thing
True
False
Cannot say
Which of the following is true regarding selecting the BMU?
The input neuron with dimensions closest to the origin (0, 0) when reduced from 3D is the BMU
The input neuron with minimal distance to all of the given nodes is the BMU
The value of the gaussian function for the BMU is always 1
An input neuron with the highest weights is the BMU
The input neuron with the lowest weights is the BMU
The output neuron/node mapped to the 1st input with the smallest weights will always be the BMU
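As a revision aid for the question above: BMU (best matching unit) selection is conventionally the output node whose weight vector has the smallest Euclidean distance to the current input. A minimal sketch (the function name and example values are illustrative, not part of the quiz):

```python
import numpy as np

def find_bmu(weights, x):
    """Return the index of the output node whose weight vector is
    closest (smallest Euclidean distance) to the input vector x."""
    dists = np.linalg.norm(weights - x, axis=1)
    return int(np.argmin(dists))

# Three output nodes with 2-D weight vectors; node 1 is nearest to x.
weights = np.array([[0.1, 0.9], [0.8, 0.2], [0.5, 0.5]])
x = np.array([0.9, 0.1])
print(find_bmu(weights, x))  # → 1
```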
Only the tanh activation function can be applied without a bias
True
False
Cannot say
A neuron and a node are different entities
True
False
Cannot say
Which of the following is not false regarding a SOM neural network?
If node A and node B have 150 neurons between them, node A would still be able to offer feedback to Node B as long as they are within the same neural grid/lattice
Feedforward networks and SOM treat the concept of neighbourhood the same way
Using random weights for SOM outputs is not applicable; a more structured approach of calculating weights must always be implemented from the very beginning
An infinite number of iterations is acceptable for training an SOM with a large number of input and output neurons
None of the above
Which of the following is true regarding the degree of training each output node gets?
The BMU does not receive any training since it is the standard every other node is trying to match up to
The nodes furthest from the BMU receive the most training, since they are not as effective in mapping inputs
The gaussian function penalizes nodes that are 10 nodes away from the BMU
A Mexican hat function can be used to calculate the degree of training each node gets
None of the above
Cannot say
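For revision: a Gaussian neighbourhood function evaluates to 1 at the BMU itself and decays with lattice distance, so nodes nearer the BMU receive more of the weight update. A minimal sketch assuming the standard Gaussian form (function name and parameter values are illustrative):

```python
import math

def neighbourhood(dist_to_bmu, sigma):
    """Gaussian neighbourhood influence: 1.0 at the BMU itself,
    decaying with grid distance; sigma is the current radius."""
    return math.exp(-(dist_to_bmu ** 2) / (2 * sigma ** 2))

print(neighbourhood(0, 2.0))  # → 1.0 (the BMU gets full training)
print(neighbourhood(2, 2.0))  # smaller: neighbours learn less
```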
The neighbours of the BMU always learn more from the training pattern, since the value of their Gaussian function is always higher
True
False
Cannot say
Which of the following is not false regarding the learning rate of a SOM?
Increases monotonically with each iteration
Will always be 0.5 for maximum effectiveness on forming clusters of the SOM
When training an SOM it is always important to make sure a learning rate is always reduced to 0.02 for nodes with similar weights and 0.5 for the rest
The aim of training the SOM is to make the node weights exactly the same as the training elements/inputs by using the highest possible learning rate
When the learning rate is approximately zero, it is OK to stop the training process
Cannot say
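For revision: SOM learning rates typically start at some initial value and decrease monotonically over iterations, with training stopped once the rate is near zero. A minimal sketch of one common choice, exponential decay (the initial rate 0.5 and time constant here are illustrative assumptions):

```python
import math

def learning_rate(t, lr0=0.5, tau=1000.0):
    """Exponentially decaying learning rate: starts at lr0 and
    approaches zero as the iteration count t grows."""
    return lr0 * math.exp(-t / tau)

print(learning_rate(0))       # → 0.5 at the first iteration
print(learning_rate(10000))   # near zero: training can stop
```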
The algorithm for training a SOM includes increasing the radius for the neighbourhood
True
False
Cannot say
When using Learning Vector Quantisation (LVQ), weights for the winner are calculated the same way for correctly classified inputs as for misclassified inputs
True
False
Cannot say
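For revision: in the basic LVQ1 update rule, the winner's weight vector is moved toward the input when the classification is correct and away from it when it is wrong, so the two cases differ only in the sign of the update. A minimal sketch (function name and values are illustrative):

```python
import numpy as np

def lvq_update(w, x, correct, lr=0.1):
    """LVQ1 winner update: move toward x if classified correctly,
    away from x if misclassified (the update's sign flips)."""
    return w + lr * (x - w) if correct else w - lr * (x - w)

w = np.array([0.5, 0.5])
x = np.array([1.0, 0.0])
print(lvq_update(w, x, correct=True))   # moves toward x
print(lvq_update(w, x, correct=False))  # moves away from x
```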