Embodied Intelligence Laboratory
Operation of Code
1. Unpack the .zip into whatever directory you like.
2. Open MATLAB and move into that directory.
Now you're good to go. To run the demos, run one of the lca_example scripts (i.e., type lca_example in the command window) from within MATLAB.
There are three example scripts provided. These can be run without knowing anything about the program:
lca_example.m - an example of LCA with whitening. The example develops 256 lobe components using randomly selected 16 by 16 pixel windows from each of 13 grayscale images of nature (located in data\naturalimages). It runs over 1,500,000 samples, so it takes a little while.
lca_example2.m - the same as above, but without whitening, and with fewer samples (500,000).
lca_example3.m - an example of topographic LCA with whitening. Aside from the simulation length (500,000 samples), it is the same as the first example, but the lobe components adjacent to the winning unit in the grid will update along with it.
Usage
The function you want to use is lca.m; "help lca" will give detailed information about all inputs and outputs. You can use the program with your own images or synthetic data in an image-like format: a 3-D matrix where the first two dimensions give pixel row and column, and the third indexes the samples.
For an overview of how the algorithm works, read below. For a more detailed understanding, read the comments in the .m files.
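As a concrete illustration of the image-like format, here is a minimal NumPy sketch (the program itself takes a MATLAB 3-D matrix; the array below is made-up data, not from the package):

```python
import numpy as np

# Made-up data in the image-like format: 16 pixel rows, 16 pixel columns,
# and 100 samples stacked along the third dimension.
data = np.zeros((16, 16, 100))
print(data.shape[2])  # the third dimension gives the number of samples
```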
Description of Code
Stages of the algorithm
1. Sampling - Because MATLAB performs matrix operations far more efficiently than vector operations inside loops, this version of LCA samples all the data first and preprocesses it at once. The window-size input to the program (the winsize parameter) gives the side length of the square sampling region, which is selected randomly from all input images for a certain number of samples (given by the sampletime parameter). Each sample is stored as a column vector in a sampling matrix, and the order of the sample columns is randomized before the next stage.
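The sampling stage can be sketched as follows (a NumPy sketch for illustration, not the MATLAB code itself; sample_windows is a hypothetical helper, and winsize and sampletime follow the parameter names above):

```python
import numpy as np

def sample_windows(images, winsize, sampletime, seed=None):
    """Draw sampletime random winsize-by-winsize windows from a list of 2-D
    grayscale images, store each as a column vector, then shuffle columns."""
    rng = np.random.default_rng(seed)
    cols = np.empty((winsize * winsize, sampletime))
    for i in range(sampletime):
        img = images[rng.integers(len(images))]     # pick an image at random
        r = rng.integers(img.shape[0] - winsize + 1)
        c = rng.integers(img.shape[1] - winsize + 1)
        cols[:, i] = img[r:r + winsize, c:c + winsize].ravel()
    rng.shuffle(cols, axis=1)  # randomize column order before preprocessing
    return cols
```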
2. Preprocessing - In all cases, the local mean is subtracted from each sample. If whitening is to be done (set by the pre parameter), the whitening and dewhitening matrices are computed here. This could be done incrementally using the CCIPCA algorithm, but is not here, again owing to the efficiency of matrix operations versus loops in MATLAB. Finally, the samples are whitened by multiplying the sample matrix by the whitening matrix.
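In batch form, this stage amounts to the following (a NumPy sketch assuming plain batch PCA whitening; lca.m's exact computation may differ):

```python
import numpy as np

def whiten(X):
    """Subtract each sample's local mean, then whiten X (pixels x samples).
    Returns the whitened samples and the whitening/dewhitening matrices."""
    X = X - X.mean(axis=0)                      # local (per-sample) mean removal
    C = X @ X.T / X.shape[1]                    # sample covariance
    evals, V = np.linalg.eigh(C)                # eigendecomposition
    keep = evals > 1e-10 * evals.max()          # discard near-null directions
    W = (V[:, keep] / np.sqrt(evals[keep])).T   # whitening matrix
    D = V[:, keep] * np.sqrt(evals[keep])       # dewhitening matrix
    return W @ X, W, D
```

After this step the whitened samples have unit empirical covariance, so competition among lobe components is not dominated by the input's strongest directions of variance.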
3. Initialization - The lobe components (the number of which is set by the numn parameter) are initialized to the first samples.
4. Loop: competition and updating - For each sample, the response is computed for all lobe components, then sorted. The numwinners lobe components with the largest absolute responses update using amnesic averaging. If the topography parameter is set, the grid neighbors of each updating lobe component also update.
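One pass of the competition-and-update loop can be sketched as follows (NumPy, without the topography option; mu is a hypothetical three-section amnesic schedule, and lca.m's exact weights may differ):

```python
import numpy as np

def mu(n, t1=20, t2=200, c=2.0, r=2000.0):
    """A typical amnesic-average schedule (an assumption, not lca.m's own)."""
    if n < t1:
        return 0.0
    if n < t2:
        return c * (n - t1) / (t2 - t1)
    return c + (n - t2) / r

def lca_step(V, counts, x, numwinners=1):
    """The numwinners lobe components with the largest absolute responses
    to sample x update by amnesic averaging."""
    resp = V.T @ x / np.linalg.norm(V, axis=0)   # responses of all components
    winners = np.argsort(-np.abs(resp))[:numwinners]
    for j in winners:
        counts[j] += 1
        n = counts[j]
        w1 = (n - 1 - mu(n)) / n                 # retention weight
        w2 = (1 + mu(n)) / n                     # learning weight
        V[:, j] = w1 * V[:, j] + w2 * resp[j] * x
    return V, counts
```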
5. Resampling and preprocessing - The same number of samples is collected randomly from the inputs. The local mean is subtracted from each, and, if whitening is enabled, they are all whitened using the old whitening matrix. The whitening matrix is not recomputed, so make sure to set sampletime to a large enough value if you're using whitening. The competition loop then continues for another iteration (the total number of iterations is set by the iteration parameter).
6. Loop completion - The lengths of the lobe components reflect the energy, or variance, of the samples they updated for. For filter weights that relate to the input (i.e., pixels in the range 0-255), the standard deviation is needed, so each vector is divided by the square root of its length.
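That final normalization is a one-liner (NumPy sketch of the step described above):

```python
import numpy as np

def to_filters(V):
    """Each column's length tracks the variance of the samples it averaged;
    dividing by the square root of that length leaves each vector with
    length sqrt(variance), i.e., the standard deviation."""
    return V / np.sqrt(np.linalg.norm(V, axis=0))
```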
The lobe component weights and the number of times each updated are returned as output values. If the display parameter is set, the program will finally display the lobe components as images, along with a histogram of the number of times each updated. The first figure is sorted by number of updates if topography is not used; the second figure is always sorted to display the largest values on the left.
References
J. Weng, N. Zhang and R. Gajakunta, "Distribution Approximation: An In-Place Developmental Algorithm for Sensory Cortices and A Hypothesis," Technical Report MSU-CSE-4-40, Computer Science and Engineering, Michigan State University, East Lansing, Michigan, September 2004.
J. Weng and M. D. Luciw, "Optimal In-Place Self-Organization for Cortical Development: Limited Cells, Sparse Coding and Cortical Topography," in Proc. 5th International Conference on Development and Learning, Bloomington, IN, May 30 - June 3, 2006.
Email inquiries about this page to: luciwmat@cse.msu.edu
Embodied Intelligence Laboratory
Department of Computer Science and Engineering
Michigan State University