Self organising map classifying everything to one coordinate

In summary, the conversation is about a person who has coded a Self-Organizing Map (SOM) based on the explanation from the AI-Junkie website. However, the program always classifies every input to the same single coordinate, the last node in the grid (the map's dimensions minus (1,1)), instead of spreading the inputs across the map. The person suspects that the issue lies in the training loop, as the website was unclear about that part, and is asking for help fixing it.
  • #1
NotASmurf
Hey all, I just coded this SOM based on the explanation on the ai-junkie website. It seems to place all the input vectors at one single common coordinate. What am I doing wrong? I suspect it's the training loop, as that was the one part the site was unclear about. Any help appreciated.

Code:
  static void Main(string[] args)
        {
            SOM som = new SOM(3, 5, 2, 0.1, new double[][] { new double[] {0,0,0 }, new double[] {0,0,0 } },0.00000001);
            List<string> lines = new List<string>();
            lines = System.IO.File.ReadAllLines("C:/Food.txt").ToList();
            som.training_data = new double[lines.Count][];
           
            for (int i = 0; i < som.training_data.Length; i++)
            {
                string[] h = lines[i].Split(',');
                som.training_data[i] = new double[h.Length-1];
                for (int j = 1; j < h.Length; j++)
                {
                    som.training_data[i][j - 1] = Convert.ToSingle(h[j]);
                }
            }

            som.Train();
            for (int i = 0; i < som.training_data.Length; i++)
            {
                var best = som.Classify(som.training_data[i]);
                Console.WriteLine(lines[i].Split(',')[0]+" "+best[0]+" "+best[1]);
            }
            Console.ReadLine();
           
        }
    }
    public class MathUtil
    {
        public static double Exp(double map_rad, double t, double lambda)
        {
            return map_rad * Math.Exp(-1 * (t / lambda));
        }
        public static double ExpLearn(double t, double lambda)
        {
            return Math.Exp(-1 * (t / lambda));
        }
        public static double Gaussian(int iteration, double distance, double neigh)
        {
            return Math.Exp(-1 * ((distance * distance) / 2 * neigh * neigh));
        }
    }
    public class Node
    {
        public double[] vec;
        public double error;
        Random r = new Random();
        public Node(int dim)
        {
            vec = new double[dim];
            for (int i = 0; i < dim; i++)
            {
                vec[i] = r.NextDouble();
            }
        }
        public double Distance(double[] vec)
        {
            double sum = 0;
            for (int i = 0; i < vec.Length; i++)
            {
                sum += Math.Pow(this.vec[i] - vec[i], 2);
            }
            return Math.Sqrt(sum);
        }
        public static double Distance(double[] vec,double[] vec2)
        {
            double sum = 0;
            for (int i = 0; i < vec.Length; i++)
            {
                sum += Math.Pow(vec[i] - vec2[i], 2);
            }
            return Math.Sqrt(sum);
        }
        public double UpdateWeight(double[] inp_vec, double learn, int iteration, double neigh)
        {
            double sum = 0;
            for (int i = 0; i < inp_vec.Length; i++)
            {
                double delta = learn * MathUtil.Gaussian(iteration, Distance(inp_vec), neigh) * (vec[i] - inp_vec[i]);
                vec[i] = vec[i] + delta;
                sum += delta;
            }
            error = sum / vec.Length;
            return sum / vec.Length;
        }

        public bool InRad(double radius, int[] pos_win, int[] pos_me)
        {
            double square_sum = 0;
            for (int i = 0; i < pos_me.Length; i++)
            {
                square_sum += Math.Pow(pos_me[i] - pos_win[i], 2);
            }
            if (Math.Sqrt(square_sum) < radius)
            {
                return true;
            }
            return false;
        }
    }

    public class SOM
    {
        public Node[,] nodes;
        public double width;
        public Random r = new Random();
        public double[][] training_data;
        public double height;
        public double Max_Error;
        public double learn_t0;

        public int[] Classify(double[] vec)
        {
            int[] best = new int[2];
            best = GetBMU(vec);
            return best;
        }
        public double GetError()
        {
            double error = 0;
            for (int i = 0; i < nodes.GetLength(0); i++)
            {
                for (int j = 0; j < nodes.GetLength(1); j++)
                {
                    error += nodes[i, j].error;
                }
            }
            return error;
        }
        public double neighboorhood_radius(int iteration)
        {
            return MathUtil.Exp(Map_Radius_t0, iteration, lambda(iteration));
        }
        public double LearnFac(int iteration)
        {
            return learn_t0 * MathUtil.ExpLearn(iteration, lambda(iteration));
        }
        public double lambda(int iteration)
        {
            return iteration / Math.Log(Map_Radius_t0);
        }
        public double Map_Radius_t0
        {
            get
            {
                return Math.Max(width, height) / 2;
            }
        }

        public int[] GetBMU(double[] ran_tr_vec)
        {
            int[] best = new int[2];
            double smallest = double.MaxValue;
            for (int i = 0; i < nodes.GetLength(0); i++)
            {
                for (int j = 0; j < nodes.GetLength(1); j++)
                {
                    if (Node.Distance(nodes[i, j].vec, ran_tr_vec) < smallest)
                    {
                        best[0] = i;
                        best[1] = j;
                    }
                }
            }
            return best;
        }
        public void Train()
        {
            for (int u = 0; u < 10000; u++)//while(GetError()>Max_Error)//
            {
                for (int ind = 0; ind < training_data.Length; ind++)
                {
                    int iter = u;
                    var inp_vec = training_data[ind];//r.Next(0, training_data.Length)];
                    int[] best = GetBMU(inp_vec);
                    #region Update_Weights
                    for (int i = 0; i < nodes.GetLength(0); i++)
                    {
                        for (int j = 0; j < nodes.GetLength(1); j++)
                        {
                            if (nodes[i, j].InRad(neighboorhood_radius(iter), best, new int[] { i, j }))
                            {
                                nodes[i, j].UpdateWeight(inp_vec, LearnFac(iter), iter, neighboorhood_radius(iter));
                            }
                        }
                    }
                    #endregion
                }
                Console.WriteLine(GetError());
            }
        }
        public SOM(int dim, int len, int bredth, double learn, double[][] tr_data, double Max_Error)
        {
            #region ini
            training_data = tr_data;
            learn_t0 = learn;
            width = bredth;
            height = len;
            this.Max_Error = Max_Error;
            nodes = new Node[len, bredth];
            for (int i = 0; i < nodes.GetLength(0); i++)
            {
                for (int j = 0; j < nodes.GetLength(1); j++)
                {
                    nodes[i, j] = new Node(dim);
                }
            }
            #endregion
          
        }
    }
}
 
  • #2
The exact glitch seems to be that it always classifies everything to the map's dimensions minus (1,1): if the map has width 5 and length 6, it classifies everything as (4,5). Any ideas?
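
Judging from the posted code, the most likely cause is GetBMU: smallest starts at double.MaxValue but is never updated inside the loop, so the comparison succeeds for every node and best simply ends up holding the indices of the last node visited, i.e. (len-1, bredth-1). That matches the reported symptom of everything landing on (4,5) for a 5x6 map. A minimal corrected sketch, reusing the names from the post:

Code:
        public int[] GetBMU(double[] ran_tr_vec)
        {
            int[] best = new int[2];
            double smallest = double.MaxValue;
            for (int i = 0; i < nodes.GetLength(0); i++)
            {
                for (int j = 0; j < nodes.GetLength(1); j++)
                {
                    double d = Node.Distance(nodes[i, j].vec, ran_tr_vec);
                    if (d < smallest)
                    {
                        smallest = d;   // remember the best distance found so far
                        best[0] = i;
                        best[1] = j;
                    }
                }
            }
            return best;
        }

Two other spots in the posted code look suspicious and may be worth checking once the BMU search is fixed (observations, not confirmed causes): MathUtil.Gaussian computes distance * distance / 2 * neigh * neigh, which because of operator precedence divides by 2 and then multiplies by neigh squared, whereas the usual Gaussian neighbourhood uses distance² / (2 · neigh²); and UpdateWeight adds learn * theta * (vec[i] - inp_vec[i]), which pushes the weight away from the input, while the standard SOM update uses (inp_vec[i] - vec[i]).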
 

Related to Self organising map classifying everything to one coordinate

1. What is a self-organizing map (SOM)?

A self-organizing map is an artificial neural network algorithm used for unsupervised learning. It is designed to map high-dimensional data onto a lower-dimensional space, usually a 2D grid, while preserving the topological relationship between the data points.

2. How does a self-organizing map classify data?

A self-organizing map uses a competitive learning process to classify data. This means that the neurons in the map compete with each other to be activated based on the input data. The winning neuron then becomes the best representation of that particular input data.
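
As a rough illustration of that competitive step (a generic sketch in the same spirit as the thread's code, not the poster's exact implementation): each neuron holds a weight vector of the same dimensionality as the input, the neuron whose weights are closest to the input wins, and the winner plus its grid neighbours are nudged towards the input.

Code:
    // One competitive-learning step for a SOM (illustrative sketch).
    // weights[r, c] is the weight vector of the neuron at grid position (r, c).
    static void TrainStep(double[,][] weights, double[] input, double learnRate, double radius)
    {
        int rows = weights.GetLength(0), cols = weights.GetLength(1);

        // 1. Competition: find the neuron whose weights are closest to the input.
        int bestR = 0, bestC = 0;
        double bestDist = double.MaxValue;
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
            {
                double d = 0;
                for (int k = 0; k < input.Length; k++)
                    d += (weights[r, c][k] - input[k]) * (weights[r, c][k] - input[k]);
                if (d < bestDist) { bestDist = d; bestR = r; bestC = c; }
            }

        // 2. Cooperation and adaptation: pull the winner and its grid neighbours
        //    towards the input, weighted by a Gaussian of the grid distance.
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
            {
                double gridDistSq = (r - bestR) * (r - bestR) + (c - bestC) * (c - bestC);
                double theta = Math.Exp(-gridDistSq / (2 * radius * radius));
                for (int k = 0; k < input.Length; k++)
                    weights[r, c][k] += learnRate * theta * (input[k] - weights[r, c][k]);
            }
    }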

3. What is meant by "classifying everything to one coordinate"?

In a self-organizing map, the "coordinates" refer to the positions of the neurons in the grid. "Classifying everything to one coordinate" means that the SOM maps every input onto the same single neuron. This can occasionally happen because the data really are nearly identical, but with varied inputs it usually indicates a problem, such as a faulty best-matching-unit search or a learning rate and neighbourhood radius that never shrink.

4. What are the advantages of using a self-organizing map for classification?

One advantage is that SOMs can handle high-dimensional data and reduce it to a lower-dimensional space, making it easier to visualize and analyze. They are also useful for identifying complex patterns and relationships in data that may not be obvious with traditional methods.

5. Are there any limitations or drawbacks to using a self-organizing map for classification?

One limitation is that the SOM algorithm is sensitive to the initial values and can produce different results with different initializations. Additionally, it may struggle with very large or noisy datasets. It also requires careful tuning of parameters and may not always produce meaningful results.
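
One practical note on the initialization point, relevant to the posted code: the Node class creates its own new Random() per node. On the .NET Framework, Random() seeds from the system clock, so nodes constructed within the same clock tick can all receive identical starting weight vectors, leaving the map degenerate from the start. A common workaround (a sketch, not the poster's code) is to share one Random instance across all nodes:

Code:
    public class Node
    {
        public double[] vec;

        // Accept a shared Random so every node gets distinct starting weights.
        public Node(int dim, Random r)
        {
            vec = new double[dim];
            for (int i = 0; i < dim; i++)
                vec[i] = r.NextDouble();
        }
    }

    // In the SOM constructor:
    // var rng = new Random();
    // nodes[i, j] = new Node(dim, rng);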
