
I wrote a back-propagation neural network in C# and it runs slowly [modified]

karayel_kara (#1) wrote:

Hi, I wrote a neural network in C#. It has 4 layers, with a different number of neurons in each layer, and 5 inputs and outputs. I have read many neural network codes and they run very fast, but mine is very slow. When I use 400 patterns for input and 100,000 iterations, I wait a very long time. Can you help me solve this problem? I look forward to your recommendations. Best. Here is the code. Main block:

    double err = 10, mmse;
    int ss = data.GetLength(0);
    snetwork(0);
    initializes();
    mmse = 0;
    double mser = 0.0;
    for (i = 0; i < iterasyon; i++)
    {
        mser = 0.0;
        for (int xx = 0; xx < data.GetLength(0); xx++)
        {
            ffw(xx);                   // feed pattern xx forward
            backpropagate(xx);         // back-propagate its error
            mser += mse(nb_cikis, xx); // accumulate the squared error
        }

        mmse = mser / data.GetLength(0);
        if (mmse < 0.00001) break;     // stop once the mean error is small enough
    }

    ffw_egitimout();

Back-propagation of the error:

    void backpropagate(int r)
    {
        double sum;
        // Compute the delta for the last (output) layer.
        for (int j = 0; j < networks.layers[networks.layers.Length - 1].noron.Length; j++)
        {
            networks.layers[networks.layers.Length - 1].noron[j].hata =
                nb_cikis[r, j] - networks.layers[networks.layers.Length - 1].noron[j].output;
            networks.layers[networks.layers.Length - 1].noron[j].delta =
                networks.layers[networks.layers.Length - 1].noron[j].output
                * (1 - networks.layers[networks.layers.Length - 1].noron[j].output)
                * networks.layers[networks.layers.Length - 1].noron[j].hata; // the last layer's delta is computed here...
        }

        // Find the delta for the hidden layers.
        for (int i = networks.layers.Length - 2; i > 0; i--)
        {
            for (int j = 0; j < networks.layers[i].noron.Length; j++)
            {
                sum = 0.0;
                for (int k = 0; k < networks.layers[i].noron[j].sbaglanti.Length; k++)
                {
                    sum += networks.layers[laybul(networks.layers[i].noron[j].sbaglanti[k])].noron[nronbul(networks.layers[i].noron[j].sbaglanti[k])].delta
                         * networks.layers[laybul(networks.layers[i].noron[j].sbaglanti[k])].noron[nronbul(networks.layers[i].noron[j].sbaglanti[k])].dentw[k];
                }
                networks.layers[i].noron[j].delta =
                    networks.layers[i].noron[j].output * (1 - networks.layers[i].noron[j].output) * sum;
            }
        }

        // apply momentum ( does n
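
One immediately visible cost in the hidden-layer loop above: `laybul(...)` and `nronbul(...)` are each called twice per inner iteration with the same argument, on top of the repeated `networks.layers[i].noron[j]` indexing. A minimal sketch of the same loop with those lookups hoisted into locals (assuming the `noron` elements are reference types, so writes through the local stick):

    // The same hidden-layer delta loop, with the repeated lookups cached.
    for (int i = networks.layers.Length - 2; i > 0; i--)
    {
        for (int j = 0; j < networks.layers[i].noron.Length; j++)
        {
            var noron = networks.layers[i].noron[j]; // fetch the neuron once
            double sum = 0.0;
            for (int k = 0; k < noron.sbaglanti.Length; k++)
            {
                int conn = noron.sbaglanti[k];
                // One laybul()/nronbul() call each, instead of two apiece.
                var source = networks.layers[laybul(conn)].noron[nronbul(conn)];
                sum += source.delta * source.dentw[k];
            }
            noron.delta = noron.output * (1 - noron.output) * sum;
        }
    }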
    
Not Active (#2) wrote:

If you are going to post code, format it properly by using the pre tags.


      I know the language. I've read a book. - _Madmatt

Khaniya (#3) wrote:

Would you read code this long, with this kind of formatting, if someone provided it to you? Think about it yourself and decide. Your time begins now.

        Life's Like a mirror. Smile at it & it smiles back at you.- P Pilgrim So Smile Please

Abhinav S (#4) wrote:

It's hard to go through your complete program, but it looks like you are running just too many loops (some of them nested). I would start by looking to refactor some of your logic to avoid running so many loops. Also try to use break; to get out of a loop once its work is complete.
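
As a small illustration of the kind of refactoring meant here, the output-layer loop in the original post indexes `networks.layers[networks.layers.Length - 1]` six times per iteration; caching the layer and the neuron once removes most of that work. A sketch against the posted code (assuming the `noron` elements are reference types):

    // Output-layer delta loop from the post, with the layer and neuron cached.
    var last = networks.layers[networks.layers.Length - 1];
    for (int j = 0; j < last.noron.Length; j++)
    {
        var n = last.noron[j];
        n.hata = nb_cikis[r, j] - n.output;           // error = target - output
        n.delta = n.output * (1 - n.output) * n.hata; // sigmoid derivative * error
    }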

          The funniest thing about this particular signature is that by the time you realise it doesn't say anything it's too late to stop reading it. My latest tip/trick Visit the Hindi forum here.

Keith Barrow (#5) wrote:

Are you still struggling with this? :-) Please format your code; as you can see it is difficult to read, and you are being down-voted.

Point one: nested loops = poor performance [generally]. Your class structure is wrong, IMO. Each Neuron should have a List<Dendrite> for both input and output. A Dendrite should have both an input and an output Neuron, as well as any weighting. You might need to put deltas in there too, I can't remember. There needs to be a special subclass for the Input Layer and the Output Layer, as these don't have input neurons and output neurons respectively. Set the activation inputs, and calculate the activations on each layer in succession. The calculation is easy:

    // Pseudocode - in Neuron
    InputActivation = 0.0;
    foreach (Dendrite dendrite in InputDendrites)
        InputActivation += dendrite.Activation;
    Activation = Squash(InputActivation); // Squash maps the sum into range 0..1, -1..1, or whatever

Note that I have the Dendrite class calculating the activation from the input, "Weight * InputActivation". The backprop works in a similar way (but in reverse!): the desired output is placed onto a property on the output layer, and then each layer is worked back to the input. I really can't remember the ins and outs of this, but you only really need to loop over each neuron in each layer to do this, plus the dendrites.

One advantage of generic Lists over arrays (if you are having a speed problem) is that you can trim "dead" connections (ones whose weights are near zero and have had little momentum over multiple tries). Another suggestion is to define halting criteria, which say when the NNW is getting things right enough to stop learning; you can over-train a NNW.

One final thing: the OO encapsulation is poor, IMO. The calculation code (as I have suggested above) should be in the Neurons, not in the main code block. This goes for the backprop calcs too. If you do this, you won't have so many array calls confusing the mix, and you'll be able to profile your app a little better.

[Edit] My class structure probably isn't the best; I last wrote a NNW ~10 years ago as an undergrad, in C++, but hopefully you'll get the idea. Did you check out the NNW article on CP I mentioned in your previous thread?
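
A rough C# outline of the structure described above, for reference; this is a sketch only, and the names and details are illustrative, not the poster's actual design:

    // Illustrative sketch of the Neuron/Dendrite structure suggested above.
    using System;
    using System.Collections.Generic;

    class Dendrite
    {
        public Neuron Input;  // upstream neuron
        public Neuron Output; // downstream neuron
        public double Weight;
        public double Delta;  // may be needed for backprop/momentum

        // The dendrite calculates its own contribution, as suggested above.
        public double Activation => Weight * Input.Activation;
    }

    class Neuron
    {
        public List<Dendrite> InputDendrites = new List<Dendrite>();
        public List<Dendrite> OutputDendrites = new List<Dendrite>();
        public double Activation;

        public void ComputeActivation()
        {
            double inputActivation = 0.0;
            foreach (Dendrite d in InputDendrites)
                inputActivation += d.Activation;
            Activation = Squash(inputActivation);
        }

        // e.g. the logistic sigmoid, squashing into the range 0..1
        static double Squash(double x) => 1.0 / (1.0 + Math.Exp(-x));
    }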

Sort of a cross between Lawrence of Arabia and Dilbert.

karayel_kara (#6) wrote:

Hi everybody. First, thanks for the replies. I wonder, is there any solution for the nested loops in an ANN? Can you explain with an example? My project is a back-propagation neural network. If I change my class structure to make it run fast, will the program run fast?
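
For reference, one common answer to the nested-loop question is to drop the per-connection `laybul()`/`nronbul()` lookups entirely and store each fully-connected layer as plain arrays, so the backward pass becomes tight loops over contiguous data. A minimal sketch, with hypothetical names (not from the posted code):

    // output[layer][neuron], delta[layer][neuron],
    // weight[layer][from][to] connecting layer -> layer + 1.
    double[][] output;
    double[][] delta;
    double[][][] weight;

    void BackpropagateHidden()
    {
        for (int layer = output.Length - 2; layer > 0; layer--)
        {
            for (int j = 0; j < output[layer].Length; j++)
            {
                double sum = 0.0;
                for (int k = 0; k < output[layer + 1].Length; k++)
                    sum += delta[layer + 1][k] * weight[layer][j][k];
                // sigmoid derivative * weighted sum of downstream deltas
                delta[layer][j] = output[layer][j] * (1 - output[layer][j]) * sum;
            }
        }
    }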
