Code Project - General Programming - C#

Syntax Advice

13 Posts, 7 Posters
bobsugar222 wrote:

    foreach (DataRow row in DS.Tables[0])
    {
        x1.Text = row["test"].ToString();
    }

Christian Graus (#3):

    Shouldn't that be DS.Tables[0].Rows ??

    Christian Graus - C++ MVP "Why don't we jump on a fad that hasn't already been widely discredited?" - Dilbert
bobsugar222 (#4), replying to Christian Graus:

    Yes... I did that the other day too. :sigh:
dabuskol wrote:

    Hi, could anyone please help me to upgrade my syntax in C#?

        for (int x = 0; x <= DSData.Tables[0].Rows.Count; x++)
        {
            x1.Text = DSData.Tables[0].Rows[x]["test"].ToString();
        }

        dataset DS;
        foreach (________ in DS)
        {
        }

    ??

    Dabsukol

andre_swnpl (#5):

    Personally I would not do the "upgrade" in the first place, as your code will actually run slower. Unless you have a compelling reason to do so, rather do the following:

        int rowCount = DSData.Tables[0].Rows.Count;
        for (int x = 0; x < rowCount; x++)
        {
            x1.Text = DSData.Tables[0].Rows[x]["test"].ToString();
        }
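For completeness, the foreach form dabuskol asked about would look like this (a sketch; DSData, x1, and the "test" column are the poster's names, and the DataSet is populated here only so the snippet stands alone):

```csharp
using System.Data;

// Build a small stand-in for the poster's DSData
// (assumption: one table with a "test" column).
DataSet DSData = new DataSet();
DataTable table = new DataTable();
table.Columns.Add("test");
table.Rows.Add("value1");
table.Rows.Add("value2");
DSData.Tables.Add(table);

// The foreach equivalent of the question's for loop: iterate the Rows
// collection (note .Rows, as post #3 points out) instead of indexing by x.
string text = "";
foreach (DataRow row in DSData.Tables[0].Rows)
{
    text = row["test"].ToString();   // plays the role of x1.Text in the original
}
```

As in the original loop, only the last row's value ends up in `text`; presumably the real code does more per row.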
bobsugar222 (#6), replying to andre_swnpl:
My test says otherwise:

    DataTable dt = new DataTable("table1");
    DataColumn dc = new DataColumn("col1");
    dt.Columns.Add(dc);

    for (int i = 1; i < 1000000; i++)
    {
        DataRow row = dt.NewRow();
        row["col1"] = "string" + i.ToString();
        dt.Rows.Add(row);
    }

    long start = DateTime.Now.Ticks;

    for (int i = 0; i < dt.Rows.Count; i++)
    {
        string s = (string)dt.Rows[i]["col1"];
    }

    long middle = DateTime.Now.Ticks;

    foreach (DataRow r in dt.Rows)
    {
        string s = (string)r["col1"];
    }

    long end = DateTime.Now.Ticks;

    TimeSpan forLength = new TimeSpan(middle - start);
    TimeSpan foreachLength = new TimeSpan(end - middle);

    Console.WriteLine("for = " + forLength.ToString());
    Console.WriteLine("foreach = " + foreachLength.ToString());

Output:

    for = 00:00:00.9079668
    foreach = 00:00:00.2661282

But in the scheme of things, it takes less than a second to process 1,000,000 records either way, which is fine by me.

Martin 0 (#7), replying to andre_swnpl:

    Hello,

    andre_swnpl wrote:
    > as your code will actually run slower

    Why do you think that?

    andre_swnpl wrote:
    > Rows[x]

    If you really save time with the for instead of the foreach, then you will lose it here again, I think!

    All the best,
    Martin

andre_swnpl (#8), replying to bobsugar222:
              I stand corrected then :) I was going according to the MSDN Documentation on enumeration overhead. Thanks for the info.

Luc Pattyn (#9), replying to bobsugar222:
I would suggest you enclose all of that in another for loop that performs two or more passes, just to get rid of effects such as data cache hits. Any first-pass measurement is typically unreliable. :)

                Luc Pattyn

bobsugar222 (#10), replying to Luc Pattyn:
Good idea:

    pass 1: for = 00:00:00.0312510   foreach = 00:00:00.0156255
    pass 2: for = 00:00:00.0625020   foreach = 00:00:00.0312510
    pass 3: for = 00:00:00.0937530   foreach = 00:00:00.0312510
    pass 4: for = 00:00:00.1562550   foreach = 00:00:00.0468765

Luc Pattyn (#11), replying to bobsugar222:
Well, even these measurements are not trustworthy: your system timer seems to tick in increments of 15.6 msec, so that is also the margin of error. I now suggest you redo it this way:

    int maxIter = 100;   // whatever value makes it take at least 500 msec
    for (int pass = 0; pass < 4; pass++)
    {
        // start timer (or read DateTime.Now.Ticks)
        for (int iter = 0; iter < maxIter; iter++)
        {
            // code to be timed
        }
        // stop timer

        // start timer
        for (int iter = 0; iter < maxIter; iter++)
        {
            // alternative code to be timed
        }
        // stop timer
    }

                    :)

                    Luc Pattyn
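Luc's outline can be fleshed out with System.Diagnostics.Stopwatch, which sidesteps the ~15.6 msec DateTime tick granularity he mentions (a sketch; the timed bodies are placeholders for the for/foreach variants under test):

```csharp
using System;
using System.Diagnostics;

int maxIter = 100;   // pick a value that makes each timed section take at least ~500 msec
TimeSpan forTime = TimeSpan.Zero, foreachTime = TimeSpan.Zero;

for (int pass = 0; pass < 4; pass++)
{
    var sw = Stopwatch.StartNew();      // high-resolution timer
    for (int iter = 0; iter < maxIter; iter++)
    {
        // code to be timed (e.g. the indexed for loop)
    }
    sw.Stop();
    forTime = sw.Elapsed;

    sw = Stopwatch.StartNew();
    for (int iter = 0; iter < maxIter; iter++)
    {
        // alternative code to be timed (e.g. the foreach loop)
    }
    sw.Stop();
    foreachTime = sw.Elapsed;

    Console.WriteLine("pass " + pass + ": for = " + forTime + "  foreach = " + foreachTime);
}
```

Discarding at least the first pass, as Luc suggests, keeps cache-warming effects out of the comparison.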

ShermansLagoon (#12), replying to Luc Pattyn:

Just for fun I did the test 1000 times and got the following results:

    for:     00:12:08.2600974
    foreach: 00:03:40.6635750

Note that this is the total time for 1000 runs; the result seems clear, though.

Internet - the world's biggest dictionary

Luc Pattyn (#13), replying to ShermansLagoon:
Go with foreach! :)

                        Luc Pattyn
