Code Project > General Programming > C / C++ / MFC

[Message Deleted]
13 Posts, 6 Posters

This topic has been deleted. Only users with topic management privileges can see it.
  • Ric Ashton (#4)
    In reply to his own earlier post: "You need to state exactly what type of format or organising structure you are trying to achieve."

    Are you trying to just get rid of the duplicated record?
  • Davitor (#5)
    In reply to Ric Ashton: "You need to state exactly what type of format or organising structure you are trying to achieve."

    I need to remove the duplicate values.
  • Davitor (#6)
    In reply to Ric Ashton: "Are you trying to just get rid of the duplicated record?"

    That's my problem: I don't know which records to get rid of, or what is wanted here. If you have an idea, please help me.
  • Maximilien (#7)
    In reply to Davitor's [Message Deleted] post.

    1. Create a structure (or class) that contains the fields for each record, and create/use a collection that holds your records.
    2. In a loop:
    3. Read one line of the file; since you know the format, you can recognise unwanted lines.
    4. Skip the line if it is unwanted. If the line contains (or is) "============" and it is the first such line, you know a new record starts.
    5. Read the next few lines to fill a record.
    6. Look in your collection to see whether the new record is already there; if it is not, add it to the collection.
    7. Once you have read all the lines,
    8. write the data back line by line into a new file.

    (Steps 5 and 6 can be optimized so that you do not create a new record until you have checked that the item is not already in the collection.)

    This signature was proudly tested on animals.
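    The steps above can be sketched in C++. This is only a sketch under assumptions: the real file layout was in the deleted message, so treating "============" as the record separator and deduplicating on the whole record body are guesses, and the stream and file contents here are made up.

    ```cpp
    #include <iostream>
    #include <set>
    #include <sstream>
    #include <string>

    // Copy records from `in` to `out`, keeping only the first copy of each.
    // A record is assumed to be the lines between "============" separators.
    void dedupe(std::istream& in, std::ostream& out) {
        std::set<std::string> seen;   // record bodies already written
        std::string line, record;
        auto flush = [&]() {
            // insert().second is true only the first time this body is seen
            if (!record.empty() && seen.insert(record).second)
                out << "============\n" << record;
            record.clear();
        };
        while (std::getline(in, line)) {
            if (line.rfind("============", 0) == 0)  // separator: record boundary
                flush();
            else
                record += line + "\n";               // accumulate the record body
        }
        flush();  // emit the final record after EOF
    }

    int main() {
        std::istringstream file(
            "============\nname: foo\nvalue: 1\n"
            "============\nname: foo\nvalue: 1\n"
            "============\nname: bar\nvalue: 2\n");
        dedupe(file, std::cout);  // the duplicated "foo" record is written once
    }
    ```

    Using a `std::set` keyed on the whole record body folds steps 5 and 6 together: the lookup and the insert happen in a single `insert()` call.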

  • Eytukan (#8)
    In reply to Davitor: "I need to remove the duplicate values."

    You can use the "std::unique" algorithm, but you will have to design your own class with the operator overloads needed to support it. Example[^]

    He never answers anyone who replies to him. I've taken to calling him a retard, which is not fair to retards everywhere. - Christian Graus
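    A minimal sketch of the `std::unique` route: `std::unique` only removes *adjacent* duplicates, so the records must be sorted first, which means the class needs `operator<` as well as `operator==`. The `Record` fields here are invented, since the real record layout was in the deleted post.

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <tuple>
    #include <vector>

    struct Record {
        std::string name;  // invented fields; the real record layout is unknown
        int value;
        // std::unique needs equality; std::sort needs an ordering
        bool operator==(const Record& o) const {
            return name == o.name && value == o.value;
        }
        bool operator<(const Record& o) const {
            return std::tie(name, value) < std::tie(o.name, o.value);
        }
    };

    int main() {
        std::vector<Record> recs = {{"foo", 1}, {"bar", 2}, {"foo", 1}};
        std::sort(recs.begin(), recs.end());  // unique removes only adjacent duplicates
        recs.erase(std::unique(recs.begin(), recs.end()), recs.end());
        for (const auto& r : recs)
            std::cout << r.name << ' ' << r.value << '\n';  // prints: bar 2, foo 1
    }
    ```

    The `erase(unique(...), end())` pairing is the usual idiom: `std::unique` only shuffles the kept elements to the front and returns the new logical end, so `erase` is needed to actually shrink the vector.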
  • Ric Ashton (#9)
    In reply to Davitor: "I need to remove the duplicate values."

    Then, in any environment, you need to qualify what you wish to discard. Place a field (one that you specify) into a list box and disallow duplication of a matching field. If the field (or database component) is new, allow the record to be copied to a new file. At the end, after testing, simply replace your old file with the new one. I'm assuming you're running a few thousand records.
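    This copy-only-new-records approach can be sketched as follows. The one-record-per-line layout with the key before the first comma is an assumption, and a `std::set` of seen keys stands in for the list box mentioned above.

    ```cpp
    #include <iostream>
    #include <set>
    #include <sstream>
    #include <string>

    // Copy each line of `oldFile` to `newFile`, skipping lines whose key
    // (assumed: the text before the first comma) has been seen already.
    void copy_unique(std::istream& oldFile, std::ostream& newFile) {
        std::set<std::string> seenKeys;        // stands in for the "list box"
        std::string line;
        while (std::getline(oldFile, line)) {
            std::string key = line.substr(0, line.find(','));
            if (seenKeys.insert(key).second)   // true only for a new key
                newFile << line << '\n';       // copy the record to the new file
        }
    }

    int main() {
        std::istringstream oldFile("id1,alpha\nid2,beta\nid1,alpha again\n");
        copy_unique(oldFile, std::cout);  // the second id1 record is dropped
    }
    ```

    For a few thousand records a `std::set` lookup per line is more than fast enough; the final "replace the old file with the new one" step would be a rename once the output has been checked.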
  • Davitor (#10)
    In reply to Maximilien's step-by-step answer above.

    Thank you, sir, your logic sounds good. Can you give me some dummy code? Please help me.
  • Maximilien (#11)
    In reply to Davitor: "Can you give me some dummy code?"

    I will not give you code (dummy or not); I want you to try to come up with a solution based on the suggestions we have given you. Once you have tried one, two, or even more solutions to your problem (which sounds like homework) and you still have problems, then we will gladly help with the precise problems you still have. It can also be easier to ask questions of your teachers, teaching assistants, computer-lab assistants, or even fellow students.

    This signature was proudly tested on animals.
  • Richard Andrew x64 (#12)

    Ric Ashton wrote:

    > I'm assuming you're running a few thousand records.

    Don't be so generous. This is obviously a homework assignment.
  • David Crow (#13)
    In reply to Davitor's [Message Deleted] post.

    The first order of business would be to read the data into some sort of data structure. Until you can get that far, eliminating duplicates is irrelevant.

    "Old age is like a bank account. You withdraw later in life what you have deposited along the way." - Unknown

    "Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons