Best Database Configuration with a C# WinForms App

agent_kruger wrote:

yes, I mean store and search. As my work is a little tricky, it has to search through 100 million records every hour.

Jorgen Andersson (#25):

Do you have to manipulate ALL of those 100 million records every hour, or just some, and if so, what percentage? And what kind of reporting do you need to do, on how many records?

The fact is that it's not so important which DB you're using as long as you're just doing CRUD. The size of the database also isn't very important for performance (we're talking about a factor of a couple of hundred times larger for every new level in an index, assuming a B-tree index). What matters is the number of transactions, and how many records are affected by the queries.

So the limiting factors will be the hardware (mostly the hard drives), the configuration of the hardware (it doesn't matter if the drives are fast if they're in a RAID 5), and the indexes. If you have no indexes at all, all inserts will be ridiculously fast, but querying or updating will be just as ridiculously slow. Adding every index you can think of will instead make both inserts and updates ridiculously slow, while the queries will be faster. Note that updates need to both read and write. Finding the right indexes will give you a good balance.
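As a rough sketch of that trade-off (assuming SQL Server and a hypothetical dbo.Records table with a SearchKey column; the names are only illustrative, not from this thread), a single targeted nonclustered index turns the hourly lookups into index seeks while limiting the extra write cost to one index:

    using System;
    using System.Data.SqlClient;

    class IndexSetup
    {
        static void Main()
        {
            // Hypothetical connection string; replace with your own.
            const string connectionString =
                "Server=.;Database=MyDb;Integrated Security=true";

            // One targeted index: lookups on SearchKey become index seeks,
            // but every INSERT and UPDATE now also has to maintain this index.
            const string sql = @"
                IF NOT EXISTS (SELECT 1 FROM sys.indexes
                               WHERE name = 'IX_Records_SearchKey')
                    CREATE NONCLUSTERED INDEX IX_Records_SearchKey
                    ON dbo.Records (SearchKey);";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
                Console.WriteLine("Index is in place.");
            }
        }
    }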

Wrong is evil and must be defeated. - Jeff Ello[^]

agent_kruger wrote:

yes, I mean store and search. As my work is a little tricky, it has to search through 100 million records every hour.

Lost User (#26):

      "Manipulate" 100 million records per hour? Ask Amazon or Google to host your data.

Bastard Programmer from Hell :suss: If you can't read my code, try converting it here[^]

Lost User wrote:

"Manipulate" 100 million records per hour? Ask Amazon or Google to host your data.

agent_kruger (#27):

Sir, in this context I used "manipulate" to mean "searching".

agent_kruger wrote:

Sir, in this context I used "manipulate" to mean "searching".

Lost User (#28):

Then you are using the wrong terms. "Manipulate" means updating, inserting or deleting records. Reading doesn't manipulate the data, it fetches it. Selecting a few million rows should be doable, depending on the hardware/software combination and the skill set of the DBA.

Bastard Programmer from Hell :suss: If you can't read my code, try converting it here[^]

Lost User wrote:

Then you are using the wrong terms. "Manipulate" means updating, inserting or deleting records. Reading doesn't manipulate the data, it fetches it.

agent_kruger (#29):

Sorry for all this. I have corrected my reply.

agent_kruger wrote:

Sorry for all this. I have corrected my reply.

Lost User (#30):

Databases are optimized to work with records; there will be a HUGE difference between searching for records (a SELECT using a WHERE clause) and searching for a specific substring within an NTEXT field. If you are going to search within the contents of the fields, you'll want a full-text search catalog. Again, that's supported by most major databases, but their speed may vary wildly.

Can you post a schema of the data that you'll be storing? Should I be thinking about simple data like measurements or prices (lots o' fields with numbers)? Or more toward text (lots of short readable text fields, like profiles), more toward memos (a single large text field), or even documents (Word, PDF)? In the case of documents I'd suggest dumping the files in the filesystem and using something like Google Desktop Search to search for specific terms.
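To make that difference concrete, here is a minimal sketch (assuming SQL Server, a hypothetical dbo.Documents table with a Body column, and an existing full-text index on that column; the names are illustrative only). The LIKE query has to scan every row, while CONTAINS resolves the term through the full-text catalog:

    using System.Data.SqlClient;

    class SearchComparison
    {
        // Counts rows matching a term, either via a whole-table scan
        // (LIKE '%term%' cannot use a normal B-tree index) or via the
        // full-text catalog (CONTAINS).
        static int CountMatches(string connectionString, string term, bool useFullText)
        {
            string sql = useFullText
                ? "SELECT COUNT(*) FROM dbo.Documents WHERE CONTAINS(Body, @term)"
                : "SELECT COUNT(*) FROM dbo.Documents WHERE Body LIKE '%' + @term + '%'";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@term", term);
                connection.Open();
                return (int)command.ExecuteScalar();
            }
        }
    }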

Agent_Spock wrote:

sorry, for all this

Don't be; for someone who doesn't code, all those things might sound roughly the same. Sorry for my short and blunt answers.

Live long and prosper :)

Lost User wrote:

Can you post a schema of the data that you'll be storing?

agent_kruger (#31):

No sir, it consists of data which is converted into bytes.

agent_kruger wrote:

No sir, it consists of data which is converted into bytes.

Lost User (#32):

Everything that a computer stores is encoded in bytes; images, text, applications - they're all stored as bytes. Hence, the remark that you're going to store bytes is not very helpful. Put that way, I'd assume a large binary blob, and "searching" to mean searching for a series of bytes. Those bytes represent something: data, in whatever form. What "kind" of data you're going to store determines the best approach.

Bastard Programmer from Hell :suss: If you can't read my code, try converting it here[^]

Lost User wrote:

What "kind" of data you're going to store determines the best approach.

agent_kruger (#33):

No, I am using it in such a way that each record has a max. of 25 letters.

agent_kruger wrote:

No, I am using it in such a way that each record has a max. of 25 letters.

Lost User (#34):

Then your search terms are "Free Text Search" and the name of your database :) It's described for SQL Server here[^], or you could google for Lucene.NET.
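As a minimal sketch of the SQL Server route (assuming a full-text index already exists on a hypothetical dbo.Records.Content column holding those short values; Lucene.NET would be the equivalent outside the database):

    using System.Collections.Generic;
    using System.Data.SqlClient;

    class FreeTextSearch
    {
        // Streams back records whose Content column matches the given words,
        // using the full-text index instead of a row-by-row scan.
        static IEnumerable<string> Search(string connectionString, string words)
        {
            const string sql =
                "SELECT Content FROM dbo.Records WHERE FREETEXT(Content, @words)";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@words", words);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                        yield return reader.GetString(0);
                }
            }
        }
    }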

Bastard Programmer from Hell :suss: If you can't read my code, try converting it here[^]

agent_kruger wrote:

Actually, I am already working on SQL, but I am not sure that it can manage 100 million records. Somebody suggested Oracle to me; is Oracle better than SQL?

S Douglas (#35):

Agent_Spock wrote:

SQL, but I am not sure that it can manage 100 million records.

I've worked with a SQL database that had a single table with a few billion rows (inventory-type stuff). So yes, SQL can scale as far as you need it to.


Common sense is admitting there is cause and effect and that you can exert some control over what you understand.

S Douglas wrote:

I've worked with a SQL database that had a single table with a few billion rows (inventory-type stuff). So yes, SQL can scale as far as you need it to.

agent_kruger (#36):

And sir, is the searching good (fast)?

agent_kruger wrote:

And sir, is the searching good (fast)?

S Douglas (#37):

Yup, never a problem. The table was indexed for the usage.

Common sense is admitting there is cause and effect and that you can exert some control over what you understand.

S Douglas wrote:

Yup, never a problem. The table was indexed for the usage.

agent_kruger (#38):

                              Does "Indexed" mean using primary key as filter or something else?
