SQL Server, Performance and Implementation
-
I have lots of files, and for each file I have a lot of strings to save, about 1,000,000 strings. I want to be able to store the information for each file, search for a string in the table, and retrieve the information for the matching row. The problem is that I am afraid it will have trouble running queries on very large tables. The number of records in this table could easily reach 100 million. Is there any way to configure SQL Server to handle large tables, or any other approach? Any suggestion would be great, since I don't have much experience with the capabilities of SQL Server. Thanks, Clint
-
SQL Server can easily handle hundreds of millions of rows in one table; I have a table here with 25 million rows in it. The key to SQL Server performance is ensuring that you have suitable indexes on the tables, and that you write your queries in a way that actually lets those indexes be used.
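As a rough sketch of what that looks like (the table and column names here are hypothetical, not from your description), you would index the column you search on so SQL Server can seek into the index instead of scanning 100 million rows:

```sql
-- Hypothetical schema: names and sizes are illustrative assumptions.
CREATE TABLE dbo.FileStrings
(
    Id     bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
    FileId int           NOT NULL,
    Value  nvarchar(450) NOT NULL  -- 450 chars keeps the key within the 900-byte index key limit
);

-- A nonclustered index on the search column lets lookups seek rather than scan;
-- INCLUDE adds FileId to the leaf level so the query below is covered by the index.
CREATE NONCLUSTERED INDEX IX_FileStrings_Value
    ON dbo.FileStrings (Value)
    INCLUDE (FileId);

-- An equality predicate on the indexed column can use an index seek.
SELECT Id, FileId
FROM dbo.FileStrings
WHERE Value = N'some string';
```

One caveat: a leading-wildcard search such as `WHERE Value LIKE '%foo%'` cannot use this index and will force a scan; if you need substring matching at that scale, look into SQL Server's full-text indexing instead.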
Stability. What an interesting concept. -- Chris Maunder