Thanks. I agree that the database does not have to be normalised to the smallest detail, as that would affect performance and make the queries more complex. But as Colin pointed out: design the model as normalised as possible, then denormalise it afterwards if you find performance problems. I want to take this one step further: not denormalise it, but use its structure to dynamically create only the required tables/relations.

It might help to explain what I am trying to do. The model I am building will not actually contain any data; its structure (a schema, which could be in XML format) will be used to generate a database containing only the tables and relationships the user requires. For example, a user may want to store a telephone list (names and their related phone numbers), so I would pick only the necessary tables (person, phonegroup, phone) and their relationships from the schema and create the database. Later he might want another DB to hold all his company info, and so on. So the idea is to "morph". For this, the schema needs to cover every conceivable possibility.

Now I know this may not be the best approach for this kind of application, and suggestions are welcome. Thanks!
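To make the idea concrete, here is a minimal sketch of what I have in mind (all names and structures are hypothetical, and the master schema would really be loaded from XML rather than hard-coded): a catalogue of every conceivable table, and a function that picks only the tables the user asked for and emits the CREATE TABLE statements for them, keeping just the foreign keys whose target table was also selected.

```python
# Hypothetical master schema covering "all conceivable possibilities".
# In practice this would be parsed from an XML schema file.
MASTER_SCHEMA = {
    "person": {
        "columns": {"person_id": "INTEGER PRIMARY KEY", "name": "VARCHAR(100)"},
        "foreign_keys": {},
    },
    "phonegroup": {
        "columns": {"phonegroup_id": "INTEGER PRIMARY KEY",
                    "person_id": "INTEGER", "label": "VARCHAR(50)"},
        "foreign_keys": {"person_id": "person(person_id)"},
    },
    "phone": {
        "columns": {"phone_id": "INTEGER PRIMARY KEY",
                    "phonegroup_id": "INTEGER", "number": "VARCHAR(20)"},
        "foreign_keys": {"phonegroup_id": "phonegroup(phonegroup_id)"},
    },
    "company": {
        "columns": {"company_id": "INTEGER PRIMARY KEY", "name": "VARCHAR(100)"},
        "foreign_keys": {},
    },
}

def generate_ddl(wanted):
    """Return CREATE TABLE statements for the requested tables only,
    keeping foreign keys whose target table is also requested."""
    wanted = set(wanted)
    statements = []
    for table in sorted(wanted):
        spec = MASTER_SCHEMA[table]
        lines = [f"{col} {typ}" for col, typ in spec["columns"].items()]
        for col, ref in spec["foreign_keys"].items():
            # Drop relationships that point outside the chosen subset.
            if ref.split("(")[0] in wanted:
                lines.append(f"FOREIGN KEY ({col}) REFERENCES {ref}")
        statements.append(
            f"CREATE TABLE {table} (\n  " + ",\n  ".join(lines) + "\n);"
        )
    return statements
```

So for the telephone-list example, `generate_ddl(["person", "phonegroup", "phone"])` would produce three CREATE TABLE statements with their relationships intact, while `generate_ddl(["company"])` would produce a single stand-alone table.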