Is my n-tier design OK?

Tags: csharp, database, sysadmin, asp-net, sql-server
Lost User (#1)
Hello, I'm developing an n-tier warehouse control system. I use SQL Server 2008 for the database, and all data access is done through stored procedures. The core of the system is a Windows service that does all the processing and acts as the "gateway" to the database. The clients (WinForms and WPF) connect to this core service via WCF and are basically just GUIs (no business logic, only input data validation); clients do not connect to the database directly. The number of clients may vary from one to a few dozen at most. The core service, SQL Server and all clients are on the business intranet with a fast Ethernet infrastructure (actually physically in the same building).

Now for the "unknown" part: for now I have decided to go with a typed DataSet for the business entities and table adapters for data access. The core service and each client hold their own instance of the DataSet. Whenever a client modifies data, the changes are sent to the core service; the core service updates the database, merges the changes into its own DataSet instance and sends the data changes via a publish/subscribe WCF service event to all the clients. The clients then merge the changes into their own DataSet instances. That way I want to ensure that all participating parties (the core service and all the clients) always have the most current data. In particular, one data table holding the items that are currently being processed will change very often, and it is most important that clients have "real-time" data for it. This table will have from zero to a few hundred records at most. The largest table (for storage places) will have from a few thousand to a few tens of thousands of records.

At least that is the idea... Before I delve into implementation I want to make sure I'm on the right track, so any tips or criticism from users with experience of similar scenarios are highly welcome. My main concern is the performance of the data refresh on the client side. If I should provide any more detail that might be of importance, please let me know. Thanks in advance, Tine
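For readers picturing the publish/subscribe part, here is a minimal sketch of how such a push channel could be wired up as a WCF duplex (callback) contract. None of these names come from the original design; IWarehouseEvents, IWarehouseEventsCallback, PublishChanges and WarehouseEventsService are invented for illustration, and the payload is assumed to be the DataSet returned by GetChanges() rather than the full dataset.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.ServiceModel;

// Callback contract implemented by every client. The core service invokes
// PublishChanges on all subscribed callbacks after a successful database update.
public interface IWarehouseEventsCallback
{
    [OperationContract(IsOneWay = true)]
    void PublishChanges(DataSet changes);   // assumed to carry only dataset.GetChanges()
}

[ServiceContract(CallbackContract = typeof(IWarehouseEventsCallback))]
public interface IWarehouseEvents
{
    [OperationContract]
    void Subscribe();

    [OperationContract]
    void Unsubscribe();
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class WarehouseEventsService : IWarehouseEvents
{
    private readonly List<IWarehouseEventsCallback> subscribers = new List<IWarehouseEventsCallback>();

    public void Subscribe()
    {
        var callback = OperationContext.Current.GetCallbackChannel<IWarehouseEventsCallback>();
        lock (subscribers)
            if (!subscribers.Contains(callback))
                subscribers.Add(callback);
    }

    public void Unsubscribe()
    {
        var callback = OperationContext.Current.GetCallbackChannel<IWarehouseEventsCallback>();
        lock (subscribers)
            subscribers.Remove(callback);
    }

    // Called by the core service after it has written the changes to SQL Server
    // and merged them into its own DataSet instance.
    public void Broadcast(DataSet changes)
    {
        lock (subscribers)
        {
            foreach (var subscriber in subscribers.ToArray())
            {
                try { subscriber.PublishChanges(changes); }
                catch (CommunicationException) { subscribers.Remove(subscriber); }   // drop dead clients
            }
        }
    }
}
```

On an intranet, netTcpBinding supports the callback channel directly; the broadcast loop also has to tolerate clients that disconnect without unsubscribing, which is why a failed callback simply drops the subscriber here.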

Mycroft Holmes (#2)

Your initial design sounds good. It is unusual to use a Windows service rather than a web service; I assume there is absolutely no possibility of the client being Web Forms based or Silverlight. I question the need for the clients to have the most current data pushed from the server. This is generally a wish from the business that has no real requirement once the cost is factored in. How do you intend to inform a client that there has been a change? Is the system a high-speed data entry app where concurrency is going to be an issue? How are you going to manage race conditions across multiple clients? I also question the use of typed DataSets, as the standards are moving to objects; if you use DataSets then your client will need to have that knowledge (I believe it is not possible with SL, as there is no System.Data). I have never used "merged" DataSets; I always query back to the database. Opinion only.

Never underestimate the power of human stupidity RAH
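As a rough illustration of the "objects rather than typed DataSets" point, a lightweight data-contract entity for the frequently changing items table might look like the sketch below; the class and member names are hypothetical, not taken from the poster's schema.

```csharp
using System;
using System.Runtime.Serialization;

// Hypothetical lightweight entity for the "items currently being processed" table.
// Clients that consume this contract do not need System.Data or the typed DataSet assembly.
[DataContract]
public class ProcessedItem
{
    [DataMember] public int ItemId { get; set; }
    [DataMember] public string StoragePlace { get; set; }
    [DataMember] public string Status { get; set; }
    [DataMember] public DateTime LastChangedUtc { get; set; }
    [DataMember] public byte[] RowVersion { get; set; }   // SQL Server timestamp column, for optimistic concurrency
}
```

Such DTOs keep the wire payload small and free the clients from referencing System.Data, at the cost of writing the mapping from the stored-procedure results yourself.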

Lost User (#3)

You are right, there is no possibility of the client being web-based or Silverlight. I don't understand what you mean by cost; do you mean the cost of the time needed for implementation? As I mentioned, there is only one table (zero to a few hundred records) that is critical for real-time data. This table holds the items currently being processed, changes rather often, and must be refreshed as soon as a change happens, because operator actions depend on the current data.

I intend to inform the clients via a dedicated publish/subscribe WCF service. The core service would publish events and the clients would subscribe to them. So, for example, when data is changed due to processing, the core service would raise an event, the event args would hold a DataSet instance with the data changes, and the clients would then merge the received changes into their own DataSet instances. Here lies my main concern: is a DataSet too heavy a data structure to be efficient enough? I'm also considering custom "lightweight" business entities, but it is easier for me to use the DataSet infrastructure.

I actually didn't give a lot of thought to concurrency. All data tables have a timestamp column, and I suppose the core service would grant or deny a data update based on that value. I also haven't used DataSet merging; I have done some reading about it, and it seems the main problem lies in auto-incremented columns: there is a real possibility of duplicate or missing data if it is used improperly.

Anyway, thanks for your time, it's been helpful, especially for pointing out some issues that I hadn't yet thought of and that might arise. Tine
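Two of the points above can be sketched briefly, under stated assumptions: the table, column and stored-procedure names (ItemId, dbo.Item_Update, @RowVersion) are invented, and this shows the general patterns rather than the poster's code. Negative auto-increment seeds on the client side avoid identity collisions before a merge, and a timestamp (rowversion) parameter lets the core service grant or deny an update optimistically.

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch only: the table, column and stored-procedure names are hypothetical.
public static class ConcurrencySketch
{
    // Client side: give locally added rows temporary negative identities so they can
    // never collide with real identity values assigned by SQL Server; the merge of the
    // server's change set then brings the real keys back.
    public static void UseNegativeClientKeys(DataTable itemsTable)
    {
        DataColumn id = itemsTable.Columns["ItemId"];
        id.AutoIncrement = true;
        id.AutoIncrementSeed = -1;
        id.AutoIncrementStep = -1;
    }

    // Client side: merge the change set pushed by the core service into the local copy.
    // preserveChanges = false lets the server's values win over local pending edits.
    public static void ApplyServerChanges(DataSet localDataSet, DataSet receivedChanges)
    {
        localDataSet.Merge(receivedChanges, false, MissingSchemaAction.Ignore);
    }

    // Core service side: optimistic concurrency on the timestamp (rowversion) column.
    // The stored procedure is assumed to include "AND RowVersion = @RowVersion" in its
    // UPDATE and not to use SET NOCOUNT ON, so zero affected rows means another client
    // changed the row first and the update is denied.
    public static bool TryUpdateItem(SqlConnection connection, int itemId,
                                     string status, byte[] rowVersion)
    {
        using (var cmd = new SqlCommand("dbo.Item_Update", connection))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@ItemId", itemId);
            cmd.Parameters.AddWithValue("@Status", status);
            cmd.Parameters.Add("@RowVersion", SqlDbType.Timestamp).Value = rowVersion;
            return cmd.ExecuteNonQuery() > 0;
        }
    }
}
```

When the update is denied (zero rows affected), the core service can push the current row back to the offending client so the operator sees the latest state before retrying.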
