UDP Packet Synchronization

Posted in C#. Tags: question, data-structures, help, tutorial
softwarejaeger wrote (#1):

Hello, I'm developing an application that sends and receives a lot of data over UDP. I know UDP can lose datagrams between sending and receiving, so I created a stack and a timer: the datagrams to send are stored there and sent out on each timer tick. (All datagrams get an incrementing ID, so I can recognize when something is missing.) The timer interval lets me control the transfer rate. My problem now is how to calculate the correct interval for the transfer. Is there any logic for doing this? I would work with the messages I get from the connection partner about how many datagrams were lost, but what is the best way to calculate this?
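A minimal sketch of the scheme described above, with illustrative names of my own choosing (the post itself contains no code): a queue of sequence-numbered datagrams drained by a System.Threading.Timer whose interval sets the send rate. It assumes a FIFO queue rather than a literal stack so datagrams leave in ID order.

```csharp
using System;
using System.Collections.Concurrent;
using System.Net.Sockets;
using System.Threading;

class PacedUdpSender : IDisposable
{
    private readonly UdpClient _client;
    private readonly ConcurrentQueue<byte[]> _outgoing = new ConcurrentQueue<byte[]>();
    private readonly Timer _timer;
    private uint _nextSequence;

    public PacedUdpSender(string host, int port, int intervalMs)
    {
        _client = new UdpClient();
        _client.Connect(host, port);
        // The timer interval controls the transfer rate, as described in the question.
        _timer = new Timer(SendNext, null, intervalMs, intervalMs);
    }

    public void Enqueue(byte[] payload)
    {
        // Prefix each payload with a 4-byte sequence number so the receiver
        // can detect missing or reordered datagrams.
        var datagram = new byte[payload.Length + 4];
        BitConverter.GetBytes(_nextSequence++).CopyTo(datagram, 0);
        payload.CopyTo(datagram, 4);
        _outgoing.Enqueue(datagram);
    }

    private void SendNext(object state)
    {
        // One datagram per timer tick; the interval is the knob the question asks about.
        if (_outgoing.TryDequeue(out var datagram))
            _client.Send(datagram, datagram.Length);
    }

    public void Dispose()
    {
        _timer.Dispose();
        _client.Close();
    }
}
```

A complete version would also have to keep already-sent datagrams around until the peer confirms them, so that anything reported missing can be resent.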

Dave Kreskowiak replied (#2):

If delivery is important, why are you even using UDP? UDP does not guarantee delivery, and, less well known, it does not guarantee that packets will arrive in the correct order. This approach sounds very kludgy, because you cannot guarantee that the interval will be constant; it depends on network traffic and on the load at the client end, the server end, and everything in between. Remember, it's not just your app putting data on the wire, and the wire is a shared medium used by machines that have nothing to do with your app.

softwarejaeger wrote:

My problem now is how to calculate the correct interval for the transfer.

What defines what the correct interval should be? The fact that a packet was lost is not, by itself, a valid indicator that the transfer should slow down. The NIC will only send packets out when it can, not at the rate your app hands them to it.

A guide to posting questions on CodeProject[^]
Dave Kreskowiak
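For completeness, a receiver-side sketch (again illustrative only, not from either post) that shows why the sequence-number prefix matters: a gap means datagrams were lost or are still in flight, and a lower-than-expected number means a datagram arrived out of order, which UDP permits.

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class SequencedUdpReceiver
{
    public void Listen(int port)
    {
        using (var client = new UdpClient(port))
        {
            uint expected = 0;
            var remote = new IPEndPoint(IPAddress.Any, 0);

            while (true)
            {
                byte[] datagram = client.Receive(ref remote);
                uint sequence = BitConverter.ToUInt32(datagram, 0);

                if (sequence > expected)
                    Console.WriteLine("Gap detected: " + (sequence - expected) + " datagram(s) missing before #" + sequence);
                else if (sequence < expected)
                    Console.WriteLine("Datagram #" + sequence + " arrived out of order");

                // Payload starts at offset 4; process it here.
                expected = Math.Max(expected, sequence + 1);
            }
        }
    }
}
```

Whether any of this feedback should change the send interval is exactly the point Dave questions: loss can come from congestion anywhere on the path, so it is a weak signal for choosing a fixed timer rate.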
