Code Project

How does USB to HDMI work?

The Lounge | Tags: question, csharp, asp-net, dotnet, com | 7 posts, 7 posters
Marc Clifton wrote (#1):

The survey question reminded me that I've always wondered about this. Does it use the computer's memory or the existing graphics card's memory? Is the computer's graphics card involved in any way? It seems so, because on my latest laptop I can really up the resolution of Skyrim, but how in the world does that work? And it seems like a ton of data to be transmitting constantly; is it optimized in some way? When I ask this question on Google, the links I get are really stupid and don't explain any of the technical details, which are what I'm interested in. Maybe my Google Fu isn't awake yet this morning.

    Latest Article:
    Create a Digital Ocean Droplet for .NET Core Web API with a real SSL Certificate on a Domain

Dan Neely replied to Marc Clifton (#2):

The cheap USB3 version just has your system output an HDMI signal over the USB wires. This needs support on the host system, and probably won't work with a PCIe GPU and a USB3 slot on your motherboard. Edit: IIRC this one needs a USB-C port in order to use both the normal and inverted pins to get enough high-speed IO lanes; a USB 3.0 Type-A port would still need the DisplayLink approach below. IIRC this gives enough output bandwidth for 1440p60 signalling. 4K60 would require the fastest USB3 mode, which can do 20 Gbps (the 10 Gbps mode is just doing the same "use both sets of data pins" trick as HDMI over USB-C). The expensive version, which also works on USB2, is from a company called DisplayLink. I'm not sure if their software captures video rendered by the onboard GPU and then compresses it to fit in USB2 bandwidth, or if they have a crappy GPU on the other end of the USB cable and just send draw commands over it.

      Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius
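A back-of-the-envelope check of the bandwidth figures Dan quotes (illustrative only; real links add blanking intervals and line-coding overhead, so actual requirements are somewhat higher):

```python
# Raw, uncompressed video bandwidth for the display modes discussed above.
# Real links (HDMI / DP alt mode) add blanking and encoding overhead.
def raw_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(f"1440p60: {raw_gbps(2560, 1440, 60):.1f} Gbps")  # ~5.3 Gbps
print(f"4K60:    {raw_gbps(3840, 2160, 60):.1f} Gbps")  # ~11.9 Gbps
```

This lines up with the post: raw 4K60 exceeds the 10 Gbps USB3 mode but fits within the 20 Gbps one.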

Lost User replied to Marc Clifton (#3):

If you mean the adapter: it works by using DisplayPort over USB-C (an alternate mode of USB-C), which feeds a DP-to-HDMI converter (something like the MCDP2900) to create the HDMI output. The adapter itself isn't involved in creating the image.

charlieg replied to Marc Clifton (#4):

Marc, your Google fu has not failed. Something has happened to Google search that has made it useless. Most of the results are ads, companies, etc., but no real information.

          Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.

obermd replied to Marc Clifton (#5):

            You apparently have to register as an HDMI "adopter" at www.hdmi.org to get these specifications. I used DuckDuckGo to find this site.

maze3 replied to charlieg (#6):

I don't know if my Google chops are any use, or if it's just my ad blocker. I'm also plagued with Quora results now. Here's a snippet from an article:

              Quote:

              The USB to HDMI active adapter basically works like an external graphics or video card as an interface between the computer and monitor. Most computers will have a USB 2.0 or 3.0 Type A port. This is the slender rectangular port.

[How to Connect a Laptop to a Monitor with a USB Port | Exhibit Edge](https://www.exhibitedge.com/how-to-connect-a-laptop-to-a-monitor-using-usb/)

So it sounds like for this specific device, the driver captures the output somehow, transmits it over USB, and the USB device has a converter to produce the HDMI signal. The USB-to-HDMI devices I've used tend to have a chunky block in them, which suggests they have some kind of processor inside along with the connectors. So what's doing the work is the CPU, capturing the output, but the GPU might still be doing the number crunching.
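The capture-compress-send model maze3 infers can be sketched like this (all names here are hypothetical; a real DisplayLink-style driver runs in the kernel and uses proprietary compression, not zlib):

```python
# Toy sketch of the "driver captures the screen, compresses it, and ships
# it over USB" model described above. Purely illustrative.
import zlib

def send_frame(framebuffer, prev=None):
    # Skip unchanged frames entirely (real drivers track dirty rectangles,
    # so even partial changes only resend the regions that moved).
    if prev is not None and framebuffer == prev:
        return b""          # nothing changed: nothing to send
    # Compress before handing the payload to the USB stack.
    return zlib.compress(framebuffer)

frame = bytes(1920 * 1080 * 3)          # one black 1080p frame, 24 bpp
packet = send_frame(frame)
print(len(frame), "->", len(packet), "bytes on the wire")
```

The design point is that desktop content compresses extremely well (large flat regions, few changes per frame), which is how such an adapter squeezes a display into USB2 bandwidth at all.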

Peter Shaw replied to Dan Neely (#7):

There are a number of different ways to do it. I have a couple of modern TVs, for example, that have HDMI inputs where there is actually little to no video data sent over the link. Instead, the TV actually includes the GPU, and all that's sent over the HDMI connection is a high-speed serial bit stream of GPU commands.

Likewise, I also have USB3-to-HDMI/DVI/VGA adapters that basically have the same GPU inside the adapter and take a serial bit stream from the USB3 port, in much the same way my TV takes its GPU input over the HDMI serial link. Broadcom make a LOT of SoCs that can do this decoding in real time, and many of the devices I open and investigate (especially the older ones) tend to have Broadcom chips in them.

In some cases, though, yes, it is just a straight-through high-speed serial link, USB3 to HDMI. You tend to be able to spot these ones, though: they are generally smaller than the GPU-based ones, and they won't work with some makes of monitors (e.g. ones that don't have their own on-board GPU).
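The difference between the two approaches Peter describes, shipping raw pixels versus shipping draw commands for a GPU inside the adapter, can be illustrated with a toy byte count (the command format below is invented for illustration, not any real protocol):

```python
# Compare the wire cost of a full raw frame vs. a made-up command stream
# that a hypothetical GPU-in-the-adapter would execute.
import struct

W, H = 1920, 1080
raw_pixels = W * H * 3                  # full 24-bit frame, in bytes

# Invented command stream: "fill the screen blue, then draw one red rect".
commands = struct.pack("<B3s", 1, b"\x00\x00\xff")                          # FILL r,g,b
commands += struct.pack("<B4H3s", 2, 100, 100, 300, 200, b"\xff\x00\x00")   # RECT x,y,w,h,rgb

print(f"raw frame: {raw_pixels} bytes, command stream: {len(commands)} bytes")
```

For simple content a command stream is orders of magnitude smaller, which is why offloading the GPU to the far end of a thin serial link is attractive; the trade-off is that the sink needs enough smarts to execute the commands.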
