Changing the Length of String from 8-Bits to 16-Bits Number

Posted in C / C++ / MFC
    Neelesh K J Jain (#1) wrote:

    Hello All, I want to convert the length of a string to a 16-bit hexadecimal number, so I am using _itoa(strlen(strSource), cLengthOfString, 16), where cLengthOfString is a char[5] and strSource is a char *. But when I observe the destination string, the length is represented in 8-bit characters; I want it in 16 bits. For example, 0xFFFF should occupy only 2 bytes instead of 4 bytes. Thanks, Neelesh K J Jain.

      CPallini (#2) wrote, in reply to Neelesh K J Jain:
      To represent an integer in the range {0, 65535} as a hexadecimal (ANSI) string you need 4 characters. The binary encoding of such a number is (of course) 16 bits (i.e. 2 bytes) wide.

      If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler. -- Alfonso the Wise, 13th Century King of Castile.
      This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong. -- Iain Clarke
      [My articles]

      In testa che avete, signor di Ceprano?

        led mike (#3) wrote:
        Neelesh K J Jain wrote:

        for e.g., xFFFF should utilize only 2 bytes, instead of 4 bytes.

        What? How the hell is that going to happen? Run the following code and see if you get a clue.

        #include <iostream>
        #include <cstdlib>  // itoa is non-standard; on MSVC it is _itoa
        using namespace std;

        int main()
        {
            char buf[5];            // 4 hex digits + terminating '\0'
            itoa( 0xf, buf, 16);    // "f"
            cout << buf << endl;
            itoa( 0xff, buf, 16);   // "ff"
            cout << buf << endl;
            itoa( 0xfff, buf, 16);  // "fff"
            cout << buf << endl;
            itoa( 0xffff, buf, 16); // "ffff" - exactly fills buf
            cout << buf << endl;
            return 0;
        }
