CryptBinaryToString returning invalid characters
-
Hello, I am trying to encrypt some data which contains a structure, and the encrypted data looks like the following:

ËO.ÛßÞ¸ëô¾óÉïD»åÏë3¸·««««««««"

The code to encrypt the data is the following:

dwBufferLen = dwCount = ((STRUCT_BUF)pBufPtr)->ulSize;
if(!CryptEncrypt(hKey, 0, TRUE, 0, pBufPtr, &dwCount, dwBufferLen))
{
    if(GetLastError() == ERROR_MORE_DATA)
    {
        //MessageBox("error","",MB_OK);
    }
}

Now I want to convert this encrypted data to a string so that it can be displayed to the user on the screen for typing purposes. However, when I call CryptBinaryToString with the data passed in, it returns a string with alphanumeric characters but also other garbage characters. The string looks like the following: y08u29/euOv0vWzx9Oo

DWORD len;
if(!CryptBinaryToString(pBufPtr, dwCount, 1, NULL, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
    MessageBox("error","",MB_OK);
}
char *msg2 = (char*)malloc(len);
*msg2 = NULL;
if(!CryptBinaryToString((BYTE *)pBufPtr, dwCount, 1, msg2, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
}

Instead of 1 as the third argument above, I tried passing in 4 and 12 but still get the same result. So how do I get rid of the garbage characters? Please help. Thanks.

vg
-
vgandhi wrote:
Instead of 1 as the third argument above I tried passing in 4 and 12 but still get the same result
Where are you getting those numbers from? The third argument should be one of the following, correct?

CRYPT_STRING_BASE64HEADER
CRYPT_STRING_BASE64
CRYPT_STRING_BINARY
CRYPT_STRING_BASE64REQUESTHEADER
CRYPT_STRING_HEX
CRYPT_STRING_HEXASCII
CRYPT_STRING_BASE64_ANY
CRYPT_STRING_ANY
CRYPT_STRING_HEX_ANY
CRYPT_STRING_BASE64X509CRLHEADER
CRYPT_STRING_HEXADDR
CRYPT_STRING_HEXASCIIADDR

I suppose CRYPT_STRING_HEXASCII is the human-readable one.

Mark
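For reference, these constants are small integers in wincrypt.h, which is why bare numbers like 1 and 4 compile at all. The values below are as they appear in the SDK headers (verify against your own copy of wincrypt.h):

// CryptBinaryToString dwFlags values from wincrypt.h (check your SDK's copy)
#define CRYPT_STRING_BASE64HEADER        0x00000000
#define CRYPT_STRING_BASE64              0x00000001  // the "1" being passed
#define CRYPT_STRING_BINARY              0x00000002
#define CRYPT_STRING_BASE64REQUESTHEADER 0x00000003
#define CRYPT_STRING_HEX                 0x00000004  // the "4" that was tried
#define CRYPT_STRING_HEXASCII            0x00000005
#define CRYPT_STRING_BASE64_ANY          0x00000006
#define CRYPT_STRING_ANY                 0x00000007
#define CRYPT_STRING_HEX_ANY             0x00000008
#define CRYPT_STRING_BASE64X509CRLHEADER 0x00000009
#define CRYPT_STRING_HEXADDR             0x0000000a
#define CRYPT_STRING_HEXASCIIADDR        0x0000000b

Note that 12 (0x0c) does not appear in this list in older headers at all, which may be why passing it didn't behave any differently.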
-
Why are you calling that garbage? You're passing the flag 1, which is CRYPT_STRING_BASE64, and you're getting back a base64-encoded version of the data.

--Mike-- Visual C++ MVP
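A side note on the characters at the end: with CRYPT_STRING_BASE64, CryptBinaryToString appends a CR/LF pair to the output, which can show up as a box glyph in some controls. A minimal sketch that trims it by hand, using the ANSI variant CryptBinaryToStringA (the helper name is illustrative, not from this thread; newer SDKs also define a CRYPT_STRING_NOCRLF flag):

// Sketch: base64-encode a byte buffer and strip the trailing CR/LF that
// CRYPT_STRING_BASE64 appends, so only printable characters remain.
// Caller frees the result.
#include <windows.h>
#include <wincrypt.h>
#include <stdlib.h>
#include <string.h>
#pragma comment(lib, "crypt32.lib")

char* EncodeBase64Clean(const BYTE* pData, DWORD cbData)
{
    DWORD cch = 0;
    // First call sizes the buffer (count includes the terminating null).
    if (!CryptBinaryToStringA(pData, cbData, CRYPT_STRING_BASE64, NULL, &cch))
        return NULL;
    char* psz = (char*)malloc(cch);
    if (!psz)
        return NULL;
    if (!CryptBinaryToStringA(pData, cbData, CRYPT_STRING_BASE64, psz, &cch))
    {
        free(psz);
        return NULL;
    }
    // Trim the "\r\n" that the BASE64 formatting appends.
    size_t n = strlen(psz);
    while (n > 0 && (psz[n - 1] == '\r' || psz[n - 1] == '\n'))
        psz[--n] = '\0';
    return psz;
}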
-
Dear Mike, The string that I get from CryptBinaryToString has a square-shaped ASCII character in it, and that gets displayed to the user that way. So if the user were to use that string, there is no way he/she can do that. I'm getting a base64-encoded version of the data, but I'm just wondering why it always includes those funny characters at the end, and whether there is a way to avoid it. Thanks.
vg
-
Basically 1 stands for CRYPT_STRING_BASE64 and 4 stands for CRYPT_STRING_HEX. CRYPT_STRING_HEXASCII doesn't help in this case either. Thanks
vg
-
This works for me with no garbage chars...
BYTE BinaryBytes[256];
// Fill the buffer with every possible byte value, 0x00 through 0xFF.
for (int i = 0; i < 256; ++i)
    BinaryBytes[i] = (BYTE)i;
DWORD dwStrLen = 0;
// First call gets the required length; second call does the conversion.
::CryptBinaryToString(BinaryBytes, 256, CRYPT_STRING_HEXASCII, NULL, &dwStrLen);
LPTSTR pszBuf = new TCHAR[dwStrLen];
::CryptBinaryToString(BinaryBytes, 256, CRYPT_STRING_HEXASCII, pszBuf, &dwStrLen);
delete[] pszBuf;
-
This code works for me:

#include <windows.h>
#include <wincrypt.h>
#include <iostream>
#include <malloc.h>  // for _alloca
#include <string.h>
#pragma comment(lib, "crypt32.lib")

int main()
{
    LPCSTR data = "Some encrypted data goes here...";
    LPTSTR pszBase64 = NULL;
    DWORD cchString = 0;
    // Two-call pattern: first size the buffer, then convert.
    CryptBinaryToString ( (const BYTE*) data, strlen(data), CRYPT_STRING_BASE64, NULL, &cchString );
    pszBase64 = (LPTSTR) _alloca ( cchString * sizeof(TCHAR) );
    CryptBinaryToString ( (const BYTE*) data, strlen(data), CRYPT_STRING_BASE64, pszBase64, &cchString );
    std::wcout << pszBase64 << std::endl;
    return 0;
}

I get the output:

U29tZSBlbmNyeXB0ZWQgZGF0YSBnb2VzIGhlcmUuLi4=

--Mike-- Visual C++ MVP
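Worth noting for the round trip: CryptStringToBinary reverses this exactly. A minimal sketch using the ANSI variant and CRYPT_STRING_BASE64_ANY (the helper name is illustrative, not from this thread):

// Sketch: decode a base64 string back into bytes with CryptStringToBinary.
// Two-call pattern again: first call sizes the buffer, second call decodes.
// Passing 0 for cchString means the input is treated as null-terminated.
#include <windows.h>
#include <wincrypt.h>
#include <stdlib.h>
#pragma comment(lib, "crypt32.lib")

BYTE* DecodeBase64(const char* pszBase64, DWORD* pcbOut)
{
    DWORD cb = 0;
    if (!CryptStringToBinaryA(pszBase64, 0, CRYPT_STRING_BASE64_ANY, NULL, &cb, NULL, NULL))
        return NULL;
    BYTE* pb = (BYTE*)malloc(cb);
    if (!pb)
        return NULL;
    if (!CryptStringToBinaryA(pszBase64, 0, CRYPT_STRING_BASE64_ANY, pb, &cb, NULL, NULL))
    {
        free(pb);
        return NULL;
    }
    *pcbOut = cb; // number of decoded bytes
    return pb;    // caller frees
}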
-
Dear Mark, Yes this worked for you, but if you look at the string value in this case, how will you display it to the user? It is not in any displayable format, whereas I want to display the data on the screen for the user to either type or read aloud. Thanks.
vg
-
What are you expecting to see? The source is bytes, 0 to 255. How do you want to represent them on the screen? You can always loop through the source bytes and convert them to whatever format you want. Mark
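To make that loop concrete, here is a minimal sketch that renders each source byte as two hex digits (the helper name is mine, not from this thread):

// Sketch: format cbData bytes as a printable hex string, two characters per
// byte (0x7A -> "7A"). Caller frees the returned buffer. Illustrative only.
#include <stdio.h>
#include <stdlib.h>

char* BytesToHex(const unsigned char* pData, unsigned long cbData)
{
    char* psz = (char*)malloc(cbData * 2 + 1);
    if (!psz)
        return NULL;
    for (unsigned long i = 0; i < cbData; ++i)
        sprintf(psz + i * 2, "%02X", pData[i]); // writes two digits plus null
    psz[cbData * 2] = '\0'; // ensure termination even for cbData == 0
    return psz;
}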
-
Yes, but you're using a string instead of a byte buffer. So try this out:

typedef unsigned char *PBYTE;

typedef struct chal_buffer
{
    DWORD ulSize; // size of the entire structure
    DWORD Num1;
} CHALL_BUF, *PCHALL_BUF;

PBYTE pBufPtr;
pBufPtr = (UCHAR*)malloc(sizeof(CHALL_BUF));
if(!pBufPtr)
    return;
((PCHALL_BUF)pBufPtr)->ulSize = sizeof(CHALL_BUF); // in bytes
((PCHALL_BUF)pBufPtr)->Num1 = 1233456;

if(!CryptAcquireContext(&hProv, NULL, MS_DEF_PROV, PROV_RSA_FULL, 0))
{
    if(GetLastError() != NTE_BAD_KEYSET)
    {
    }
    if(!CryptAcquireContext(&hProv, NULL, MS_DEF_PROV, PROV_RSA_FULL, CRYPT_NEWKEYSET))
    {
    }
}
if(!CryptCreateHash(hProv, CALG_MD5, 0, 0, &hHash))
{
    DWORD dwRet = GetLastError();
}

// Derive a session key from the hash object.
if(!CryptDeriveKey(hProv, CALG_RC4, hHash, 0, &hKey))
{
    DWORD dwRet = GetLastError();
}

// Destroy the hash object.
CryptDestroyHash(hHash);
hHash = 0;

dwBufferLen = dwCount = ((PCHALL_BUF)pBufPtr)->ulSize;
if(!CryptEncrypt(hKey, 0, TRUE, 0, pBufPtr, &dwCount, dwBufferLen))
{
    if(GetLastError() == ERROR_MORE_DATA)
    {
        //MessageBox("error","",MB_OK);
    }
}

DWORD len = 0;
if(!CryptBinaryToString(pBufPtr, dwCount, CRYPT_STRING_BASE64, NULL, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
    MessageBox("error","",MB_OK);
}
LPTSTR pszBase64 = NULL;
pszBase64 = (LPTSTR) _alloca(len * sizeof(TCHAR));
if(!CryptBinaryToString((BYTE *)pBufPtr, dwCount, CRYPT_STRING_BASE64, pszBase64, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
}
vg
-
What are you expecting to see? CryptBinaryToString() only understands BYTEs as input, so whether you input a char string or a byte buffer is irrelevant. A byte has 256 values. That means that somehow CryptBinaryToString() must be able to represent 256 values with a limited set of ASCII characters. Again, what do you want to see? Maybe that API isn't appropriate for your needs. Mark
-
Dear Mark, I basically want to convert each byte in the encrypted data into two hexadecimal characters. For example, 'z' becomes 7A. However, my encrypted data is not of fixed length, so I'm not sure how to do this. Also, once I have the hexadecimal-character encrypted data, I need to convert it back into the encrypted data. So that is the functionality I'm looking for. Any suggestions? Thanks.
vg
-
Ok, I'm with you. I think the CryptBinaryToString() API was meant more for converting binary into characters that can be used in HTML/XML/etc., not necessarily to be human-readable. The problem is that in the encrypted data there's no 'z'. There's just bytes. You'll need to store the length somehow. An easy way is to reserve the first BYTE/WORD/DWORD to hold the length of the following encrypted data. With the length, you can loop through the encrypted bytes, converting each to hex and appending to a string. I'm not sure about converting it back to encrypted though. If you allow the user to make changes then the data won't be able to be decrypted later. It doesn't make sense really anyway - how can a user understand encrypted data? Mark
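For the "convert it back" half, the reverse loop is just as short; a minimal sketch below (the helper names are mine, not from this thread). CryptStringToBinary with CRYPT_STRING_HEX is the API-level counterpart if you would rather not hand-roll the parsing.

// Sketch: parse a hex string ("7A0B...") back into bytes. Returns the number
// of bytes written, or 0 on malformed input. Names are illustrative only.
#include <string.h>

static int HexVal(char c)
{
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    return -1; // not a hex digit
}

unsigned long HexToBytes(const char* pszHex, unsigned char* pOut, unsigned long cbOut)
{
    size_t cch = strlen(pszHex);
    if (cch % 2 != 0 || cch / 2 > cbOut)
        return 0; // need an even digit count and enough output room
    for (size_t i = 0; i < cch; i += 2)
    {
        int hi = HexVal(pszHex[i]);
        int lo = HexVal(pszHex[i + 1]);
        if (hi < 0 || lo < 0)
            return 0; // non-hex character in the input
        pOut[i / 2] = (unsigned char)((hi << 4) | lo);
    }
    return (unsigned long)(cch / 2);
}

The decoded bytes can then go straight back into CryptDecrypt, and the length prefix Mark describes tells you how many bytes to expect.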