Translate C code to C#
-
Can somebody please translate the following code to C#:
/***********************************************************************
 * Function Name: UuDecodeLength
 *
 * Description: Determines the length of the raw data given the
 *              length of the UU-Encoded data
 *
 * Arguments:
 *    nSrcLen   Length of the UU-Encoded data
 *
 * Returns
 *    Number of bytes of data
 **********************************************************************/
INT32U UuDecodeLength(INT32U nSrcLen)
{
    INT32U uDestLen;

    uDestLen = (nSrcLen * 3) / 4;
    return uDestLen;
}

/***********************************************************************
 * Function Name: UuEncodeLength
 *
 * Description: Determines the length of the UU-Encoded data given
 *              the length of the raw data
 *
 * Arguments:
 *    nSrcLen   Length of the raw data
 *
 * Returns
 *    Number of bytes of UU-Encoded data
 **********************************************************************/
INT32U UuEncodeLength(INT32U nSrcLen)
{
    INT32U uDestLen;

    uDestLen = (nSrcLen * 4) / 3;
    if ( ((nSrcLen * 4) % 3) != 0 )
    {   // we need a partial byte, add one...
        uDestLen++;
    }
    return uDestLen;
}

/***********************************************************************
 * Function Name: UuDecode
 *
 * Description: UU-Decodes the given ASCII data
 *
 * Arguments:
 *    pSrc      Pointer to the source data (UU-Encoded data)
 *    pDst      Pointer to the destination data (binary data)
 *    nSrcLen   Length of the source buffer
 *
 * Returns
 *    Number of bytes in the pDst array
 **********************************************************************/
INT32U UuDecode(INT8U* pDst, const INT8U* pSrc, INT32U nSrcLen)
{
    unsigned int nDstIndex;
    unsigned int nSrcIndex;

    nDstIndex = 0;
    nSrcIndex = 0;
    while (nSrcIndex < nSrcLen)
    {
        // Decode A                                                    // Bits  Src Offset
        pDst[nDstIndex]    = ((pSrc[nSrcIndex++] - 0x20) << 2) & 0xFC; // A7:2  0
        pDst[nDstIndex++] |= ((pSrc[nSrcIndex]   - 0x20) >> 4) & 0x03; // A1:0  1
        if ( (nSrcIndex + 1) >= nSrcLen )
        {   // we need one more byte to decode B
            break;
        }

        // Decode B
        pDst[nDstIndex]    = ((pSrc[nSrcIndex++] - 0x20) << 4) & 0xF0; // B7:4  1
        pDst[nDstIndex++] |= ((pSrc[nSrcIndex]   - 0x20) >> 2) & 0x0F; // B3:0  2
        if ( (nSrcIndex + 1) >= nSrcLen )
        {   // we need one more byte to decode C
            break;
        }

        // Decode C
        pDst[nDstIndex]    = ((pSrc[nSrcIndex++] - 0x20) << 6) & 0xC0; // C7:6  2
        pDst[nDstIndex++] |=  (pSrc[nSrcIndex]   - 0x20)       & 0x3F; // C5:0  3
        nSrcIndex++;
    }
    return nDstIndex;
}

/***********************************
-
No! No one will do your job for you! Please make an effort and come back with questions about specific issues, showing what you have already done, explaining why it doesn't work as you expected, and possibly adding a snippet of the relevant code!
-
OK, I've just about done the job myself... but can somebody explain UU-Encoding to me? For example, the UuEncode method returns an unsigned integer; I thought UU-Encoding converts Unicode to ASCII, so what should I do with this "uint"? Also, the method "UuEncodeLength" returns the same length, so I don't see any difference between their functions...
-
djfresh wrote:
The method "UuEncodeLength" returns me the same length...
This is not true; look at its code:
INT32U UuEncodeLength(INT32U nSrcLen)
{
INT32U uDestLen;
uDestLen = (nSrcLen * 4) / 3;
if ( ((nSrcLen * 4) % 3) != 0 )
{ // we need a partial byte, add one...
uDestLen++;
}
return uDestLen;
}
Now let's look at what happens:
- if nSrcLen = 1 then the function returns 2;
- if nSrcLen = 2 then the function returns 3;
- if nSrcLen = 3 then the function returns 4;
- if nSrcLen = 4 then the function returns 6;
- and so on...
-
Sorry, my English is not so good... I meant that the methods UuEncodeLength and UuEncode return the same value, i.e. if you enter 2 characters they both return 3. Example: "12" returns 3... but real UU-encoding must return: ,3(@
You have misunderstood the code:
- The UuDecodeLength and UuEncodeLength functions get the length of a string as parameter and return the required length of the corresponding decoded/encoded string.
- The UuDecode and UuEncode functions get three parameters: an output buffer, an input string and the length of the input string. They return the number of bytes written to the output buffer, and put the decoded/encoded version of the input string in the output buffer.
I tried this on my PC:
int main(int argc, char* argv[])
{
    const INT8U* in = (const INT8U*)"12";
    INT8U* out = (INT8U*)malloc(UuEncodeLength(2));
    INT32U n = UuEncode(out, in, 2);  // Put a breakpoint here and look what happened
    // Now n is 3 and inside out you find ",3(" i.e.:
    // out[0] = ','
    // out[1] = '3'
    // out[2] = '('
    free(out);
    return 0;
}
-
May I suggest you actually read the comments above each of those functions. They provide a functional description of what each function does, and they make perfect sense to me. :)
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles] Nil Volentibus Arduum
Please use <PRE> tags for code snippets, they preserve indentation, and improve readability.