The following statement in the article that you cited tells me enough to give a reasonable hint.
Quote:
Conversion of TIME or DATETIME values to numeric form (for example, by adding +0) results in a double-precision value with a microseconds part of .000000
The TIMESTAMP, converted to numeric form, is a double-precision floating-point number, which is interpreted as follows.
The fractional part, if nonzero, is the number of microseconds in the time.
The integral part, which is always present, is a Unix time, i.e., seconds since the epoch, 1970-01-01 00:00:00 UTC.
With these two bits of information, you can use the following function, which you can implement as a static method, to convert the Unix timestamp.
public static DateTime UnixTimeStampToDateTime( double unixTimeStamp )
{
    // Unix timestamp is seconds past epoch
    System.DateTime dtDateTime = new DateTime( 1970, 1, 1, 0, 0, 0, 0, System.DateTimeKind.Utc );
    dtDateTime = dtDateTime.AddSeconds( unixTimeStamp ).ToLocalTime();
    return dtDateTime;
}
The above is taken from the accepted answer to "How can I convert a Unix timestamp to DateTime and vice versa?" on Stack Overflow. Though I haven't tested it, I suspect it is at least essentially correct.
Since the input is double-precision, you can amend the function to handle the fractional part, which should be converted to ticks, where one tick is equal to 100 nanoseconds (so one microsecond is ten ticks). The resulting tick count should then be added to the DateTime through its AddTicks method to get the final answer; simply passing the whole double to AddSeconds would not do, because AddSeconds rounds its argument to the nearest millisecond, which would discard the microseconds.
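Here is a minimal, untested sketch of that amendment, assuming the interpretation quoted above; the split into integral and fractional parts is mine and is not part of the cited answer.

public static DateTime UnixTimeStampToDateTime( double unixTimeStamp )
{
    // Split the value so the microseconds survive the conversion;
    // DateTime.AddSeconds( double ) rounds to the nearest millisecond.
    double integralSeconds = Math.Truncate( unixTimeStamp );
    double fractionalSeconds = unixTimeStamp - integralSeconds;

    // One second is 10,000,000 ticks (TimeSpan.TicksPerSecond), so one
    // microsecond is ten ticks.
    long fractionalTicks = ( long ) Math.Round( fractionalSeconds * TimeSpan.TicksPerSecond );

    System.DateTime dtDateTime = new DateTime( 1970, 1, 1, 0, 0, 0, 0, System.DateTimeKind.Utc );
    return dtDateTime.AddSeconds( integralSeconds ).AddTicks( fractionalTicks ).ToLocalTime();
}

For example, a value such as 1700000000.123456 would come back with its microseconds intact. Bear in mind, though, that a double carries only about 15 to 16 significant decimal digits, so a present-day Unix time followed by six digits of microseconds sits right at the edge of what the type can represent exactly.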