Remoting issue
-
Hi there, I have a smart client that uses remoting via an HTTP channel and the binary formatter. It is hosted in IIS. Sending and receiving data works fine, except when a large amount of data is involved. Then I get the following error:
An unhandled exception of type 'System.Runtime.Serialization.SerializationException' occurred in mscorlib.dll
Additional information: BinaryFormatter Version incompatibility. Expected Version 1.0. Received Version 1008738336.1684104552.
In web.config and client configuration file I specified the following:
<channels>
<channel ref="http">
<clientProviders>
<formatter ref="binary" />
</clientProviders>
<serverProviders>
<formatter ref="binary" />
<provider ref="wsdl"/>
</serverProviders>
</channel>
</channels>
Does anyone know why this error occurs and how I can solve it so that I can work with larger amounts of data? Thanks! Ludwig
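For comparison, in an IIS-hosted remoting server that <channels> block normally sits inside a system.runtime.remoting section in web.config, together with the exposed service type. The type name, assembly, and URI below are placeholders, and typeFilterLevel="Full" is the .NET 1.1 setting usually needed to deserialize complex object graphs such as DataSets:

<configuration>
  <system.runtime.remoting>
    <application>
      <service>
        <!-- type and objectUri are placeholders for the actual remoted object -->
        <wellknown mode="Singleton"
                   type="MyServer.RemotingService, MyServer"
                   objectUri="RemotingService.rem" />
      </service>
      <channels>
        <channel ref="http">
          <serverProviders>
            <!-- typeFilterLevel="Full" is required on .NET 1.1 to
                 deserialize complex graphs such as DataSets -->
            <formatter ref="binary" typeFilterLevel="Full" />
          </serverProviders>
        </channel>
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>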
-
I'm not sure how you're transferring the data. If you're sending a huge array of bytes, try sending a reference to a stream instead. Just have your remoted object return a stream object instead of the bytes. Hope this helps -Steve
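A minimal sketch of this idea, assuming .NET Remoting: a MarshalByRefObject stays on the server, so the client pulls the data chunk by chunk through the proxy instead of receiving one huge byte array by value. The class and member names are illustrative only:

// Instead of returning byte[] by value, expose the data through a
// MarshalByRefObject so only one chunk crosses the wire per call.
public class DataReader : MarshalByRefObject
{
    private readonly Stream source;   // e.g. a FileStream on the server

    public DataReader(Stream source) { this.source = source; }

    // The client calls this repeatedly until it gets an empty array.
    public byte[] ReadChunk(int maxBytes)
    {
        byte[] buffer = new byte[maxBytes];
        int read = source.Read(buffer, 0, maxBytes);
        if (read == 0) return new byte[0];   // end of data
        byte[] result = new byte[read];
        Array.Copy(buffer, 0, result, 0, read);
        return result;
    }
}

Note that returning a MemoryStream would not help here: MemoryStream is serializable, so it would still be copied to the client in one piece.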
I'm attaching the dataset to a container object that is sent to the client. At the client it then becomes the data source of a datagrid. I need the data to be complete at the client side, because the data can be grouped, and if not all of it is present, inaccurate results are shown. So I think a reference to the data is not an option here? Anyway, I changed the configuration files. The web.config now looks like this:
<channels>
<channel ref="http">
<serverProviders>
<formatter ref="binary" />
<provider ref="wsdl"/>
</serverProviders>
</channel>
</channels>
and the client configuration file looks like this:
<channels>
<channel ref="http">
<clientProviders>
<formatter ref="binary" />
</clientProviders>
</channel>
</channels>
Now I don't get that error anymore; instead I get an 'out of memory' error! Ludwig
-
Well, I have probably found the reason for this behaviour. In the event log I found this message: "aspnet_wp.exe (PID: 1836) was recycled because memory consumption exceeded the 153 MB (60 percent of available RAM)." The question is... how can I avoid filling my dataset with too much data?
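One way to cap how much ends up in the dataset is the Fill(dataSet, startRecord, maxRecords, srcTable) overload of SqlDataAdapter, letting the client request one page at a time over remoting. The connection string, query, and page size below are placeholders:

// Fill the DataSet one page at a time instead of all at once.
SqlConnection conn = new SqlConnection("...");   // placeholder connection string
SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Orders", conn);

DataSet page = new DataSet();
int pageSize = 500;
int pageIndex = 0;   // supplied by the client

// Skips pageIndex * pageSize rows and loads at most pageSize rows.
adapter.Fill(page, pageIndex * pageSize, pageSize, "Orders");

Be aware that this overload still reads the skipped rows from the database and discards them, so for very large tables a WHERE clause on a key range scales better.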
-
Try using a PagedDataSource. I feel sorry for that web server... :laugh: WebBoxes - Yet another collapsible control, but it relies on a "graphics server" for dynamic pretty rounded corners, cool arrows and unlimited font support.
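For what it's worth, PagedDataSource lives in System.Web.UI.WebControls, so it helps when the binding happens in an ASP.NET page rather than in the smart client itself. A rough sketch, with the table name and grid purely illustrative:

// PagedDataSource wraps an existing data source and exposes only
// one page of it to the bound control.
PagedDataSource paged = new PagedDataSource();
paged.DataSource = myDataSet.Tables["Orders"].DefaultView;  // "Orders" is illustrative
paged.AllowPaging = true;
paged.PageSize = 50;
paged.CurrentPageIndex = 0;   // set from the UI's page selector

myGrid.DataSource = paged;    // e.g. an ASP.NET DataGrid
myGrid.DataBind();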
-
We ran into a similar issue with remoting large datasets, and came to the conclusion that if the dataset was large enough there were either memory leaks or at least garbage that was not reclaimed by the GC in a timely fashion. There is an MSDN Mag article[^] that sheds some additional light on the issues. We ended up serializing the datasets ourselves (basically into a collection of arraylists) and then reconstructing the dataset at the client from our custom object. This resulted in 30% less network bandwidth usage and a 100% speed increase. The fundamental problem seems to be that the contents of the dataset (datatables and datarelations) get serialized to XML even if the containing dataset is using a binary serializer, and in some cases this fails and some memory seems not to get reclaimed. Also, if the datasets are large, they get allocated on the large object heap, which is not GC'd with the same frequency as other garbage (large objects are 'presumed' to have a longer lifetime... :( ). Some ideas are so stupid that only an intellectual could have thought of them - George Orwell
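A rough sketch of the custom-serialization approach described above, assuming .NET 1.x: flatten each DataTable into plain ArrayLists (which the BinaryFormatter writes as true binary, unlike a DataSet, whose contents travel as XML) and rebuild the table on the client. All type and member names here are made up for illustration:

// Flattens a DataTable into ArrayLists so the BinaryFormatter
// serializes it compactly, and rebuilds it on the other side.
[Serializable]
public class TablePayload
{
    public string TableName;
    public ArrayList ColumnNames = new ArrayList();   // string per column
    public ArrayList ColumnTypes = new ArrayList();   // Type per column
    public ArrayList Rows = new ArrayList();          // object[] per row

    public static TablePayload FromTable(DataTable table)
    {
        TablePayload p = new TablePayload();
        p.TableName = table.TableName;
        foreach (DataColumn c in table.Columns)
        {
            p.ColumnNames.Add(c.ColumnName);
            p.ColumnTypes.Add(c.DataType);
        }
        foreach (DataRow r in table.Rows)
            p.Rows.Add(r.ItemArray);                  // one object[] per row
        return p;
    }

    public DataTable ToTable()
    {
        DataTable table = new DataTable(TableName);
        for (int i = 0; i < ColumnNames.Count; i++)
            table.Columns.Add((string)ColumnNames[i], (Type)ColumnTypes[i]);
        foreach (object[] values in Rows)
            table.Rows.Add(values);
        return table;
    }
}

A full version would also need to carry relations and multiple tables, but even this per-table flattening avoids the XML round-trip that makes remoted DataSets so expensive.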