My kid just crashed a Steam game he's playing
-
He says it crashes a lot, but he hadn't seen this one before. Are they really using signed 16-bit addresses? :-D Image of Steam index out of range error[^]
-
Can't you even obey simple instructions? You are supposed to tell a programmer, not 12,821,673 >32768 of them! :laugh:
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
-
PIEBALDconsult wrote:
Are they really using signed 16-bit addresses? :-D
No. The index buffer defines the triangle faces of a 3D object. For some reason the vertex buffer (which the index buffer points into) may contain no more than about 32,000 vertices. For most uses that may be enough: rendering too many objects with 32,000 vertices each, and a corresponding number of faces, is a slow affair anyway. On the other hand, the small index type keeps the buffers compact, so you can load more 3D objects at the same time. Video memory has always been precious.
The language is JavaScript. that of Mordor, which I will not utter here
This is Javascript. If you put big wheels and a racing stripe on a golf cart, it's still a fucking golf cart.
"I don't know, extraterrestrial?" "You mean like from space?" "No, from Canada." If software development were a circus, we would all be the clowns.
-
Did the steam engine just derail?
-
Vertex count is limited to 32 bits (per draw call / buffer). If you're using 16-bit index buffers then you can only reference up to vertex 65535, but you can still have up to 4294967295 indices in your buffer, though I've never actually tried. If there's a 32768 limit on buffer sizes it's in their game code, it's nothing to do with the graphics card (except having enough video memory to store everything you need).
Anthony Mushrow wrote:
If you're using 16-bit index buffers then you can only reference up to vertex 65535, but you can still have up to 4294967295 indices in your buffer, though I've never actually tried.
I know that, but the error message shown mentions 32768 as the maximum index value, so we must assume they used a signed 16-bit type in the index buffer. And in the end it's irrelevant how many vertices you have in the vertex buffer if you can't access them.
-
As a programmer, I'm very intrigued. So you're saying that 103956 is more than 32768? Very interesting. Never knew that. Thanks. :doh:
Smart K8 wrote:
Very interesting. Never knew that.
It's a trending way to handle error messages. When your system crashes, you pop up random facts so that at least the user is gaining knowledge while using your app. ;)
There are two kinds of people in the world: those who can extrapolate from incomplete data. There are only 10 types of people in the world, those who understand binary and those who don't.
-
-
Pong?
Someone's therapist knows all about you!
-
It is common for indices to be 16-bit. I guess your kid just loaded a complex 3D model containing many vertices.
-
Can't you even obey simple instructions? You are supposed to tell a programmer, not 12,821,673 >32768 of them! :laugh:
He did.
-
The purpose of using indices is to save GPU memory. A vertex contains, at the very least, x, y, z coordinates. It can also carry additional info like an RGBA color or texture coordinates (u and v). If it has just (x, y, z), its size is 3 * sizeof(float). A vertex often appears in more than one nearby triangle, so if possible we want to reference that shared vertex by an index number instead of duplicating the same information. If the indices are 32-bit, we can end up using more memory than we save.
-
The message actually says that there are too many indices for the index buffer, not that a specific index is too high. I've never seen anybody use a signed type in an index buffer since it's just a waste, I'm not even sure if you can. You could use a signed type in your own code, but it'll be interpreted as unsigned on the GPU.
Not to mention awkward. Did you see the video I posted here[^] two days ago? I must have gotten something right. :-)