Need cut optimization algorithm
-
Hello! I'm making cut optimization software and have a problem with the cut optimization algorithm. I have one big rectangle and x smaller rectangles (which may or may not be the same size). I need a way to calculate the optimum positions of these rectangles so that as many as possible fit into the big one. I also need to draw those positions somehow. Can anybody help? Thanks
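As for drawing the positions: once you have coordinates for each placed rectangle, writing them out as an SVG file is a minimal way to visualize a layout. A small sketch, with illustrative names only, assuming each placement is an (x, y, width, height) tuple:

def write_layout_svg(path, sheet_w, sheet_h, placements):
    """Write the sheet outline and the placed pieces to a simple SVG file.

    placements: list of (x, y, w, h) tuples in sheet coordinates.
    """
    parts = ['<svg xmlns="http://www.w3.org/2000/svg" width="%d" height="%d">'
             % (sheet_w, sheet_h),
             '<rect x="0" y="0" width="%d" height="%d" fill="none" stroke="black"/>'
             % (sheet_w, sheet_h)]
    for (x, y, w, h) in placements:
        parts.append('<rect x="%d" y="%d" width="%d" height="%d" '
                     'fill="lightgray" stroke="black"/>' % (x, y, w, h))
    parts.append('</svg>')
    with open(path, 'w') as f:
        f.write('\n'.join(parts))

# example usage: write_layout_svg("layout.svg", 100, 80, [(0, 0, 30, 20), (30, 0, 30, 20)])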
-
It depends. Do you know more about the possible sizes of the small rectangles? For some combinations of sizes a simple greedy approach would be optimal. Also, are the rectangles allowed to be turned 90 degrees?
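For example, if all the small rectangles happen to be the same size, the count for a single-orientation tiling is a one-line calculation. A minimal sketch with illustrative names; note that mixing orientations in one layout can sometimes fit more pieces, so this is only a lower bound:

def tile_same_size(W, H, w, h):
    """Tile identical w x h pieces into a W x H sheet, trying both orientations.

    Returns (count, piece_w, piece_h) for the better single-orientation tiling.
    """
    normal = (W // w) * (H // h)
    rotated = (W // h) * (H // w)
    if normal >= rotated:
        return normal, w, h
    return rotated, h, w

# example: 100 x 80 sheet, 30 x 20 pieces -> (12, 30, 20), twelve pieces unrotated
print(tile_same_size(100, 80, 30, 20))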
-
Yes, the rectangles are allowed to be turned 90 degrees. I don't know the possible sizes; they can be anything. Is there any way to calculate the coordinates of the rectangles inside the big rectangle? What do you mean by a greedy approach? Is there any kind of ready-made algorithm that could help me?
-
Greedy is when a series of locally optimal choices always leads to a global optimum, which is not the case here. It would have been the case for some sizes of small rectangles. But basically, this sucks. Does it absolutely have to be optimal? Also, are the coordinates not-too-big integers?

If they are, what you could try (but I don't guarantee it's the best way) is a limited form of branch and bound, combined with a heuristic to determine which choice to test first (to test "probable" cases first), and bound the subtree when you can determine that, due to earlier choices, it would be completely impossible to find a solution in that subtree (such as when the biggest open space is not big enough to hold the biggest unplaced small rectangle). It will still suck for anything but tiny instances of the problem. There would be an enormous number of choices at each node: the total number of possible placements of all rectangles that have not yet been placed. The heuristic might be something like "try the biggest rectangle first" combined with a sane way to place it (e.g. near the edge). I don't even see a way to do real branch and bound on this.

It's stupid, though: there are many overlapping subproblems (you could use memoization, but you'd get a huge lookup table; maybe try something like "memoization only for the first k levels of recursion"), and this algorithm would likely take longer than the estimated age of the universe to run on any interesting instance of the problem. If a non-optimal (but "good") result is also OK, you could use basically the same algorithm but with far fewer choices (e.g. use a heuristic to make only the k "probably best" choices at each node); you can still expect major suckage.

I hope this helped anyway, though I don't think that's likely... sorry. Note: I apologize for all errors and falsities in this post; there can't be none, but I don't know what they are yet.
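To make that slightly more concrete, here is a very rough sketch of such a limited search: grid-based, integer coordinates, the "biggest rectangle first" ordering, and only a trivial counting bound instead of a real free-space bound. Every name is illustrative, and as warned above, it is usable only for tiny instances:

def pack_small_instance(W, H, rects):
    """Exhaustive depth-first packing for *tiny* instances (a sketch only).

    rects is a list of (w, h) pairs; pieces may be rotated 90 degrees.
    Returns (best_count, placements), where placements holds
    (rect_index, x, y, w, h) tuples. Runtime explodes very quickly.
    """
    grid = [[False] * W for _ in range(H)]       # True = cell already covered
    # "biggest rectangle first" ordering heuristic
    order = sorted(range(len(rects)),
                   key=lambda i: rects[i][0] * rects[i][1], reverse=True)
    best = {"count": 0, "placements": []}

    def fits(x, y, w, h):
        return all(not grid[yy][xx]
                   for yy in range(y, y + h) for xx in range(x, x + w))

    def mark(x, y, w, h, value):
        for yy in range(y, y + h):
            for xx in range(x, x + w):
                grid[yy][xx] = value

    def dfs(pos, placed):
        if len(placed) > best["count"]:
            best["count"] = len(placed)
            best["placements"] = list(placed)
        if pos == len(order):
            return
        # trivial bound: even placing every remaining piece cannot beat the best
        if len(placed) + (len(order) - pos) <= best["count"]:
            return
        i = order[pos]
        w0, h0 = rects[i]
        for w, h in {(w0, h0), (h0, w0)}:        # normal and rotated by 90 degrees
            for y in range(H - h + 1):
                for x in range(W - w + 1):
                    if fits(x, y, w, h):
                        mark(x, y, w, h, True)
                        placed.append((i, x, y, w, h))
                        dfs(pos + 1, placed)
                        placed.pop()
                        mark(x, y, w, h, False)
        dfs(pos + 1, placed)                     # also try skipping this piece

    dfs(0, [])
    return best["count"], best["placements"]

# toy example: 8 x 8 sheet, three small pieces (all three happen to fit)
print(pack_small_instance(8, 8, [(5, 3), (4, 4), (6, 2)]))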
-
Yeah, you're helping and thanks for that :-D :-D Anyway, I don't think the coordinates will be bigger than 32000 (maybe not bigger than 10000). I've thought of a similar approach with recursion and memoization, but I'm not sure how to do it (I think I'll run into memory problems). And yeah, a non-optimal but "good" result should be OK. I'm not familiar with branch and bound, but I'll research it and try to learn something :)

And one more thought: should I try placing every rectangle two times (normal and rotated 90 degrees) in the big rectangle and use memoization? Then I can determine how many small rectangles there can be for that algorithm to execute in about 10 seconds.

And I also have one more problem: what's the best way to remember which parts of the big rectangle I've already used? I don't know how to do it. Should I try some custom class, or...? Please tell me if you have any ideas :) And one more time, thanks for your big help!
-
xx77abs wrote:
And one more thought: should I try placing every rectangle two times (normal and rotated 90 degrees) in the big rectangle and use memoization? Then I can determine how many small rectangles there can be for that algorithm to execute in about 10 seconds.
I don't really know what you mean by that - could you explain it in more detail?
xx77abs wrote:
And I also have one more problem ... What's the best way to remember what parts of big rectangle I've already used ? I don't know how to do it ... Should I try with some custom class or ?? Please tell me if you have any ideas
I have a couple of ideas, but none of them are really good:
1) directly use the positions of the rectangles to calculate intersections (slow)
2) use a bit array (too much memory)
3) hybrid: store the big rectangles as rectangles, but use bit arrays to store the small rectangles (justification: in general the small rectangles will end up close to each other, since using small rectangles leaves small gaps); very complex though, and it may not be worth the effort
You're welcome, but I have the nagging feeling that there is a better solution; I just don't know what it is.
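To make idea 1) concrete: keeping a plain list of placed rectangles and checking each new candidate against it is only a few lines. A minimal sketch, with illustrative names:

class Placed:
    """A placed piece: top-left corner (x, y) plus width and height."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

def overlaps(a, b):
    """Axis-aligned rectangle intersection test (touching edges is allowed)."""
    return not (a.x + a.w <= b.x or b.x + b.w <= a.x or
                a.y + a.h <= b.y or b.y + b.h <= a.y)

def can_place(candidate, placed_list, sheet_w, sheet_h):
    """Idea 1): an O(n) scan over the already-placed rectangles."""
    if candidate.x < 0 or candidate.y < 0:
        return False
    if candidate.x + candidate.w > sheet_w or candidate.y + candidate.h > sheet_h:
        return False
    return all(not overlaps(candidate, p) for p in placed_list)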
-
harold aptroot wrote:
I don't really know what you mean by that - could you explain it in more detail?
I meant that I'll try to make a recursive function that implements memoization and tries to find the best possible solution. Then I'll run it with various numbers of elements and determine in which cases the recursive function works OK (not too slow). And if that's OK for the person who requested the program from me, problem solved :)
harold aptroot wrote:
I have a couple of ideas, but none of them are really good:
1) directly use the positions of the rectangles to calculate intersections (slow)
2) use a bit array (too much memory)
3) hybrid: store the big rectangles as rectangles, but use bit arrays to store the small rectangles (justification: in general the small rectangles will end up close to each other, since using small rectangles leaves small gaps); very complex though, and it may not be worth the effort
You're welcome, but I have the nagging feeling that there is a better solution; I just don't know what it is.
1. Yes, I think that would be too slow.
2. Yes, it would take too much memory if the big rectangle's maximum size is 32000x32000. But if it is 10000x10000, then memory consumption would be about 100 MB (which I can spare), so this is maybe a good solution (I have to contact the person who requested the program ...).
3. Well, I'm not even sure how to implement this :)
Yeah, I too have a feeling that there's a better solution, maybe an already completed algorithm ...
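For reference, a sketch of idea 2) as a packed bit grid is short; names are illustrative. One bit per unit cell needs about W*H/8 bytes, so roughly 12.5 MB at 10000x10000, while one byte per cell corresponds to the ~100 MB figure above:

class BitGrid:
    """Idea 2): one bit per unit cell of the sheet."""
    def __init__(self, width, height):
        self.width = width
        self.height = height
        self.bits = bytearray((width * height + 7) // 8)

    def _index(self, x, y):
        return y * self.width + x

    def is_free(self, x, y, w, h):
        """Return True if the w x h area at (x, y) is completely unused."""
        for yy in range(y, y + h):
            for xx in range(x, x + w):
                i = self._index(xx, yy)
                if self.bits[i >> 3] & (1 << (i & 7)):
                    return False
        return True

    def occupy(self, x, y, w, h):
        """Mark the w x h area at (x, y) as used."""
        for yy in range(y, y + h):
            for xx in range(x, x + w):
                i = self._index(xx, yy)
                self.bits[i >> 3] |= 1 << (i & 7)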
-
I've worked on a similar problem before, but not exactly like your issue. I would suggest that you google "rectangle packing problem" and read some of the papers in the Google results. Maybe you can modify one of those algorithms to work for your case.
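As a concrete starting point from that literature, one of the simplest heuristics is shelf packing: sort the pieces by height and fill the sheet row by row. A minimal sketch, with illustrative names, no rotation and no backtracking:

def shelf_pack(sheet_w, sheet_h, pieces):
    """Very simple shelf heuristic: sort by height, fill rows left to right.

    pieces: list of (w, h). Returns a list of (index, x, y) for the pieces
    that fit; the rest are left out. A starting point only, not an optimal packer.
    """
    order = sorted(range(len(pieces)), key=lambda i: pieces[i][1], reverse=True)
    placements = []
    shelf_y = 0        # top edge of the current shelf
    shelf_h = 0        # height of the current shelf (tallest piece placed in it)
    cursor_x = 0       # next free x position on the current shelf
    for i in order:
        w, h = pieces[i]
        if w > sheet_w or h > sheet_h:
            continue                      # piece can never fit
        if cursor_x + w > sheet_w:        # start a new shelf
            shelf_y += shelf_h
            shelf_h = 0
            cursor_x = 0
        if shelf_y + h > sheet_h:
            continue                      # no vertical room left for this piece
        placements.append((i, cursor_x, shelf_y))
        cursor_x += w
        shelf_h = max(shelf_h, h)
    return placements

# toy example: 100 x 100 sheet, four pieces of mixed sizes
print(shelf_pack(100, 100, [(40, 30), (60, 30), (50, 20), (70, 25)]))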