What path does it come up with if you connect time with movement? For example: moving straight is 1 second, but turning and moving straight takes 2 seconds?
Good idea, might look into that in the coming weeks
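A minimal sketch (hypothetical, not from the article) of how such a turn penalty could be folded into the step cost on a grid, using the 1 s straight / 2 s turn-then-straight numbers from the question:

    def step_cost(prev_dir, new_dir):
        """Cost of one grid move: 1 s going straight, 2 s if it involves a turn."""
        if prev_dir is None or prev_dir == new_dir:
            return 1.0                      # first move, or keep heading straight
        return 2.0                          # turn, then move

    def path_time(cells):
        """Total time of a path given as a list of (x, y) grid cells."""
        total, prev_dir = 0.0, None
        for (x0, y0), (x1, y1) in zip(cells, cells[1:]):
            new_dir = (x1 - x0, y1 - y0)
            total += step_cost(prev_dir, new_dir)
            prev_dir = new_dir
        return total

    straight = [(0, 0), (1, 0), (2, 0), (3, 0)]          # three straight moves
    zigzag = [(0, 0), (1, 0), (1, 1), (2, 1)]            # two turns
    print(path_time(straight), path_time(zigzag))        # 3.0 5.0

With a cost like this, a shortest-path or tour search would naturally start preferring long straight runs over zig-zags.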
I was also thinking about the diagonal movements. The hypotenuse is longer than each side, though shorter than travelling the two sides. Do you count each diagonal movement as the square root of 2, rather than 1?
Yeah, good observation! The diagonal movements account for square root of 2 distance, so about 1.414
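As a tiny illustration (again hypothetical, not the author's code), the move-cost rule being described: axis-aligned steps cost 1, diagonal steps cost the square root of 2:

    import math

    def move_cost(dx, dy):
        """Cost of one grid step: 1 for horizontal/vertical, sqrt(2) for diagonal."""
        if abs(dx) == 1 and abs(dy) == 1:
            return math.sqrt(2)             # diagonal step, about 1.414
        return 1.0                          # straight step

    print(move_cost(1, 0), move_cost(1, 1))  # 1.0 1.4142135623730951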
What an incredibly useful exercise! You learned more in that optimization effort than 90% of the engineers I've ever worked with, not to mention the general public.
And the real-world applications are endless--not just in technical fields. The great flaw in so much political thinking is the idea that "we can fix this problem" without considering if it's really the right problem to fix, balanced against all the competing problems. Even the USSR optimized their systems for the outcome they got.
Thank you for this insight and thanks for reading!
Maybe you should inform Albert Heijn how to design the shop to optimize for cleaning?
Thank you for sharing your project, process, and especially conclusions. Extrapolating your findings beyond Albert Heijn makes your work relevant to all of us.
And thank you for reading!
The moment you said path A is the robotic version, without even reading the rest, I could see something easily missed in a supermarket: turns are a cost. B looks the way it does, preferring straight lines, for a reason. It's the same as riding a bicycle, where so many accidents are caused by a reluctance to slow down because speeding up again is a cost. I'm not even sure you took this into account. Add directionality to it and prefer to keep going in the same direction by default until you can't. I'm afraid your mission is still far from complete. No wait, you got there in the end; it just took a few turns to do so.
Haha yeah I totally agree, there are unlimited ways to extend this to make it even more realistic and useful.
I realize this is basically the opposite of the point, but you likely can solve this instance exactly.
First, I'd point out that in a single non-branching corridor there is never any reason to do anything other than one of two policies: either cover it in one pass by zig-zagging, or in two passes by going down and back. So you can reduce the size of the problem significantly by replacing the nodes outside of intersections with weighted edges.
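A rough sketch of that reduction, assuming the store map is an undirected graph with per-edge lengths (the data and names here are made up): repeatedly splice out degree-2 nodes so that only intersections and dead ends remain, carrying the corridor length on the new edge.

    def contract_corridors(adj):
        """adj: dict node -> dict neighbour -> corridor length (undirected)."""
        g = {u: dict(nbrs) for u, nbrs in adj.items()}
        changed = True
        while changed:
            changed = False
            for v in list(g):
                if len(g[v]) != 2:
                    continue                      # keep intersections and dead ends
                (a, wa), (b, wb) = g[v].items()
                if a == b or b in g[a]:
                    continue                      # keep parallel corridors as separate edges
                g[a][b] = g[b][a] = wa + wb       # splice v out of the corridor
                del g[a][v], g[b][v], g[v]
                changed = True
        return g

    # Corridor A-c1-c2-B between two intersections collapses into one A-B edge of length 6.
    adj = {"A": {"c1": 2.0, "X": 1.0, "Z": 1.0}, "c1": {"A": 2.0, "c2": 2.0},
           "c2": {"c1": 2.0, "B": 2.0}, "B": {"c2": 2.0, "Y": 1.0, "W": 1.0},
           "X": {"A": 1.0}, "Z": {"A": 1.0}, "Y": {"B": 1.0}, "W": {"B": 1.0}}
    print(contract_corridors(adj))

The zig-zag versus down-and-back choice per corridor would then just change the weight carried by that edge.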
After that, the problem is likely small enough to solve exactly with an industrial-strength solver. (In fact, it might even be small enough to start with, but you'd have to check.)
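For the solver step, a hedged sketch of the classic MTZ integer-programming formulation of a tour over the reduced intersections; gurobipy is used purely as an example API (any MIP solver would do), the cost matrix is made up, and the real objective would of course also have to encode the per-corridor pass costs.

    import gurobipy as gp
    from gurobipy import GRB

    # Hypothetical cost matrix over the reduced graph: dist[i][j] = travel cost
    # from intersection i to intersection j (e.g. precomputed with Dijkstra).
    dist = [[0, 4, 6, 3],
            [4, 0, 2, 5],
            [6, 2, 0, 4],
            [3, 5, 4, 0]]
    n = len(dist)

    m = gp.Model("store_route")
    # x[i, j] = 1 if the route goes directly from node i to node j
    x = m.addVars([(i, j) for i in range(n) for j in range(n) if i != j],
                  vtype=GRB.BINARY, name="x")
    # MTZ ordering variables rule out disconnected subtours (node 0 is the start)
    u = m.addVars(range(1, n), lb=1, ub=n - 1, name="u")

    m.setObjective(gp.quicksum(dist[i][j] * x[i, j] for i, j in x), GRB.MINIMIZE)
    m.addConstrs(x.sum(i, "*") == 1 for i in range(n))   # leave every node once
    m.addConstrs(x.sum("*", j) == 1 for j in range(n))   # enter every node once
    m.addConstrs(u[i] - u[j] + (n - 1) * x[i, j] <= n - 2
                 for i in range(1, n) for j in range(1, n) if i != j)

    m.optimize()
    print([(i, j) for (i, j) in x if x[i, j].X > 0.5], m.ObjVal)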
Can you solve this exactly in polynomial time? You win medals for proving P==NP…
You definitely can't solve the general problem in polynomial time, but that's why I said "this instance". The complexity of a problem refers to how the cost scales as the input size grows; it doesn't make sense to ask whether a specific problem instance is in P or not (indeed, you can solve any given instance in constant time by just hard-coding the solution into your program).
What I meant was that empirically speaking, even fairly large instances of many NP-hard problems can be solved quickly using something like z3 or Gurobi.
Aye, I suppose those approaches might find optimal solutions in this case. The real trick is knowing in advance which metaheuristic suits which case; if you knew that, you’d end up with a gaggle of algorithms to implement. For this author’s application, a general-purpose metaheuristic like SA or tabu search was just the ticket. With the right local operators, and time spent tuning parameters, even SA can find optimal solutions.
(Parallel and population-based generalised approaches have a good record, with caveats)
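For readers wondering what "SA with the right local operators" looks like in practice, here is a bare-bones sketch (not the author's implementation) of simulated annealing over visit orders with a 2-opt-style segment reversal; the cost matrix and cooling schedule are placeholders that would need tuning.

    import math
    import random

    def route_length(order, dist):
        """Total cost of visiting nodes in the given order (open path)."""
        return sum(dist[a][b] for a, b in zip(order, order[1:]))

    def simulated_annealing(dist, iters=50_000, t_start=10.0, t_end=0.01, seed=0):
        rng = random.Random(seed)
        n = len(dist)
        order = list(range(n))
        rng.shuffle(order)
        cost = route_length(order, dist)
        best, best_cost = order[:], cost
        for k in range(iters):
            t = t_start * (t_end / t_start) ** (k / iters)           # geometric cooling
            i, j = sorted(rng.sample(range(n), 2))
            cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # 2-opt reversal
            c = route_length(cand, dist)
            # always accept improvements; accept worse moves with temperature-dependent probability
            if c < cost or rng.random() < math.exp((cost - c) / t):
                order, cost = cand, c
                if c < best_cost:
                    best, best_cost = cand[:], c
        return best, best_cost

    dist = [[0, 4, 6, 3], [4, 0, 2, 5], [6, 2, 0, 4], [3, 5, 4, 0]]
    print(simulated_annealing(dist))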
I never really considered using an industrial solver for it, because I immediately assumed this instance would be way too big (roughly 300 nodes).
A path is basically a permutation of all the nodes, so that would mean roughly 300! (about 10^614) different paths. I know those solvers use a lot of smart optimizations, but I doubt they would be able to prune enough to make this a feasible method.
I haven’t looked into this, however, so I could be completely wrong; don’t take this too seriously. Would love to hear more about it.
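(For what it's worth, the order of magnitude of 300! is quick to check with the log-gamma function:)

    import math

    # log10(300!) via lgamma: lgamma(n + 1) = ln(n!)
    print(math.lgamma(300 + 1) / math.log(10))   # ~614.5, i.e. 300! is roughly 10^614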
I'm not going to say anything about writing a program for that. Not only have I written similar programs, I'm the guy who, when asked to collect and stack sortation trays in a warehouse, started ordering them using the Fibonacci sequence (one gray tray, one blue tray, two gray, three blue, five gray, eight blue...) out of boredom.
Haha, that is funny, definitely a real programmer thing to do
Perfection--or in this case, optimization--like beauty, is in the eye of the beholder.
Very interesting. Have you considered adding another layer of complexity, including different distributions of the most requested products and those that are less in demand?