Branch-and-bound (BB) is an algorithm design paradigm used for solving discrete and combinatorial optimization problems by systematically enumerating candidate solutions through state-space search. It treats the solution set as a rooted tree, exploring branches that represent subsets of solutions. By estimating bounds on these branches, the algorithm discards those that cannot yield better results than the current best, improving efficiency over exhaustive search. First proposed in 1960 by Ailsa Land and Alison Doig at the London School of Economics with British Petroleum sponsorship, branch-and-bound is now widely used to solve NP-hard problems such as the traveling salesman problem.
Overview
The goal of a branch-and-bound algorithm is to find a value x that maximizes or minimizes the value of a real-valued function f(x), called an objective function, among some set S of admissible or candidate solutions. The set S is called the search space, or feasible region. The rest of this section assumes that minimization of f(x) is desired; this assumption comes without loss of generality, since one can find the maximum value of f(x) by finding the minimum of g(x) = −f(x). A B&B algorithm operates according to two principles:
- It recursively splits the search space into smaller spaces, then minimizes f(x) on these smaller spaces; the splitting is called branching.
- Branching alone would amount to brute-force enumeration of candidate solutions and testing them all. To improve on the performance of brute-force search, a B&B algorithm keeps track of bounds on the minimum that it is trying to find, and uses these bounds to "prune" the search space, eliminating candidate solutions that it can prove will not contain an optimal solution.
Turning these principles into a concrete algorithm for a specific optimization problem requires some kind of data structure that represents sets of candidate solutions. Such a representation is called an instance of the problem. Denote the set of candidate solutions of an instance I by S_I. The instance representation has to come with three operations (see the illustrative sketch after this list):
- branch(I) produces two or more instances that each represent a subset of S_I. (Typically, the subsets are disjoint to prevent the algorithm from visiting the same candidate solution twice, but this is not required. However, an optimal solution among S_I must be contained in at least one of the subsets.[6])
- bound(I) computes a lower bound on the value of any candidate solution in the space represented by I, that is, bound(I) ≤ f(x) for all x in S_I.
- solution(I) determines whether I represents a single candidate solution. (Optionally, if it does not, then the operation may choose to return some feasible solution from among S_I.[7]) If solution(I) returns a solution, then f(solution(I)) provides an upper bound for the optimal objective value over the whole space of feasible solutions.
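These three operations might be expressed as an abstract interface. The sketch below is purely illustrative, not a canonical API: the type names Instance and Candidate, the use of std::optional, and the ownership convention are all assumptions introduced here.

```cpp
#include <memory>
#include <optional>
#include <vector>

// Illustrative interface for a problem instance I representing a set S_I
// of candidate solutions (a sketch, not part of any standard formulation).
template <typename Candidate>
class Instance {
public:
    virtual ~Instance() = default;

    // branch(I): two or more instances whose candidate sets cover S_I,
    // so that at least one of them still contains an optimal solution.
    virtual std::vector<std::unique_ptr<Instance>> branch() const = 0;

    // bound(I): a lower bound on f(x) for every x in S_I.
    virtual double bound() const = 0;

    // solution(I): a single candidate if this instance represents one
    // (optionally, any feasible member of S_I); empty otherwise.
    virtual std::optional<Candidate> solution() const = 0;
};
```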
Using these operations, a B&B algorithm performs a top-down recursive search through the tree of instances formed by the branch operation. Upon visiting an instance I, it checks whether bound(I) is equal to or greater than the current upper bound; if so, I may be safely discarded from the search and the recursion stops. This pruning step is usually implemented by maintaining a global variable that records the minimum upper bound seen among all instances examined so far.
Generic version
The following is the skeleton of a generic branch-and-bound algorithm for minimizing an arbitrary objective function f.[8] To obtain an actual algorithm from this, one requires a bounding function bound that computes lower bounds of f on nodes of the search tree, as well as a problem-specific branching rule. As such, the generic algorithm presented here is a higher-order function.
1. Using a heuristic, find a solution x_h to the optimization problem. Store its value, B = f(x_h). (If no heuristic is available, set B to infinity.) B will denote the best objective value found so far, and will be used as an upper bound on candidate solutions.
2. Initialize a queue to hold a partial solution with none of the variables of the problem assigned.
3. Loop until the queue is empty:
   3.1. Take a node N off the queue.
   3.2. If N represents a single candidate solution x and f(x) < B, then x is the best solution so far. Record it and set B ← f(x).
   3.3. Else, branch on N to produce new nodes N_i. For each of these:
      3.3.1. If bound(N_i) > B, do nothing; since the lower bound on this node is greater than the upper bound of the problem, it will never lead to the optimal solution, and can be discarded.
      3.3.2. Else, store N_i on the queue.
Several different queue data structures can be used. A FIFO queue yields a breadth-first search, while a stack (LIFO queue) yields a depth-first algorithm. A best-first branch-and-bound algorithm can be obtained by using a priority queue that sorts nodes on their lower bounds.[9]
Examples of best-first search algorithms with this premise are Dijkstra's algorithm and its descendant A* search. The depth-first variant is recommended when no good heuristic is available for producing an initial solution, because it quickly produces full solutions, and therefore upper bounds.[10]
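To make the effect of the queue choice concrete, the sketch below (using a hypothetical Node type that carries only its lower bound) shows the three standard-library containers that would realize each strategy; only the container changes, the rest of the algorithm stays the same.

```cpp
#include <queue>
#include <stack>
#include <vector>

// Hypothetical node type: only the lower bound matters for ordering here.
struct Node {
    double lower_bound;
    // ... problem-specific state ...
};

// Comparator so that std::priority_queue (a max-heap by default) pops the
// node with the *smallest* lower bound first.
struct ByLowerBound {
    bool operator()(const Node& a, const Node& b) const {
        return a.lower_bound > b.lower_bound;
    }
};

int main() {
    std::queue<Node> fifo;        // FIFO queue  -> breadth-first search
    std::stack<Node> lifo;        // LIFO queue  -> depth-first search
    std::priority_queue<Node, std::vector<Node>, ByLowerBound>
        best_first;               // ordered by lower bound -> best-first search

    // The surrounding branch-and-bound loop is unchanged; only the container
    // (and hence the order in which nodes are expanded) differs.
    fifo.push({1.0});
    lifo.push({1.0});
    best_first.push({1.0});
}
```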
Pseudocode
A C++-like pseudocode implementation of the above is:
```cpp
// C++-like implementation of branch and bound,
// assuming the objective function f is to be minimized
CombinatorialSolution branch_and_bound_solve(
    CombinatorialProblem problem,
    ObjectiveFunction objective_function /*f*/,
    BoundingFunction lower_bound_function /*bound*/)
{
    // Step 1 above
    double problem_upper_bound = std::numeric_limits<double>::infinity(); // = B
    CombinatorialSolution heuristic_solution = heuristic_solve(problem);  // x_h
    problem_upper_bound = objective_function(heuristic_solution);         // B = f(x_h)
    CombinatorialSolution current_optimum = heuristic_solution;
    // Step 2 above
    queue<CandidateSolutionTree> candidate_queue;
    // problem-specific queue initialization
    candidate_queue = populate_candidates(problem);
    while (!candidate_queue.empty()) { // Step 3 above
        // Step 3.1
        CandidateSolutionTree node = candidate_queue.pop();
        // "node" represents N above
        if (node.represents_single_candidate()) { // Step 3.2
            if (objective_function(node.candidate()) < problem_upper_bound) {
                current_optimum = node.candidate();
                problem_upper_bound = objective_function(current_optimum);
            }
            // else, node is a single candidate which is not optimum
        }
        else { // Step 3.3: node represents a branch of candidate solutions
            // "child_branch" represents N_i above
            for (auto&& child_branch : node.candidate_nodes) {
                if (lower_bound_function(child_branch) <= problem_upper_bound) {
                    candidate_queue.enqueue(child_branch); // Step 3.3.2
                }
                // otherwise, bound(N_i) > B so we prune the branch; Step 3.3.1
            }
        }
    }
    return current_optimum;
}
```

In the above pseudocode, the functions heuristic_solve and populate_candidates called as subroutines must be provided as applicable to the problem. The functions f (objective_function) and bound (lower_bound_function) are treated as function objects as written, and could correspond to lambda expressions, function pointers, and other types of callable objects in the C++ programming language.
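As a hypothetical usage sketch, the two callables might be supplied as lambdas. Every helper named below (load_problem, total_cost, relaxation_cost) is an assumption introduced only for illustration, and the lambdas are assumed to be convertible to the ObjectiveFunction and BoundingFunction types of the pseudocode.

```cpp
// Hypothetical call site (a sketch only).
CombinatorialProblem problem = load_problem();   // assumed problem-specific helper

auto objective_function = [](const CombinatorialSolution& x) -> double {
    return total_cost(x);          // f(x): cost of a complete candidate solution
};
auto lower_bound_function = [](const CandidateSolutionTree& node) -> double {
    return relaxation_cost(node);  // bound(N): e.g. the value of a relaxation
};

CombinatorialSolution best =
    branch_and_bound_solve(problem, objective_function, lower_bound_function);
```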
Improvements
When $\mathbf{x}$ is a vector of $\mathbb{R}^{n}$, branch-and-bound algorithms can be combined with interval analysis[11] and contractor techniques to provide guaranteed enclosures of the global minimum.[12][13]
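As a minimal one-dimensional sketch of that idea (the function $f(x) = (x-2)^2 + 1$, the search interval, and the tolerance are all chosen here purely for illustration, and contractor techniques are omitted), interval bounds can drive the branching and pruning as follows:

```cpp
#include <algorithm>
#include <deque>
#include <iostream>

struct Interval { double lo, hi; };

// Interval enclosure of f(x) = (x - 2)^2 + 1 over X: guaranteed bounds on f
// for every x in X, computed with elementary interval arithmetic.
Interval enclose_f(Interval X) {
    double a = X.lo - 2.0, b = X.hi - 2.0;                           // t = x - 2
    double lo = (a <= 0.0 && b >= 0.0) ? 0.0 : std::min(a * a, b * b);
    double hi = std::max(a * a, b * b);                              // t^2
    return {lo + 1.0, hi + 1.0};                                     // t^2 + 1
}

double f(double x) { return (x - 2.0) * (x - 2.0) + 1.0; }

int main() {
    std::deque<Interval> boxes = {{-10.0, 10.0}};  // search space (root box)
    double incumbent = f(-10.0);                   // upper bound from a sample point
    double best_x = -10.0;
    while (!boxes.empty()) {
        Interval X = boxes.front();
        boxes.pop_front();
        Interval fX = enclose_f(X);
        if (fX.lo > incumbent) continue;           // bound: prune this box
        double mid = 0.5 * (X.lo + X.hi);
        if (f(mid) < incumbent) {                  // improve the incumbent
            incumbent = f(mid);
            best_x = mid;
        }
        if (X.hi - X.lo > 1e-6) {                  // branch: bisect the box
            boxes.push_back({X.lo, mid});
            boxes.push_back({mid, X.hi});
        }
    }
    std::cout << "minimum ~ " << incumbent << " at x ~ " << best_x << "\n";
}
```

The global minimum of this test function is 1 at x = 2; the sketch converges toward it by repeatedly bisecting boxes and discarding any box whose interval lower bound already exceeds the incumbent.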
Applications
This approach is used for a number of NP-hard problems:
- Integer programming
- Nonlinear programming
- Travelling salesman problem (TSP)[14][15]
- Quadratic assignment problem (QAP)
- Maximum satisfiability problem (MAX-SAT)
- Nearest neighbor search[16] (by Keinosuke Fukunaga)
- Flow shop scheduling
- Cutting stock problem
- Computational phylogenetics
- Set inversion
- Parameter estimation
- 0/1 knapsack problem
- Set cover problem
- Feature selection in machine learning[17][18]
- Structured prediction in computer vision[19]: 267–276
- Arc routing problem, including the Chinese Postman problem
- Talent scheduling, the scene-shooting arrangement problem
Branch-and-bound may also be a base of various heuristics. For example, one may wish to stop branching when the gap between the upper and lower bounds becomes smaller than a certain threshold. This is used when the solution is "good enough for practical purposes" and can greatly reduce the computations required. This type of solution is particularly applicable when the cost function used is noisy or is the result of statistical estimates and so is not known precisely but rather only known to lie within a range of values with a specific probability.
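In the minimization setting above, such a tolerance test might look like the following sketch (the function name and the relative-tolerance formulation are assumptions; the test would be applied to a node's lower bound before branching on it):

```cpp
#include <cmath>

// Sketch: a node whose lower bound is already within a relative tolerance of
// the incumbent B need not be branched further, since its subtree cannot
// improve on B by more than that tolerance.
bool good_enough(double node_lower_bound, double incumbent_B, double tolerance) {
    return incumbent_B - node_lower_bound <= tolerance * std::abs(incumbent_B);
}
```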
Relation to other algorithms
Nau et al. present a generalization of branch and bound that also subsumes the A*, B* and alpha-beta search algorithms.[20]
Optimization example
Branch-and-bound can be used to maximize $Z = 5x_1 + 6x_2$ with the constraints
$x_1 + x_2 \leq 50$
$4x_1 + 7x_2 \leq 280$
$x_1, x_2 \geq 0$
$x_1$ and $x_2$ are integers.
The first step is to relax the integer constraint. The boundary of the first constraint is the line through the extreme points $(x_1, x_2) = (50, 0)$ and $(0, 50)$; the boundary of the second constraint is the line through $(0, 40)$ and $(70, 0)$.
Together with the origin $(0, 0)$, these lines bound a convex feasible region, so the optimum of the relaxation lies at one of its vertices. Row reduction of the two boundary equations gives their intersection at $(x_1, x_2) = (70/3, 80/3)$, with objective value $Z = 276\tfrac{2}{3}$. Sweeping the objective line over the region and testing the other vertices confirms that this is the maximum over the reals.
Both coordinates of the relaxed optimum are fractional, so we choose the variable with the larger fractional part, in this case $x_2$, as the branching variable. Branching to $x_2 \leq 26$ gives $Z = 276$ at $(24, 26)$. This is an integer solution, so we record it and move to the other branch, $x_2 \geq 27$, which gives $Z = 275.75$ at $(22.75, 27)$. Since $x_1$ is fractional there, we branch to $x_1 \leq 22$ and obtain $Z \approx 274.571$ at $(22, 27.4286)$; the other branch, $x_1 \geq 23$, has no feasible solutions. Both outcomes are worse than the integer solution already found, so the maximum is $Z = 276$ with $x_1 = 24$ and $x_2 = 26$.
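Because the feasible set is small, the result is easy to check independently; the short program below (a brute-force sanity check of the worked example, not part of the branch-and-bound procedure) enumerates every feasible integer point:

```cpp
#include <iostream>

int main() {
    int best_z = -1, best_x1 = 0, best_x2 = 0;
    // Enumerate all integer points satisfying the two constraints.
    for (int x1 = 0; x1 <= 50; ++x1) {
        for (int x2 = 0; x2 <= 50; ++x2) {
            if (x1 + x2 <= 50 && 4 * x1 + 7 * x2 <= 280) {
                int z = 5 * x1 + 6 * x2;
                if (z > best_z) { best_z = z; best_x1 = x1; best_x2 = x2; }
            }
        }
    }
    // Prints: Z = 276 at x1 = 24, x2 = 26
    std::cout << "Z = " << best_z << " at x1 = " << best_x1
              << ", x2 = " << best_x2 << "\n";
}
```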
See also
- Backtracking
- Branch-and-cut, a hybrid between branch-and-bound and the cutting plane methods that is used extensively for solving integer linear programs.
- Evolutionary algorithm
- Alpha–beta pruning
External links
- LiPS – Free easy-to-use GUI program intended for solving linear, integer and goal programming problems.
- Cbc – (Coin-or branch and cut) is an open-source mixed integer programming solver written in C++.
References
1. A. H. Land and A. G. Doig (1960). "An automatic method of solving discrete programming problems". Econometrica. 28 (3): 497–520. doi:10.2307/1910129. JSTOR 1910129.
2. "Staff News". www.lse.ac.uk. Archived from the original on 2021-02-24. Retrieved 2018-10-08. https://web.archive.org/web/20210224173541/https://www.lse.ac.uk/newsletters/pressAndInformation/staffNews/2010/20100218.htm
3. Clausen, Jens (1999). Branch and Bound Algorithms—Principles and Examples (PDF) (Technical report). University of Copenhagen. Archived from the original (PDF) on 2015-09-23. Retrieved 2014-08-13. https://web.archive.org/web/20150923214803/http://www.diku.dk/OLD/undervisning/2003e/datV-optimer/JensClausenNoter.pdf
4. Little, John D. C.; Murty, Katta G.; Sweeney, Dura W.; Karel, Caroline (1963). "An algorithm for the traveling salesman problem" (PDF). Operations Research. 11 (6): 972–989. doi:10.1287/opre.11.6.972. hdl:1721.1/46828. http://dspace.mit.edu/bitstream/handle/1721.1/46828/algorithmfortrav00litt.pdf
5. Balas, Egon; Toth, Paolo (1983). Branch and bound methods for the traveling salesman problem (PDF) (Report). Carnegie Mellon University Graduate School of Industrial Administration. Archived (PDF) from the original on October 20, 2012. http://apps.dtic.mil/dtic/tr/fulltext/u2/a126957.pdf
6. Bader, David A.; Hart, William E.; Phillips, Cynthia A. (2004). "Parallel Algorithm Design for Branch and Bound" (PDF). In Greenberg, H. J. (ed.). Tutorials on Emerging Methodologies and Applications in Operations Research. Kluwer Academic Press. Archived from the original (PDF) on 2017-08-13. Retrieved 2015-09-16.
7. Bader, David A.; Hart, William E.; Phillips, Cynthia A. (2004). "Parallel Algorithm Design for Branch and Bound" (PDF). In Greenberg, H. J. (ed.). Tutorials on Emerging Methodologies and Applications in Operations Research. Kluwer Academic Press. Archived from the original (PDF) on 2017-08-13. Retrieved 2015-09-16.
8. Clausen, Jens (1999). Branch and Bound Algorithms—Principles and Examples (PDF) (Technical report). University of Copenhagen. Archived from the original (PDF) on 2015-09-23. Retrieved 2014-08-13. https://web.archive.org/web/20150923214803/http://www.diku.dk/OLD/undervisning/2003e/datV-optimer/JensClausenNoter.pdf
9. Clausen, Jens (1999). Branch and Bound Algorithms—Principles and Examples (PDF) (Technical report). University of Copenhagen. Archived from the original (PDF) on 2015-09-23. Retrieved 2014-08-13. https://web.archive.org/web/20150923214803/http://www.diku.dk/OLD/undervisning/2003e/datV-optimer/JensClausenNoter.pdf
10. Mehlhorn, Kurt; Sanders, Peter (2008). Algorithms and Data Structures: The Basic Toolbox (PDF). Springer. p. 249.
11. Moore, R. E. (1966). Interval Analysis. Englewood Cliffs, New Jersey: Prentice-Hall. ISBN 0-13-476853-1.
12. Jaulin, L.; Kieffer, M.; Didrit, O.; Walter, E. (2001). Applied Interval Analysis. Berlin: Springer. ISBN 1-85233-219-0.
13. Hansen, E.R. (1992). Global Optimization using Interval Analysis. New York: Marcel Dekker.
14. Little, John D. C.; Murty, Katta G.; Sweeney, Dura W.; Karel, Caroline (1963). "An algorithm for the traveling salesman problem" (PDF). Operations Research. 11 (6): 972–989. doi:10.1287/opre.11.6.972. hdl:1721.1/46828. http://dspace.mit.edu/bitstream/handle/1721.1/46828/algorithmfortrav00litt.pdf
15. Conway, Richard Walter; Maxwell, William L.; Miller, Louis W. (2003). Theory of Scheduling. Courier Dover Publications. pp. 56–61. ISBN 978-0-486-42817-8.
16. Fukunaga, Keinosuke; Narendra, Patrenahalli M. (1975). "A branch and bound algorithm for computing k-nearest neighbors". IEEE Transactions on Computers (7): 750–753. doi:10.1109/t-c.1975.224297. S2CID 5941649.
17. Narendra, Patrenahalli M.; Fukunaga, K. (1977). "A branch and bound algorithm for feature subset selection" (PDF). IEEE Transactions on Computers. C-26 (9): 917–922. doi:10.1109/TC.1977.1674939. S2CID 26204315. http://www.computer.org/csdl/trans/tc/1977/09/01674939.pdf
18. Hazimeh, Hussein; Mazumder, Rahul; Saab, Ali (2020). "Sparse Regression at Scale: Branch-and-Bound rooted in First-Order Optimization". arXiv:2004.06152 [stat.CO].
19. Nowozin, Sebastian; Lampert, Christoph H. (2011). "Structured Learning and Prediction in Computer Vision". Foundations and Trends in Computer Graphics and Vision. 6 (3–4): 185–365. CiteSeerX 10.1.1.636.2651. doi:10.1561/0600000033. ISBN 978-1-60198-457-9.
20. Nau, Dana S.; Kumar, Vipin; Kanal, Laveen (1984). "General branch and bound, and its relation to A∗ and AO∗" (PDF). Artificial Intelligence. 23 (1): 29–58. doi:10.1016/0004-3702(84)90004-3. https://www.cs.umd.edu/~nau/papers/nau1984general.pdf