
Travelling salesman problem

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage and explores the relationships between these classifications. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm.


In the theory of computational complexity, the travelling salesman problem (TSP) asks the following question: "Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city exactly once and returns to the origin city?" It is an NP-hard problem in combinatorial optimization, important in theoretical computer science and operations research. The travelling purchaser problem,

A formal language, where the members of the language are instances whose output is yes, and the non-members are those instances whose output is no. The objective is to decide, with the aid of an algorithm, whether a given input string is a member of the formal language under consideration. If the algorithm deciding this problem returns the answer yes, the algorithm is said to accept the input string, otherwise it

A similarity measure between DNA fragments. The TSP also appears in astronomy, as astronomers observing many sources want to minimize the time spent moving the telescope between the sources; in such problems, the TSP can be embedded inside an optimal control problem. In many applications, additional constraints such as limited resources or time windows may be imposed. The origins of the travelling salesman problem are unclear. A handbook for travelling salesmen from 1832 mentions

A (very) slightly improved approximation algorithm was developed for the subset of "graphical" TSPs. In 2020 this tiny improvement was extended to the full (metric) TSP. Richard M. Karp showed in 1972 that the Hamiltonian cycle problem was NP-complete, which implies the NP-hardness of TSP. This supplied a mathematical explanation for the apparent computational difficulty of finding optimal tours. Great progress

A RAND staff member. RAND publishes the RAND Journal of Economics, a peer-reviewed journal of economics. Thirty-two recipients of the Nobel Prize, primarily in the fields of economics and physics, have been associated with RAND at some point in their career. RAND was created after individuals in the War Department, the Office of Scientific Research and Development, and industry began to discuss

A Turing machine. Since Turing machines are easy to analyze mathematically, and are believed to be as powerful as any other model of computation, the Turing machine is the most commonly used model in complexity theory. Many types of Turing machines are used to define complexity classes, such as deterministic Turing machines, probabilistic Turing machines, non-deterministic Turing machines, quantum Turing machines, symmetric Turing machines and alternating Turing machines. They are all equally powerful in principle, but when resources (such as time or space) are bounded, some of these may be more powerful than others. A deterministic Turing machine

A big advance in this direction: the Christofides–Serdyukov algorithm yields a solution that, in the worst case, is at most 1.5 times longer than the optimal solution. As the algorithm was simple and quick, many hoped it would give way to a near-optimal solution method. However, this hope for improvement did not immediately materialize, and Christofides–Serdyukov remained the method with the best worst-case scenario until 2011, when

A circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of computational complexity theory is to determine the practical limits on what computers can and cannot do. The P versus NP problem, one of the seven Millennium Prize Problems, is part of the field of computational complexity. Closely related fields in theoretical computer science are analysis of algorithms and computability theory. A key distinction between analysis of algorithms and computational complexity theory

A computational resource. Complexity measures are very generally defined by the Blum complexity axioms. Other complexity measures used in complexity theory include communication complexity, circuit complexity, and decision tree complexity. The complexity of an algorithm is often expressed using big O notation. The best, worst and average case complexity refer to three different ways of measuring

A departure to exactly one other city. The last constraint enforces that there is only a single tour covering all cities, and not two or more disjointed tours that only collectively cover all cities. Label the cities with the numbers 1, ..., n and define: Take c_{ij} > 0 to be the distance from city i to city j. Then TSP can be written as

A dummy variable u_i that keeps track of the order in which the cities are visited, counting from city 1; the interpretation is that u_i < u_j implies city i is visited before city j. For


A general outline for Collbohm's proposed project. Douglas engineer Arthur Emmons Raymond came up with the name Project RAND, from "research and development". Collbohm suggested that he himself should serve as the project's first director, which he thought would be a temporary position while he searched for a permanent replacement for himself. He later became RAND's first president and served in that capacity until his retirement in 1967. On 1 October 1945, Project RAND

A given tour (as encoded into values of the x_{ij} variables), one may find satisfying values for the u_i variables by making u_i equal to the number of edges along that tour, when going from city 1 to city i. Because linear programming favors non-strict inequalities (≥) over strict (>), we would like to impose constraints to
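As a concrete check of this construction, the sketch below (plain Python; function and variable names are ours, not from the article) assigns u_i as the position of each city along a given tour and verifies the standard MTZ inequalities u_i − u_j + n·x_{ij} ≤ n − 1 for all city pairs other than the start city.

```python
def mtz_satisfied(tour):
    """Check the standard MTZ constraints u_i - u_j + n*x_ij <= n - 1
    (for i, j different from the start city), taking u_i to be the
    number of edges from the start city to city i along the tour."""
    n = len(tour)
    pos = {city: k for k, city in enumerate(tour)}   # u_i values
    x = [[0] * n for _ in range(n)]
    for k in range(n):
        x[tour[k]][tour[(k + 1) % n]] = 1            # edges used by the tour
    start = tour[0]
    return all(
        pos[i] - pos[j] + n * x[i][j] <= n - 1
        for i in range(n) for j in range(n)
        if i != j and i != start and j != start
    )
```

Any single closed tour satisfies these inequalities with this choice of u, e.g. `mtz_satisfied([0, 1, 2, 3])` returns `True`.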

A group (fragment) of nearest unvisited cities, can find shorter routes with successive iterations. The NF operator can also be applied to an initial solution obtained by the NN algorithm for further improvement in an elitist model, where only better solutions are accepted. The bitonic tour of a set of points is the minimum-perimeter monotone polygon that has the points as its vertices; it can be computed efficiently with dynamic programming. Another constructive heuristic, Match Twice and Stitch (MTS), performs two sequential matchings, where
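The nearest-neighbour (NN) construction mentioned above can be sketched in a few lines of Python (a minimal illustration assuming a symmetric distance matrix; the function name is ours):

```python
def nearest_neighbour(dist, start=0):
    """Greedy tour construction: always move to the closest unvisited city."""
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist[tour[-1]][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

The result is a valid tour but carries no optimality guarantee; on adversarial instances NN can be arbitrarily worse than optimal.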

A matching for the odd-degree vertices must be added, which increases the degree of every odd-degree vertex by 1. This leaves us with a graph in which every vertex has even degree and which is therefore Eulerian. Adapting the above method gives the algorithm of Christofides and Serdyukov: The pairwise exchange or 2-opt technique involves iteratively removing two edges and replacing them with two different edges that reconnect
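A minimal 2-opt sketch in Python (our own illustrative code, not from the article): reversing a segment of the tour is equivalent to removing two edges and reconnecting the fragments the other way around.

```python
def tour_length(tour, dist):
    """Total length of a closed tour, given a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Repeatedly reverse tour segments while doing so strictly shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                # Reversing tour[i:j] swaps two edges for two others.
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, dist) < tour_length(tour, dist):
                    tour, improved = candidate, True
    return tour
```

On four points at the corners of a unit square, a crossing tour such as `[0, 2, 1, 3]` is repaired to the optimal perimeter tour of length 4.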

A member of this set corresponds to solving the problem of multiplying two numbers. To measure the difficulty of solving a computational problem, one may wish to see how much time the best algorithm requires to solve the problem. However, the running time may, in general, depend on the instance. In particular, larger instances will require more time to solve. Thus the time required to solve a problem (or

A method for finding an Eulerian tour to find a TSP solution. By the triangle inequality, we know that the TSP tour can be no longer than the Eulerian tour, and we therefore have an upper bound for the TSP. Such a method is described below. To improve the upper bound, a better way of creating an Eulerian graph is needed. By the triangle inequality, the best Eulerian graph must have the same cost as

A microchip layout problem, currently the largest solved TSPLIB instance. For many other instances with millions of cities, solutions can be found that are guaranteed to be within 2–3% of an optimal tour. TSP can be modeled as an undirected weighted graph, such that cities are the graph's vertices, paths are the graph's edges, and a path's distance is the edge's weight. It is a minimization problem starting and finishing at

A number of fields and industries. Since the 1950s, RAND research has helped inform United States policy decisions on a wide variety of issues, including the space race, the Vietnam War, the U.S.–Soviet nuclear arms confrontation, the creation of the Great Society social welfare programs, and national health care. The RAND Corporation originated as "Project RAND" (from the phrase "research and development") in

A particular algorithm falls under the field of analysis of algorithms. To show an upper bound T(n) on the time complexity of a problem, one needs to show only that there is a particular algorithm with running time at most T(n). However, proving lower bounds is much more difficult, since lower bounds make

A polynomial factor of O(n!), the factorial of the number of cities, so this solution becomes impractical even for only 20 cities. One of the earliest applications of dynamic programming is the Held–Karp algorithm, which solves the problem in time O(n^2 2^n). This bound has also been reached by inclusion–exclusion in an attempt preceding
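The Held–Karp recurrence can be written compactly in Python. This is a sketch under the usual conventions (tours start and end at city 0; `dist` is a full distance matrix); it uses O(n^2 2^n) time and O(n 2^n) memory.

```python
from itertools import combinations

def held_karp(dist):
    """Held-Karp dynamic program: exact TSP in O(n^2 * 2^n) time."""
    n = len(dist)
    # C[(S, j)] = cost of the cheapest path that starts at city 0, visits
    # every city in frozenset S exactly once, and ends at j (0 not in S).
    C = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                C[(S, j)] = min(C[(S - {j}, k)] + dist[k][j] for k in S - {j})
    full = frozenset(range(1, n))
    return min(C[(full, j)] + dist[j][0] for j in range(1, n))
```

For four cities at the corners of a unit square this returns the optimal tour length 4.0, while brute force would already need 3! = 6 permutations and grows much faster.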


A polynomial-time algorithm. A Turing machine is a mathematical model of a general computing machine. It is a theoretical device that manipulates symbols contained on a strip of tape. Turing machines are not intended as a practical computing technology, but rather as a general model of a computing machine—anything from an advanced supercomputer to a mathematician with a pencil and paper. It is believed that if

A problem being at most as difficult as another problem. For instance, if a problem X can be solved using an algorithm for Y, X is no more difficult than Y, and we say that X reduces to Y. There are many different types of reductions, based on

A problem can be solved by an algorithm, there exists a Turing machine that solves the problem. Indeed, this is the statement of the Church–Turing thesis. Furthermore, it is known that everything that can be computed on other models of computation known to us today, such as a RAM machine, Conway's Game of Life, cellular automata, lambda calculus or any programming language, can be computed on

A problem instance is a string over an alphabet. Usually, the alphabet is taken to be the binary alphabet (i.e., the set {0,1}), and thus the strings are bitstrings. As in a real-world computer, mathematical objects other than bitstrings must be suitably encoded. For example, integers can be represented in binary notation, and graphs can be encoded directly via their adjacency matrices, or by encoding their adjacency lists in binary. Even though some proofs of complexity-theoretic theorems regularly assume some concrete choice of input encoding, one tries to keep

A problem instance, and should not be confused with the problem itself. In computational complexity theory, a problem refers to the abstract question to be solved. In contrast, an instance of this problem is a rather concrete utterance, which can serve as the input for a decision problem. For example, consider the problem of primality testing. The instance is a number (e.g., 15) and the solution

A result from graph theory which helps improve on the lower bound of the TSP which originated from doubling the cost of the minimum spanning tree. Given an Eulerian graph, we can find an Eulerian tour in O(n) time, so if we had an Eulerian graph with cities from a TSP as vertices, then we can easily see that we could use such

A route of at most 2000 kilometres passing through all of Germany's 15 largest cities? The quantitative answer to this particular problem instance is of little use for solving other instances of the problem, such as asking for a round trip through all sites in Milan whose total length is at most 10 km. For this reason, complexity theory addresses computational problems and not particular problem instances. When considering computational problems,

A salesman who starts at a home or office and visits a fixed number of locations before returning to the start. In the following decades, the problem was studied by many researchers from mathematics, computer science, chemistry, physics, and other sciences. In the 1960s, however, a new approach was created that, instead of seeking optimal solutions, would produce a solution whose length

A small fraction of 1%. The TSP has several applications even in its purest formulation, such as planning, logistics, and the manufacture of microchips. Slightly modified, it appears as a sub-problem in many areas, such as DNA sequencing. In these applications, the concept "city" represents, for example, customers, soldering points, or DNA fragments, and the concept "distance" represents travelling times or cost, or

A small number of extra inequalities (cuts). They used this idea to solve their initial 49-city problem using a string model. They found they only needed 26 cuts to come to a solution for their 49-city problem. While this paper did not give an algorithmic approach to TSP problems, the ideas that lay within it were indispensable to later creating exact solution methods for the TSP, though it would take 15 years to find an algorithmic approach to creating these cuts. As well as cutting plane methods, Dantzig, Fulkerson, and Johnson used branch-and-bound algorithms perhaps for


A specified vertex after having visited each other vertex exactly once. Often, the model is a complete graph (i.e., each pair of vertices is connected by an edge). If no path exists between two cities, then adding a sufficiently long edge will complete the graph without affecting the optimal tour. In the symmetric TSP, the distance between two cities is the same in each opposite direction, forming an undirected graph. This symmetry halves

A statement about all possible algorithms that solve a given problem. The phrase "all possible algorithms" includes not just the algorithms known today, but any algorithm that might be discovered in the future. To show a lower bound of T(n) for a problem requires showing that no algorithm can have time complexity lower than T(n). Upper and lower bounds are usually stated using

A subset of the vertices; arguably, it is this global requirement that makes TSP a hard problem. The MTZ and DFJ formulations differ in how they express this final requirement as linear constraints. In addition to the x_{ij} variables as above, there is for each i = 1, ..., n

A subtour of the cities of Q. Because this leads to an exponential number of possible constraints, in practice it is solved with row generation. The traditional lines of attack for the NP-hard problems are the following: The most direct solution would be to try all permutations (ordered combinations) and see which one is cheapest (using brute-force search). The running time for this approach lies within
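As a sketch of the brute-force approach (our own illustrative code), fixing the start city reduces the search to (n−1)! permutations, but the growth is still factorial:

```python
from itertools import permutations

def brute_force_tsp(dist):
    """Try all (n-1)! tours fixed to start at city 0; O(n!) overall."""
    n = len(dist)
    best = float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        best = min(best, length)
    return best
```

Even at n = 20 this is about 1.2 × 10^17 tours, which is why the text calls the approach impractical beyond very small instances.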

Is "yes" if the number is prime and "no" otherwise (in this case, 15 is not prime and the answer is "no"). Stated another way, the instance is a particular input to the problem, and the solution is the output corresponding to the given input. To further highlight the difference between a problem and an instance, consider the following instance of the decision version of the travelling salesman problem: Is there

Is a computational problem where a single output (of a total function) is expected for every input, but the output is more complex than that of a decision problem—that is, the output is not just yes or no. Notable examples include the traveling salesman problem and the integer factorization problem. It is tempting to think that the notion of function problems is much richer than the notion of decision problems. However, this

Is a deterministic Turing machine with an added feature of non-determinism, which allows a Turing machine to have multiple possible future actions from a given state. One way to view non-determinism is that the Turing machine branches into many possible computational paths at each step, and if it solves the problem in any of these branches, it is said to have solved the problem. Clearly, this model

Is a set of problems of related complexity. Simpler complexity classes are defined by the following factors: Some complexity classes have complicated definitions that do not fit into this framework. Thus, a typical complexity class has a definition like the following: But bounding the computation time above by some concrete function f(n) often yields complexity classes that depend on

Is at RAND's Santa Monica research facility. The Pardee RAND Graduate School is the world's largest PhD-granting program in policy analysis. Unlike many other programs, all Pardee RAND Graduate School students receive fellowships to cover their education costs. This allows them to dedicate their time to engaging in research projects and provides them with on-the-job training. RAND also offers a number of internship and fellowship programs allowing students and others to assist in conducting research for RAND projects. Most of these are short-term independent projects mentored by

Is at most f(n). A decision problem A can be solved in time f(n) if there exists a Turing machine operating in time f(n) that solves the problem. Since complexity theory is interested in classifying problems based on their difficulty, one defines sets of problems based on some criteria. For instance,


Is because a polynomial-time solution to Π1 would yield a polynomial-time solution to Π2. Similarly, because all NP problems can be reduced to the set, finding an NP-complete problem that can be solved in polynomial time would mean that P = NP. The complexity class P

Is believed that NP is not equal to co-NP; however, it has not yet been proven. It is clear that if these two complexity classes are not equal, then P is not equal to NP, since P = co-P. Thus if P = NP we would have co-P = co-NP, whence NP = P = co-P = co-NP. Similarly, it

Is defined to be the maximum time taken over all inputs of size n. If T(n) is a polynomial in n, then the algorithm is said to be a polynomial-time algorithm. Cobham's thesis argues that a problem can be solved with a feasible amount of resources if it admits

Is exactly one incoming edge and one outgoing edge, which may be expressed as the 2n linear equations. These ensure that the chosen set of edges locally looks like that of a tour, but still allow for solutions violating the global requirement that there is one tour which visits all vertices, as the edges chosen could make up several tours, each visiting only

Is harder than X, since an algorithm for X allows us to solve any problem in C. The notion of hard problems depends on the type of reduction being used. For complexity classes larger than P, polynomial-time reductions are commonly used. In particular, the set of problems that are hard for NP

Is in NP and in co-NP (and even in UP and co-UP). If the problem is NP-complete, the polynomial time hierarchy will collapse to its first level (i.e., NP will equal co-NP). The best known algorithm for integer factorization

Is located in Cambridge, United Kingdom; Brussels, Belgium; and Rotterdam, Netherlands. RAND Australia is located in Canberra, Australia. RAND is home to the Frederick S. Pardee RAND Graduate School, one of eight original graduate programs in public policy and the first to offer a PhD. The program aims to provide practical experience for students, who work with RAND analysts on addressing real-world problems. The campus

Is needed in order to increase the number of problems that can be solved. More precisely, the time hierarchy theorem states that DTIME(o(f(n))) ⊊ DTIME(f(n) · log(f(n))). The space hierarchy theorem states that DSPACE(o(f(n))) ⊊ DSPACE(f(n)). The time and space hierarchy theorems form

Is not correct. Instead, MTZ use the n(n−1) linear constraints u_i − u_j + n·x_{ij} ≤ n − 1, where the constant term n−1 provides sufficient slack that x_{ij} = 0 does not impose a relation between u_j and u_i. The way that

Is not known if L (the set of all problems that can be solved in logarithmic space) is strictly contained in P or equal to P. Again, there are many complexity classes between the two, such as NL and NC, and it


Is not known if they are distinct or equal classes. It is suspected that P and BPP are equal. However, it is currently open whether BPP = NEXP.

RAND Corporation

The RAND Corporation is an American nonprofit global policy think tank, research institute, and public sector consulting firm. RAND Corporation engages in research and development (R&D) in

Is not meant to be a physically realizable model; it is just a theoretically interesting abstract machine that gives rise to particularly interesting complexity classes. For examples, see non-deterministic algorithm. Many machine models different from the standard multi-tape Turing machines have been proposed in the literature, for example random-access machines. Perhaps surprisingly, each of these models can be converted to another without providing any extra computational power. The time and memory consumption of these alternate models may vary. What all these models have in common

Is not really the case, since function problems can be recast as decision problems. For example, the multiplication of two integers can be expressed as the set of triples (a, b, c) such that the relation a × b = c holds. Deciding whether a given triple is
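A toy illustration of this recasting (our own illustrative code, plain Python): membership in the set of triples is a yes/no question, yet answering it amounts to performing the multiplication itself.

```python
def in_multiplication_language(triple):
    """Decide membership in the language L = { (a, b, c) : a * b = c }."""
    a, b, c = triple
    return a * b == c
```

For example, (6, 7, 42) is in the language while (6, 7, 41) is not, so an algorithm deciding this language is as useful as one computing the product.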

Is often seen as a mathematical abstraction modeling those computational tasks that admit an efficient algorithm. This hypothesis is called the Cobham–Edmonds thesis. The complexity class NP, on the other hand, contains many problems that people would like to solve efficiently, but for which no efficient algorithm is known, such as the Boolean satisfiability problem, the Hamiltonian path problem and

Is possible that P = PSPACE. If P is not equal to NP, then P is not equal to PSPACE either. Since there are many known complexity classes between P and PSPACE, such as RP, BPP, PP, BQP, MA, PH, etc., it

Is possible that all these complexity classes collapse to one class. Proving that any of these classes are unequal would be a major breakthrough in complexity theory. Along the same lines, co-NP is the class containing the complement problems (i.e. problems with the yes/no answers reversed) of NP problems. It

Is provably bounded by a multiple of the optimal length, and in doing so would create lower bounds for the problem; these lower bounds would then be used with branch-and-bound approaches. One method of doing this was to create a minimum spanning tree of the graph and then double all its edges, which produces the bound that the length of an optimal tour is at most twice the weight of a minimum spanning tree. In 1976, Christofides and Serdyukov (independently of each other) made
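The MST-doubling bound underlies the classic "double tree" 2-approximation for metric instances. A rough Python sketch (our own code, not the article's; Prim's algorithm plus a preorder shortcut of the tree, valid when distances satisfy the triangle inequality):

```python
def prim_mst(dist):
    """Prim's algorithm: adjacency lists of a minimum spanning tree."""
    n = len(dist)
    in_tree, adj = {0}, {i: [] for i in range(n)}
    while len(in_tree) < n:
        u, v = min(((u, v) for u in in_tree for v in range(n) if v not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        adj[u].append(v)
        adj[v].append(u)
        in_tree.add(v)
    return adj

def double_tree_tour(dist):
    """Metric-TSP 2-approximation: shortcut a preorder walk of the MST.
    The tour is no longer than twice the MST weight, hence <= 2 * OPT."""
    adj, seen, tour = prim_mst(dist), set(), []
    def walk(u):                      # recursive preorder; fine for small n
        seen.add(u)
        tour.append(u)
        for v in adj[u]:
            if v not in seen:
                walk(v)
    walk(0)
    return tour
```

Skipping already-visited vertices in the doubled-tree walk never lengthens the route, by the triangle inequality, which is exactly why the 2× bound survives the shortcut.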

Is said to reject the input. An example of a decision problem is the following. The input is an arbitrary graph. The problem consists in deciding whether the given graph is connected or not. The formal language associated with this decision problem is then the set of all connected graphs—to obtain a precise definition of this language, one has to decide how graphs are encoded as binary strings. A function problem

Is that the former is devoted to analyzing the amount of resources needed by a particular algorithm to solve a problem, whereas the latter asks a more general question about all possible algorithms that could be used to solve the same problem. More precisely, computational complexity theory tries to classify problems that can or cannot be solved with appropriately restricted resources. In turn, imposing restrictions on

Is that the machines operate deterministically. However, some computational problems are easier to analyze in terms of more unusual resources. For example, a non-deterministic Turing machine is a computational model that is allowed to branch out to check many different possibilities at once. The non-deterministic Turing machine has very little to do with how we physically want to compute algorithms, but its branching exactly captures many of


Is the general number field sieve, which takes time O(exp((64/9)^(1/3) · (log n)^(1/3) · (log log n)^(2/3))) to factor an odd integer n. However,

Is the European arm of the RAND Corporation and, like its main branch, it is a not-for-profit policy research organization dedicated to improving decision-making through evidence-based research and analysis. RAND Europe's stated mission is to improve policy and decision-making through rigorous, independent research. RAND Europe is incorporated in, and has offices in, Cambridge, Rotterdam, and Brussels. The research of RAND stems from its development of systems analysis. Important contributions are claimed in space systems and

Is the class of all decision problems. For the complexity classes defined in this way, it is desirable to prove that relaxing the requirements on (say) computation time indeed defines a bigger set of problems. In particular, although DTIME(n) is contained in DTIME(n^2), it would be interesting to know if

Is the computational problem of determining the prime factorization of a given integer. Phrased as a decision problem, it is the problem of deciding whether the input has a prime factor less than k. No efficient integer factorization algorithm is known, and this fact forms the basis of several modern cryptographic systems, such as the RSA algorithm. The integer factorization problem

Is the most basic Turing machine, which uses a fixed set of rules to determine its future actions. A probabilistic Turing machine is a deterministic Turing machine with an extra supply of random bits. The ability to make probabilistic decisions often helps algorithms solve problems more efficiently. Algorithms that use random bits are called randomized algorithms. A non-deterministic Turing machine

Is the set of NP-hard problems. If a problem X is in C and hard for C, then X is said to be complete for C. This means that X is

Is the total number of state transitions, or steps, the machine makes before it halts and outputs the answer ("yes" or "no"). A Turing machine M is said to operate within time f(n) if the time required by M on each input of length n

Is whether the graph isomorphism problem is in P, NP-complete, or NP-intermediate. The answer is not known, but it is believed that the problem is at least not NP-complete. If graph isomorphism is NP-complete, the polynomial time hierarchy collapses to its second level. Since it is widely believed that

The u_i variables then enforce that a single tour visits all cities is that they increase by at least 1 for each step along a tour, with a decrease only allowed where the tour passes through city 1. That constraint would be violated by every tour which does not pass through city 1, so

The {x_{ij}}_{i,j} will effectively range over all subsets of the set of edges, which is very far from the sets of edges in a tour, and allows for a trivial minimum where all x_{ij} = 0. Therefore, both formulations also have the constraints that, at each vertex, there

The Cold War, RAND researchers contributed to the development of nuclear strategy concepts such as deterrence theory and mutually assured destruction. In recent years, RAND has analyzed military readiness, force modernization, and counterterrorism strategies. For example, one study examined the effectiveness of counterinsurgency operations in Iraq and Afghanistan. RAND designed and conducted one of

The big O notation, which hides constant factors and smaller terms. This makes the bounds independent of the specific details of the computational model used. For instance, if T(n) = 7n^2 + 15n + 40, in big O notation one would write T(n) = O(n^2). A complexity class

the cutting plane method for its solution. They wrote what is considered the seminal paper on the subject, in which, with these new methods, they solved an instance with 49 cities to optimality by constructing a tour and proving that no other tour could be shorter. Dantzig, Fulkerson, and Johnson, however, speculated that, given a near-optimal solution, one may be able to find optimality or prove optimality by adding

the discrete logarithm problem and the integer factorization problem are examples of problems believed to be NP-intermediate. They are some of the very few NP problems not known to be in P or to be NP-complete. The graph isomorphism problem is the computational problem of determining whether two finite graphs are isomorphic. An important unsolved problem in complexity theory

the vehicle routing problem and the ring star problem are three generalizations of TSP. The decision version of the TSP (where, given a length L, the task is to decide whether the graph has a tour whose length is at most L) belongs to the class of NP-complete problems. Thus, it is possible that the worst-case running time for any algorithm for the TSP increases superpolynomially (but no more than exponentially) with

the vertex cover problem. Since deterministic Turing machines are special non-deterministic Turing machines, it is easily observed that each problem in P is also a member of the class NP. The question of whether P equals NP is one of the most important open questions in theoretical computer science because of the wide implications of a solution. If the answer is yes, many important problems can be shown to have more efficient solutions. These include various types of integer programming problems in operations research, many problems in logistics, protein structure prediction in biology, and

the 1930s by Merrill M. Flood, who was looking to solve a school bus routing problem. Hassler Whitney at Princeton University generated interest in the problem, which he called the "48 states problem". The earliest publication using the phrase "travelling [or traveling] salesman problem" was the 1949 RAND Corporation report by Julia Robinson, "On the Hamiltonian game (a traveling salesman problem)". In

the 1930s in Vienna and at Harvard, notably by Karl Menger, who defines the problem, considers the obvious brute-force algorithm, and observes the non-optimality of the nearest neighbour heuristic: We denote by messenger problem (since in practice this question should be solved by each postman, anyway also by many travelers) the task to find, for finitely many points whose pairwise distances are known,

the 1950s and 1960s, the problem became increasingly popular in scientific circles in Europe and the United States after the RAND Corporation in Santa Monica offered prizes for steps in solving the problem. Notable contributions were made by George Dantzig, Delbert Ray Fulkerson, and Selmer M. Johnson from the RAND Corporation, who expressed the problem as an integer linear program and developed

the Miller–Tucker–Zemlin (MTZ) formulation and the Dantzig–Fulkerson–Johnson (DFJ) formulation. The DFJ formulation is stronger, though the MTZ formulation is still useful in certain settings. Common to both these formulations is that one labels the cities with the numbers 1, …, n and takes c_ij > 0 to be

the NN algorithm give the worst route. This is true for both asymmetric and symmetric TSPs. Rosenkrantz et al. showed that the NN algorithm has the approximation factor Θ(log |V|) for instances satisfying the triangle inequality. A variation of the NN algorithm, called the nearest fragment (NF) operator, which connects
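The nearest-neighbour heuristic discussed here is simple enough to sketch in a few lines. The following is a minimal illustration (function and variable names are my own); cities are points in the plane and distances are Euclidean.

```python
import math

# Hedged sketch of the nearest-neighbour (NN) heuristic: from the current
# city, always move to the closest city not yet visited.
def nearest_neighbour_tour(cities, start=0):
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        # pick the unvisited city closest to the last city on the tour
        nxt = min(unvisited, key=lambda j: math.dist(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # returning to `start` implicitly closes the cycle

cities = [(0, 0), (1, 0), (3, 0), (7, 0)]
print(nearest_neighbour_tour(cities))  # [0, 1, 2, 3]
```

As the text notes, this greedy rule is fast but carries no constant-factor guarantee; specially arranged inputs can make it arbitrarily bad relative to the optimum.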

the United States' space program, in computing and in artificial intelligence. RAND researchers developed many of the principles that were used to build the Internet. RAND also contributed to the development and use of wargaming. Current areas of expertise include: child policy, law, civil and criminal justice, education, health (public health and health care), international policy/foreign policy, labor markets, national security, defense policy, infrastructure, energy, environment, business and corporate governance, economic development, intelligence policy, long-range planning, crisis management and emergency management/disaster preparation, population studies, regional studies, comparative studies, science and technology, social policy, welfare, terrorism and counterterrorism, cultural policy, arts policy, and transportation. During

the ability to find formal proofs of pure mathematics theorems. The P versus NP problem is one of the Millennium Prize Problems proposed by the Clay Mathematics Institute. There is a US$1,000,000 prize for resolving the problem. It was shown by Ladner that if P ≠ NP, then there exist problems in NP that are neither in P nor NP-complete. Such problems are called NP-intermediate problems. The graph isomorphism problem,

the absence of evidence is not evidence of absence. Both proponents and opponents of various gun control measures have cited the RAND initiative. Additionally, RAND has researched the opioid epidemic and alcoholism. The RAND analysis of the Intensive Partnerships for Effective Teaching, a $575 million initiative from the Bill & Melinda Gates Foundation to increase teacher effectiveness, found that

the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in

the available resources is what distinguishes computational complexity from computability theory: the latter theory asks what kinds of problems can, in principle, be solved algorithmically. A computational problem can be viewed as an infinite collection of instances together with a set (possibly empty) of solutions for every instance. The input string for a computational problem is referred to as

the basis for most separation results of complexity classes. For instance, the time hierarchy theorem tells us that P is strictly contained in EXPTIME, and the space hierarchy theorem tells us that L is strictly contained in PSPACE. Many complexity classes are defined using the concept of a reduction. A reduction is a transformation of one problem into another problem. It captures the informal notion of

the basis for the complexity class P, which is the set of decision problems solvable by a deterministic Turing machine within polynomial time. The corresponding set of function problems is FP. Many important complexity classes can be defined by bounding the time or space used by the algorithm. Some important complexity classes of decision problems defined in this manner are the following: Logarithmic-space classes do not account for

the best known quantum algorithm for this problem, Shor's algorithm, does run in polynomial time. Unfortunately, this fact does not say much about where the problem lies with respect to non-quantum complexity classes. Many known complexity classes are suspected to be unequal, but this has not been proved. For instance P ⊆ NP ⊆ PP ⊆ PSPACE, but it

the best travelling salesman tour; hence, finding optimal Eulerian graphs is at least as hard as TSP. One way of doing this is by minimum-weight matching, using algorithms with a complexity of O(n^3). Making a graph into an Eulerian graph starts with the minimum spanning tree; all the vertices of odd order must then be made even, so

the chief of staff of the newly created United States Air Force approved the evolution of Project RAND into a nonprofit corporation, independent of Douglas. On 14 May 1948, RAND was incorporated as a nonprofit corporation under the laws of the State of California, and on 1 November 1948, the Project RAND contract was formally transferred from the Douglas Aircraft Company to the RAND Corporation. Initial capital for

the chosen machine model. For instance, the language {xx | x is any binary string} can be solved in linear time on a multi-tape Turing machine, but necessarily requires quadratic time in the model of single-tape Turing machines. If we allow polynomial variations in running time, the Cobham–Edmonds thesis states that "the time complexities in any two reasonable and general models of computation are polynomially related" (Goldreich 2008, Chapter 1.2). This forms

the cost (distance) from city i to city j. The main variables in the formulations are the x_ij, equal to 1 if the tour goes directly from city i to city j and 0 otherwise. It is because these are 0/1 variables that the formulations become integer programs; all other constraints are purely linear. In particular, the objective in the program is to minimize the tour length, i.e., the sum of c_ij x_ij over all selected edges. Without further constraints,

the discussion abstract enough to be independent of the choice of encoding. This can be achieved by ensuring that different representations can be transformed into each other efficiently. Decision problems are one of the central objects of study in computational complexity theory. A decision problem is a type of computational problem where the answer is either yes or no (alternatively, 1 or 0). A decision problem can be viewed as

the dynamic programming approach. Improving these time bounds seems to be difficult. For example, it has not been determined whether a classical exact algorithm for TSP that runs in time O(1.9999^n) exists. The currently best quantum exact algorithm for TSP, due to Ambainis et al., runs in time O(1.728^n). Other approaches include: An exact solution for 15,112 German towns from TSPLIB
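The classical dynamic programming approach referred to here is the Held–Karp algorithm, which runs in O(n² · 2ⁿ) time. A compact sketch (variable names are mine, not the article's):

```python
from itertools import combinations

# Held–Karp dynamic programme for TSP, O(n^2 * 2^n) time.
# `dist` is assumed to be a square matrix of pairwise distances.
def held_karp(dist):
    n = len(dist)
    # best[(S, j)]: length of the shortest path that starts at city 0,
    # visits exactly the cities in frozenset S (which contains 0 and j),
    # and ends at city j.
    best = {(frozenset([0, j]), j): dist[0][j] for j in range(1, n)}
    for size in range(3, n + 1):
        for subset in combinations(range(1, n), size - 1):
            S = frozenset(subset) | {0}
            for j in subset:
                best[(S, j)] = min(
                    best[(S - {j}, k)] + dist[k][j]
                    for k in subset if k != j
                )
    full = frozenset(range(n))
    # close the tour by returning to city 0
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(held_karp(dist))  # 25 tours exist over 4 cities; the optimum here is 23
```

Exponential space is the price: the table holds an entry for every (subset, endpoint) pair, which is why the method is practical only for a few dozen cities.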

the effect that u_j ≥ u_i + 1 whenever x_ij = 1. Merely requiring u_j ≥ u_i + x_ij would not achieve that, because this also requires u_j ≥ u_i when x_ij = 0, which

the evidence of the effects of gun policies in the United States. The second, expanded review in 2020 analyzed almost 13,000 relevant studies on guns and gun violence since 1995 and selected 123 as having sufficient methodological rigor for inclusion. These studies were used to evaluate scientific support for eighteen classes of gun policy. The review found supportive evidence that child-access prevention laws reduce firearm self-injuries (including suicides), firearm homicides or assault injuries, and unintentional firearm injuries and deaths among youth. Conversely, it identified that stand-your-ground laws increase firearm homicides and shall-issue concealed carry laws increase total and firearm homicides. RAND also emphasized that

the first time. In 1959, Jillian Beardwood, J.H. Halton, and John Hammersley published an article entitled "The Shortest Path Through Many Points" in the journal of the Cambridge Philosophical Society. The Beardwood–Halton–Hammersley theorem provides a practical solution to the travelling salesman problem. The authors derived an asymptotic formula to determine the length of the shortest route for

the following integer linear programming problem: The last constraint of the DFJ formulation—called a subtour elimination constraint—ensures that no proper subset Q can form a sub-tour, so the solution returned is a single tour and not the union of smaller tours. Intuitively, for each proper subset Q of the cities, the constraint requires that there be fewer edges than cities in Q: if there were to be as many edges in Q as cities in Q, that would represent
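The subtour-elimination intuition can be checked mechanically. Below is a hedged sketch (names are illustrative) that tests the DFJ condition Σ_{i,j∈Q} x_ij ≤ |Q| − 1 for a given subset Q of cities: a union of two small subtours violates it, while a single tour does not.

```python
# Check the DFJ subtour-elimination condition for a subset Q of cities:
# the number of selected edges with both endpoints in Q must be < |Q|.
def violates_dfj(edges, Q):
    inside = sum(1 for (i, j) in edges if i in Q and j in Q)
    return inside > len(Q) - 1

# Two 2-city subtours over cities {0,1,2,3}: Q = {0, 1} contains as many
# edges as cities, so the constraint is violated.
two_subtours = {(0, 1), (1, 0), (2, 3), (3, 2)}
print(violates_dfj(two_subtours, {0, 1}))  # True

# A genuine single tour never violates the condition for proper subsets.
single_tour = {(0, 1), (1, 2), (2, 3), (3, 0)}
print(any(violates_dfj(single_tour, Q)
          for Q in [{0, 1}, {1, 2}, {0, 1, 2}]))  # False
```

In practice these constraints are exponentially many, so solvers add them lazily as cutting planes only when a candidate solution violates one.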

the fragments created by edge removal into a new and shorter tour. Similarly, the 3-opt technique removes 3 edges and reconnects them to form a shorter tour. These are special cases of the k-opt method. The label Lin–Kernighan is an often-heard misnomer for 2-opt; Lin–Kernighan is actually the more general k-opt method. A problem is regarded as inherently difficult if its solution requires significant resources, whatever
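The 2-opt move described above — remove two edges, reconnect the fragments by reversing a segment — can be sketched as a simple local search (a minimal illustration; names and the stopping rule are my own, not a statement of the Lin–Kernighan method):

```python
# Hedged sketch of 2-opt local search: reverse a tour segment whenever the
# two replacement edges are shorter than the two removed ones; repeat until
# no improving move remains.
def two_opt(tour, dist):
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n):
                a, b = tour[i], tour[(i + 1) % n]
                c, d = tour[j], tour[(j + 1) % n]
                if a == d:  # the two edges share a city; skip
                    continue
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i + 1 : j + 1] = reversed(tour[i + 1 : j + 1])
                    improved = True
    return tour

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(two_opt([0, 2, 1, 3], dist))  # [0, 1, 3, 2], a tour of length 23
```

Each accepted move strictly shortens the tour, so the loop terminates; the result is only a local optimum with respect to 2-opt moves.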

the hardest problem in C. (Since many problems could be equally hard, one might say that X is one of the hardest problems in C.) Thus the class of NP-complete problems contains the most difficult problems in NP, in the sense that they are the ones most likely not to be in P. Because

the idea of a "winnable" nuclear exchange in his 1960 book On Thermonuclear War. This led to Kahn's being one of the models for the titular character of the film Dr. Strangelove, in which RAND is spoofed as the "BLAND Corporation". Even in the late 1940s and early 1950s, long before Sputnik, the RAND project was secretly recommending to the US government a major effort to design a human-made satellite that would take photographs from space and

the inclusion is strict. For time and space requirements, the answer to such questions is given by the time and space hierarchy theorems, respectively. They are called hierarchy theorems because they induce a proper hierarchy on the classes defined by constraining the respective resources. Thus there are pairs of complexity classes such that one is properly included in the other. Having deduced such proper set inclusions, we can proceed to make quantitative statements about how much more additional time or space

the interventions had no effect on student achievement. RAND has examined the implications of emerging technologies such as artificial intelligence, cybersecurity threats, and autonomous systems. It was accused of working too closely with Open Philanthropy in its work on AI, at the risk of losing its independence. RAND employees have expressed concerns to Politico about the organization's objectivity after it

the largest and most important studies of health insurance, between 1974 and 1982. The RAND Health Insurance Experiment, funded by the then–U.S. Department of Health, Education and Welfare, established an insurance corporation to compare demand for health services with their cost to the patient. In 2018, RAND began its Gun Policy in America initiative, which resulted in comprehensive reviews of

the list in half, also needing O(n log n) time. To classify the computation time (or similar resources, such as space consumption), it is helpful to demonstrate upper and lower bounds on the maximum amount of time required by the most efficient algorithm to solve a given problem. The complexity of an algorithm is usually taken to be its worst-case complexity unless specified otherwise. Analyzing

the mathematical models we want to analyze, so that non-deterministic time is a very important resource in analyzing computational problems. For a precise definition of what it means to solve a problem using a given amount of time and space, a computational model such as the deterministic Turing machine is used. The time required by a deterministic Turing machine M on input x

the method of reduction, such as Cook reductions, Karp reductions and Levin reductions, and the bound on the complexity of reductions, such as polynomial-time reductions or log-space reductions. The most commonly used reduction is a polynomial-time reduction. This means that the reduction process takes polynomial time. For example, the problem of squaring an integer can be reduced to the problem of multiplying two integers. This means an algorithm for multiplying two integers can be used to square an integer. Indeed, this can be done by giving

the need for a private organization to connect operational research with research and development decisions. The immediate impetus for the creation of RAND was a conversation in September 1945 between General Henry H. "Hap" Arnold and Douglas executive Franklin R. Collbohm. Both men were deeply worried that ongoing demobilization meant the federal government was about to lose direct control of

the number of cities. The problem was first formulated in 1930 and is one of the most intensively studied problems in optimization. It is used as a benchmark for many optimization methods. Even though the problem is computationally difficult, many heuristics and exact algorithms are known, so that some instances with tens of thousands of cities can be solved completely, and even problems with millions of cities can be approximated within

the number of possible solutions. In the asymmetric TSP, paths may not exist in both directions, or the distances might be different, forming a directed graph. Traffic congestion, one-way streets, and airfares for cities with different departure and arrival fees are real-world considerations that could yield a TSP problem in asymmetric form. The TSP can be formulated as an integer linear program. Several formulations are known. Two notable formulations are

the only way to satisfy it is that the tour passing through city 1 also passes through all other cities. The MTZ formulation of TSP is thus the following integer linear programming problem: The first set of equalities requires that each city is arrived at from exactly one other city, and the second set of equalities requires that from each city there is
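The role of the MTZ ordering variables u_i can be illustrated concretely. Below is a hedged sketch (names mine) that takes a tour starting at city 1 (index 0 here), sets u_i to the position of city i along the tour, and verifies the standard MTZ inequalities u_i − u_j + 1 ≤ (n − 1)(1 − x_ij) for all i ≠ j among the cities 2, …, n.

```python
# For a valid tour starting at city 1 (index 0), taking u_i = position of
# city i in the tour satisfies the MTZ inequalities
#   u_i - u_j + 1 <= (n - 1) * (1 - x_ij)   for i != j, i, j >= 2.
def mtz_satisfied(tour):
    n = len(tour)
    pos = {city: k for k, city in enumerate(tour)}        # the u values
    x = {(tour[k], tour[(k + 1) % n]) for k in range(n)}  # selected edges
    for i in range(1, n):          # cities 2..n in the article's numbering
        for j in range(1, n):
            if i == j:
                continue
            xij = 1 if (i, j) in x else 0
            if pos[i] - pos[j] + 1 > (n - 1) * (1 - xij):
                return False
    return True

print(mtz_satisfied([0, 2, 1, 3]))  # True
```

A union of disjoint subtours admits no such assignment of u values, which is exactly how the MTZ constraints exclude subtours.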

the optimal solution. Several categories of heuristics are recognized. The nearest neighbour (NN) algorithm (a greedy algorithm) lets the salesman choose the nearest unvisited city as his next move. This algorithm quickly yields an effectively short route. For N cities randomly distributed on a plane, the algorithm on average yields a path 25% longer than the shortest possible path; however, there exist many specially arranged city distributions which make

the polynomial hierarchy does not collapse to any finite level, it is believed that graph isomorphism is not NP-complete. The best algorithm for this problem, due to László Babai and Eugene Luks, has run time O(2^√(n log n)) for graphs with n vertices, although some recent work by Babai offers some potentially new perspectives on this. The integer factorization problem

the postwar period immediately after World War II. The United States Army Air Forces established Project RAND with the objective of investigating long-range planning of future weapons. Douglas Aircraft Company was granted a contract to research intercontinental warfare. Project RAND later evolved into the RAND Corporation, and expanded its research into civilian fields such as education and international affairs. It

the problem P = NP is not solved, being able to reduce a known NP-complete problem, Π_2, to another problem, Π_1, would indicate that there is no known polynomial-time solution for Π_1. This

the problem and includes example tours through Germany and Switzerland, but contains no mathematical treatment. The TSP was mathematically formulated in the 19th century by the Irish mathematician William Rowan Hamilton and by the British mathematician Thomas Kirkman. Hamilton's icosian game was a recreational puzzle based on finding a Hamiltonian cycle. The general form of the TSP appears to have been first studied by mathematicians during

the problem of sorting a list of integers. The worst case is when the pivot is always the largest or smallest value in the list (so the list is never divided). In this case, the algorithm takes time O(n^2). If we assume that all possible permutations of the input list are equally likely, the average time taken for sorting is O(n log n). The best case occurs when each pivoting divides
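The pivot-dependence of quicksort described here is easy to make concrete. A minimal sketch (not production code): choosing the first element as pivot degrades to the O(n²) worst case on already-sorted input, while a random pivot gives the O(n log n) average case.

```python
import random

# Quicksort with a selectable pivot rule, to illustrate the best/average/
# worst-case discussion above. "first" pivot on sorted input never splits
# the list, giving the quadratic worst case.
def quicksort(xs, pivot="random"):
    if len(xs) <= 1:
        return list(xs)
    p = xs[0] if pivot == "first" else random.choice(xs)
    left = [x for x in xs if x < p]
    mid = [x for x in xs if x == p]
    right = [x for x in xs if x > p]
    return quicksort(left, pivot) + mid + quicksort(right, pivot)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Both pivot rules produce a correctly sorted list; only the recursion depth and comparison count differ.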

the rockets to put such a satellite in orbit. RAND was not the first think tank, but during the 1960s it was the first to be regularly referred to as a "think tank". Accordingly, RAND served as the "prototype" for the modern definition of that term. In the early 1990s, RAND established a European branch to serve clients across the public, private, and third sectors, including governments, charities, and corporations. RAND Europe

the same input to both inputs of the multiplication algorithm. Thus we see that squaring is not more difficult than multiplication, since squaring can be reduced to multiplication. This motivates the concept of a problem being hard for a complexity class. A problem X is hard for a class of problems C if every problem in C can be reduced to X. Thus no problem in C
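The squaring-to-multiplication reduction is short enough to write out directly; `multiply` below stands in for an arbitrary black-box multiplication algorithm.

```python
# The reduction described above, as code: any multiplier yields a squarer
# by feeding it the same input twice.
def multiply(a, b):      # assumed black-box multiplication algorithm
    return a * b

def square(a):           # squaring reduces to multiplication
    return multiply(a, a)

print(square(12))  # 144
```

The reduction itself does only constant extra work, so squaring is no harder than multiplication, which is the general pattern a polynomial-time reduction formalizes.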

the second matching is executed after deleting all the edges of the first matching, to yield a set of cycles. The cycles are then stitched together to produce the final tour. The algorithm of Christofides and Serdyukov follows a similar outline but combines the minimum spanning tree with a solution of another problem, minimum-weight perfect matching. This gives a TSP tour which is at most 1.5 times the optimal. It

the set of problems solvable within time f(n) on a deterministic Turing machine is then denoted by DTIME(f(n)). Analogous definitions can be made for space requirements. Although time and space are the most well-known complexity resources, any complexity measure can be viewed as

the shortest route connecting the points. Of course, this problem is solvable by finitely many trials. Rules which would push the number of trials below the number of permutations of the given points are not known. The rule that one first should go from the starting point to the closest point, then to the point closest to this, etc., in general does not yield the shortest route. It was first considered mathematically in

the space required to represent the problem. It turns out that PSPACE = NPSPACE and EXPSPACE = NEXPSPACE by Savitch's theorem. Other important complexity classes include BPP, ZPP and RP, which are defined using probabilistic Turing machines; AC and NC, which are defined using Boolean circuits; and BQP and QMA, which are defined using quantum Turing machines. #P is an important complexity class of counting problems (not decision problems). Classes like IP and AM are defined using interactive proof systems. ALL

the space required, or any measure of complexity) is calculated as a function of the size of the instance. The input size is typically measured in bits. Complexity theory studies how algorithms scale as input size increases. For instance, in the problem of finding whether a graph is connected, how much more time does it take to solve a problem for a graph with 2n vertices compared to

the spin-off was provided by the Ford Foundation. Since the 1950s, RAND research has helped inform United States policy decisions on a wide variety of issues, including the space race, the Vietnam War, the U.S.–Soviet nuclear arms confrontation, the creation of the Great Society social welfare programs, the digital revolution, and national health care. In the 1970s, the RAND Corporation adjusted computer models it

the time complexity (or any other complexity measure) of different inputs of the same size. Since some inputs of size n may be faster to solve than others, we define the following complexities: The order from cheap to costly is: best, average (of discrete uniform distribution), amortized, worst. For example, the deterministic sorting algorithm quicksort addresses

the time taken for a graph with n vertices? If the input size is n, the time taken can be expressed as a function of n. Since the time taken on different inputs of the same size can be different, the worst-case time complexity T(n)

the travelling salesman problem of visiting all 24,978 towns in Sweden was solved: a tour of length approximately 72,500 kilometres was found, and it was proven that no shorter tour exists. In March 2005, the travelling salesman problem of visiting all 33,810 points in a circuit board was solved using Concorde TSP Solver: a tour of length 66,048,945 units was found, and it was proven that no shorter tour exists. The computation took approximately 15.7 CPU-years (Cook et al. 2006). In April 2006, an instance with 85,900 points

the vast amount of American scientific brainpower assembled to fight World War II. As soon as Arnold realized Collbohm had been thinking along similar lines, he said, "I know just what you're going to tell me. It's the most important thing we can do." With Arnold's blessing, Collbohm quickly pulled in additional people from Douglas to help, and together with Donald Douglas, they convened with Arnold two days later at Hamilton Army Airfield to sketch out

was found in 2001 using the cutting-plane method proposed by George Dantzig, Ray Fulkerson, and Selmer M. Johnson in 1954, based on linear programming. The computations were performed on a network of 110 processors located at Rice University and Princeton University. The total computation time was equivalent to 22.6 years on a single 500 MHz Alpha processor. In May 2004,

was made in the late 1970s and 1980, when Grötschel, Padberg, Rinaldi and others managed to exactly solve instances with up to 2,392 cities, using cutting planes and branch-and-bound. In the 1990s, Applegate, Bixby, Chvátal, and Cook developed the program Concorde that has been used in many recent record solutions. Gerhard Reinelt published the TSPLIB in 1991, a collection of benchmark instances of varying difficulty, which has been used by many research groups for comparing results. In 2006, Cook and others computed an optimal tour through an 85,900-city instance given by

was one of the first approximation algorithms, and was in part responsible for drawing attention to approximation algorithms as a practical approach to intractable problems. As a matter of fact, the term "algorithm" was not commonly extended to approximation algorithms until later; the Christofides algorithm was initially referred to as the Christofides heuristic. This algorithm looks at things differently, by using

was revealed that RAND helped draft the Executive Order on AI, following over $15 million in funding from a Facebook founder-backed Open Philanthropy. In December 2023, the House Science Committee sent a bipartisan letter to the National Institute of Standards and Technology raising concerns over RAND's "research that has failed to go through robust review processes, such as academic peer review." On September 13, 2024,

was set up under special contract to the Douglas Aircraft Company and began operations in December 1945. In May 1946, the Preliminary Design of an Experimental World-Circling Spaceship was released. By late 1947, Douglas Aircraft executives had expressed their concerns that their close relationship with RAND might create conflict-of-interest problems on future hardware contracts. In February 1948,

was solved using Concorde TSP Solver, taking over 136 CPU-years; see Applegate et al. (2006). Various heuristics and approximation algorithms, which quickly yield good solutions, have been devised. These include the multi-fragment algorithm. Modern methods can find solutions for extremely large problems (millions of cities) within a reasonable time which are, with a high probability, just 2–3% away from

was the first think tank to be regularly referred to as a "think tank". RAND receives both public and private funding. Its funding sources include the U.S. government, private endowments, corporations, universities, charitable foundations, U.S. state and local governments, international organizations, and, to a small extent, foreign governments. RAND has approximately 1,850 employees. Its American locations include: Santa Monica, California (headquarters); Arlington, Virginia; Pittsburgh, Pennsylvania; and Boston, Massachusetts. The RAND Gulf States Policy Institute has an office in New Orleans, Louisiana. RAND Europe

was using to recommend closures of fire stations in New York City, so that fire stations were closed in the most fire-prone areas, home to Black and Puerto Rican residents, rather than in wealthier, more affluent neighborhoods. RAND contributed to the doctrine of nuclear deterrence by mutually assured destruction (MAD), developed under the guidance of then-Defense Secretary Robert McNamara and based upon their work with game theory. Chief strategist Herman Kahn also posited
