Bayesian search theory is the application of Bayesian statistics to the search for lost objects. It has been used several times to find lost sea vessels, for example USS Scorpion, and has played a key role in the recovery of the flight recorders in the Air France Flight 447 disaster of 2009. It has also been used in the attempts to locate the remains of Malaysia Airlines Flight 370. [1] [2] [3]
The usual procedure is as follows:

1. Formulate as many reasonable hypotheses as possible about what may have happened to the object.
2. For each hypothesis, construct a probability density function for the location of the object.
3. Construct a function giving the probability of actually detecting the object at a given location if it is there; in an ocean search this is typically a function of water depth.
4. Combine this information coherently to produce an overall probability map giving the probability of finding the object by searching each location.
5. Construct a search path that starts at the point of highest probability and works outward through progressively less probable areas.
6. Continuously revise all the probabilities during the search, by Bayes' theorem, as locations are searched without success.
In other words, search first where the object is most likely to be found, then where finding it is less probable, then where the probability is lower still (but not zero, given limitations on fuel, range, water currents, etc.), until the remaining hope of locating the object at acceptable cost is too small to justify continuing.
The advantages of the Bayesian method are that all information available is used coherently (i.e., in a "leak-proof" manner) and the method automatically produces estimates of the cost for a given success probability. That is, even before the start of searching, one can say, hypothetically, "there is a 65% chance of finding it in a 5-day search. That probability will rise to 90% after a 10-day search and 97% after 15 days" or a similar statement. Thus the economic viability of the search can be estimated before committing resources to a search.
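To illustrate how such effort-versus-success estimates arise, the following sketch (with entirely invented numbers: grid size, detection rates, and search rate are all assumptions) computes the cumulative probability of success as a function of days searched, given a prior probability for each grid cell and a per-sweep detection probability. Since the object occupies exactly one cell, the chance of finding it in one pass over a set of cells is the sum of p × q over those cells.

```python
import numpy as np

# Hypothetical priors: p[i] is the probability the object is in cell i,
# q[i] the probability one sweep of cell i detects it if it is there.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(1000))          # prior over 1000 grid cells
q = rng.uniform(0.3, 0.9, size=1000)      # per-sweep detection probability

effectiveness = p * q                      # chance a single sweep of cell i succeeds
order = np.argsort(effectiveness)[::-1]    # plan: search the best cells first

cells_per_day = 50                         # assumed search rate
cumulative = np.cumsum(effectiveness[order])
for day in (5, 10, 15):
    print(f"P(found within {day} days) = {cumulative[day * cells_per_day - 1]:.2f}")
```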
Apart from the USS Scorpion, other vessels located by Bayesian search theory include the MV Derbyshire, the largest British vessel ever lost at sea, and the SS Central America. It also proved successful in the search for a lost hydrogen bomb following the 1966 Palomares B-52 crash in Spain, [4] and the recovery in the Atlantic Ocean of the crashed Air France Flight 447.
Bayesian search theory is incorporated into the CASP (Computer Assisted Search Program) mission planning software used by the United States Coast Guard for search and rescue. This program was later adapted for inland search by adding terrain and ground cover factors for use by the United States Air Force and Civil Air Patrol.
Suppose a grid square has a probability p of containing the wreck and that the probability of successfully detecting the wreck if it is there is q. If the square is searched and no wreck is found, then, by Bayes' theorem, the revised probability of the wreck being in the square is given by

$$p' = \frac{p(1-q)}{(1-p) + p(1-q)} = p\,\frac{1-q}{1-pq}.$$
For every other grid square, if its prior probability is r, its posterior probability is given by

$$r' = \frac{r}{1-pq}.$$
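A minimal sketch of these two update rules, assuming the grid is stored as a flat array of probabilities (the function name and numbers are illustrative, not from any library):

```python
import numpy as np

def bayes_update_after_miss(prior, q, searched):
    """Posterior over grid squares after an unsuccessful search.

    prior    -- array of probabilities that the wreck is in each square
    q        -- probability of detecting the wreck in the searched square
                if it is actually there
    searched -- index of the square searched without success
    """
    p = prior[searched]
    evidence = 1.0 - p * q                           # P(no detection)
    posterior = prior / evidence                     # other squares: r' = r / (1 - pq)
    posterior[searched] = p * (1.0 - q) / evidence   # searched square: p' = p(1-q) / (1 - pq)
    return posterior

# Example: three squares; search the most likely one and miss.
prior = np.array([0.6, 0.3, 0.1])
post = bayes_update_after_miss(prior, q=0.8, searched=0)
print(post, post.sum())   # mass shifts away from the searched square; still sums to 1
```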
In May 1968, the U.S. Navy's nuclear submarine USS Scorpion (SSN-589) failed to arrive as expected at her home port of Norfolk, Virginia. The command officers of the U.S. Navy were nearly certain that the vessel had been lost off the Eastern Seaboard, but an extensive search there failed to discover the remains of Scorpion.
Then, a Navy deep-water expert, John P. Craven, suggested that Scorpion had sunk elsewhere. Craven organised a search southwest of the Azores based on a controversial approximate triangulation by hydrophones. He was allocated only a single ship, Mizar, and he took advice from Metron Inc., a firm of consultant mathematicians, in order to make the best use of his limited resources. A Bayesian search methodology was adopted. Experienced submarine commanders were interviewed to construct hypotheses about what could have caused the loss of Scorpion.
The sea area was divided up into grid squares and a probability assigned to each square, under each of the hypotheses, to give a number of probability grids, one for each hypothesis. These were then added together to produce an overall probability grid. The probability attached to each square was then the probability that the wreck was in that square. A second grid was constructed with probabilities that represented the probability of successfully finding the wreck if that square were to be searched and the wreck were to be actually there. This was a known function of water depth. The result of combining this grid with the previous grid is a grid which gives the probability of finding the wreck in each grid square of the sea if it were to be searched.
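The construction just described can be sketched in a few lines; every number below (hypothesis weights, grid size, and the depth-to-detectability model) is invented for illustration:

```python
import numpy as np

shape = (20, 20)                           # toy sea area of 400 grid squares
rng = np.random.default_rng(1)

def location_grid():
    g = rng.random(shape)
    return g / g.sum()                     # normalise into a probability grid

# One location grid per hypothesis, combined by assumed hypothesis weights.
hypothesis_grids = [location_grid() for _ in range(3)]
weights = np.array([0.5, 0.3, 0.2])
p_location = sum(w * g for w, g in zip(weights, hypothesis_grids))

# Detection probability as a (toy) function of water depth.
depth = rng.uniform(1000, 4000, shape)             # metres
p_detect = np.clip(1.0 - depth / 5000.0, 0.0, 1.0)

# Probability of finding the wreck by searching each square.
p_find = p_location * p_detect
best = np.unravel_index(p_find.argmax(), shape)
print("search first at grid square", best)
```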
At the end of October 1968, the Navy's oceanographic research ship, Mizar, located sections of the hull of Scorpion on the seabed, about 740 km (400 nmi; 460 mi) southwest of the Azores, [5] under more than 3,000 m (9,800 ft) of water. This was after the Navy had released sound tapes from its underwater "SOSUS" listening system, which contained the sounds of the destruction of Scorpion. The court of inquiry was subsequently reconvened and other vessels, including the bathyscaphe Trieste II, were dispatched to the scene, collecting many pictures and other data.
Although Craven received much credit for locating the wreckage of Scorpion, Gordon Hamilton, an acoustics expert who pioneered the use of hydroacoustics to pinpoint Polaris missile splashdown locations, was instrumental in defining a compact "search box" wherein the wreck was ultimately found. Hamilton had established a listening station in the Canary Islands that obtained a clear signal of what some scientists believe was the noise of the vessel's pressure hull imploding as she passed crush depth. A Naval Research Laboratory scientist named Chester "Buck" Buchanan, using a towed camera sled of his own design aboard Mizar, finally located Scorpion. [5] The towed camera sled, which was fabricated by J. L. "Jac" Hamm of Naval Research Laboratory's Engineering Services Division, is housed in the National Museum of the United States Navy. Buchanan had located the wrecked hull of Thresher in 1964 using this technique.
The classic book on this subject, The Theory of Optimal Search (Operations Research Society of America, 1975) by Lawrence D. Stone of Metron Inc., won the 1975 Lanchester Prize, awarded by the Operations Research Society of America.
Assume that a stationary object is hidden in one of n boxes (locations). For each location i there are three known parameters: the cost c_i of a single search, the probability q_i of finding the object in a single search if it is there, and the probability p_i that the object is there. A searcher looks for the object, knowing the a priori probabilities at the beginning and updating them by Bayes' law after each unsuccessful attempt. The problem of finding the object at minimal expected cost is a classical problem solved by David Blackwell. [6] Surprisingly, the optimal policy is easy to describe: at each stage, look in the location which maximizes

$$\frac{p_i q_i}{c_i}.$$

This is actually a special case of the Gittins index.
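A small simulation sketch of this index policy, combining it with the grid-update formulas given earlier (all costs and probabilities below are assumed for illustration):

```python
import numpy as np

def expected_search_cost(p, q, c, trials=5000, seed=2):
    """Estimate by simulation the expected cost of the greedy index policy:
    always search the box maximising p*q/c, updating beliefs by Bayes' law
    after every unsuccessful attempt."""
    rng = np.random.default_rng(seed)
    n = len(p)
    total = 0.0
    for _ in range(trials):
        target = rng.choice(n, p=p)              # hide the object
        belief = np.array(p, dtype=float)
        while True:
            i = int(np.argmax(belief * q / c))   # the index rule
            total += c[i]
            if i == target and rng.random() < q[i]:
                break                            # object found
            # Bayes update after an unsuccessful search of box i
            b = belief[i]
            miss = 1.0 - b * q[i]                # P(this search fails)
            belief /= miss                       # other boxes: r' = r / (1 - pq)
            belief[i] = b * (1.0 - q[i]) / miss  # searched box: p' = p(1-q) / (1 - pq)
    return total / trials

p = np.array([0.5, 0.3, 0.2])   # prior location probabilities
q = np.array([0.4, 0.9, 0.9])   # per-search detection probabilities
c = np.array([1.0, 2.0, 4.0])   # cost of one search of each box
print("estimated expected cost:", expected_search_cost(p, q, c))
```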
A statistical hypothesis test is a method of statistical inference used to decide whether the data at hand sufficiently support a particular hypothesis. Hypothesis testing allows us to make probabilistic statements about population parameters.
The raven paradox, also known as Hempel's paradox, Hempel's ravens, or rarely the paradox of indoor ornithology, is a paradox arising from the question of what constitutes evidence for the truth of a statement. Observing objects that are neither black nor ravens may formally increase the likelihood that all ravens are black even though, intuitively, these observations are unrelated.
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
A Bayesian network is a probabilistic graphical model that represents a set of variables and their conditional dependencies via a directed acyclic graph (DAG). Bayesian networks are ideal for taking an event that occurred and predicting the likelihood that any one of several possible known causes was the contributing factor. For example, a Bayesian network could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of the presence of various diseases.
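For the disease/symptom example, the smallest such network has a single directed edge; a sketch with invented probabilities shows how observing the symptom updates the probability of the disease:

```python
# Network: Disease -> Symptom (all probabilities invented for illustration).
p_disease = 0.01                       # prior P(D)
p_symptom_given = {True: 0.90,         # P(S | D)
                   False: 0.05}        # P(S | not D)

# P(D | S) = P(S | D) P(D) / P(S), summing over D to get P(S).
p_s = (p_symptom_given[True] * p_disease
       + p_symptom_given[False] * (1 - p_disease))
p_d_given_s = p_symptom_given[True] * p_disease / p_s
print(f"P(disease | symptom) = {p_d_given_s:.3f}")   # about 0.154
```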
USS Scorpion (SSN-589) was a Skipjack-class nuclear-powered submarine that served in the United States Navy, and the sixth vessel, and second submarine, of the U.S. Navy to carry that name.
In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite, in which case it is to be maximized.
A prior probability distribution of an uncertain quantity, often simply called the prior, is its assumed probability distribution before some evidence is taken into account. For example, the prior could be the probability distribution representing the relative proportions of voters who will vote for a particular politician in a future election. The unknown quantity may be a parameter of the model or a latent variable rather than an observable variable.
The Bayes factor is a ratio of two competing statistical models, represented by their evidence, and is used to quantify the support for one model over the other. The models in question can have a common set of parameters, such as a null hypothesis and an alternative, but this is not necessary; for instance, it could also be a non-linear model compared to its linear approximation. The Bayes factor can be thought of as a Bayesian analog to the likelihood-ratio test, but since it uses the (integrated) marginal likelihood rather than the maximized likelihood, the two tests coincide only under simple hypotheses. Also, in contrast with null hypothesis significance testing, Bayes factors support the evaluation of evidence in favor of a null hypothesis, rather than only allowing the null to be rejected or not rejected.
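As a toy illustration (a sketch, not any particular library's API), consider testing whether a coin is fair. Under a uniform prior on the bias, the marginal likelihood of k heads in n flips integrates to 1/(n+1), so the Bayes factor follows directly:

```python
from math import comb

# H0: coin is fair (theta = 0.5)  vs.  H1: theta ~ Uniform(0, 1).
n, k = 100, 61                 # hypothetical data: 61 heads in 100 flips
m0 = comb(n, k) * 0.5 ** n     # evidence (marginal likelihood) under H0
m1 = 1.0 / (n + 1)             # evidence under H1, integrating the binomial
                               # likelihood against the uniform prior
print(f"Bayes factor BF10 = {m1 / m0:.2f}")   # >1 favours H1 over H0
```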
USNS Mizar (MA-48/T-AGOR-11/T-AK-272) was a vessel of the United States Navy. She was named after the star Mizar.
The secretary problem demonstrates a scenario involving optimal stopping theory that is studied extensively in the fields of applied probability, statistics, and decision theory. It is also known as the marriage problem, the sultan's dowry problem, the fussy suitor problem, the googol game, and the best choice problem.
In network theory, small-world routing refers to routing methods for small-world networks. Networks of this type are peculiar in that relatively short paths exist between any two nodes. Determining these paths, however, can be a difficult problem from the perspective of an individual routing node in the network if no further information is known about the network as a whole.
One-shot learning is an object categorization problem, found mostly in computer vision. Whereas most machine learning-based object categorization algorithms require training on hundreds or thousands of examples, one-shot learning aims to classify objects from one, or only a few, examples. The term few-shot learning is also used for these problems, especially when more than one example is needed.
Frequentist inference is a type of statistical inference based on frequentist probability, which treats "probability" as equivalent to "frequency" and draws conclusions from sample data by emphasizing the frequency or proportion of findings in the data. Frequentist inference underlies frequentist statistics, on which the well-established methodologies of statistical hypothesis testing and confidence intervals are founded.
In statistics and machine learning, ensemble methods use multiple learning algorithms to obtain better predictive performance than could be obtained from any of the constituent learning algorithms alone. Unlike a statistical ensemble in statistical mechanics, which is usually infinite, a machine learning ensemble consists of only a concrete finite set of alternative models, but typically allows for much more flexible structure to exist among those alternatives.
The Lévy flight foraging hypothesis is a hypothesis in the field of biology that may be stated as follows:
Since Lévy flights and walks can optimize search efficiencies, therefore natural selection should have led to adaptations for Lévy flight foraging.
The weapon target assignment problem (WTA) is a class of combinatorial optimization problems present in the fields of optimization and operations research. It consists of finding an optimal assignment of a set of weapons of various types to a set of targets in order to maximize the total expected damage done to the opponent.
t-distributed stochastic neighbor embedding (t-SNE) is a statistical method for visualizing high-dimensional data by giving each datapoint a location in a two- or three-dimensional map. It is based on Stochastic Neighbor Embedding, originally developed by Sam Roweis and Geoffrey Hinton; Laurens van der Maaten proposed the t-distributed variant. It is a nonlinear dimensionality reduction technique that models each high-dimensional object by a two- or three-dimensional point in such a way that similar objects are modeled by nearby points and dissimilar objects by distant points with high probability.
Stochastic scheduling concerns scheduling problems involving random attributes, such as random processing times, random due dates, random weights, and stochastic machine breakdowns. Major applications arise in manufacturing systems, computer systems, communication systems, logistics and transportation, and machine learning, among others.
In statistical hypothesis testing, the error exponent of a hypothesis testing procedure is the rate at which the probabilities of Type I and Type II errors decay exponentially with the size of the sample used in the test. For example, if the probability of error of a test decays as $e^{-n\beta}$, where $n$ is the sample size, the error exponent is $\beta$.
Probabilistic numerics is an active field of study at the intersection of applied mathematics, statistics, and machine learning centering on the concept of uncertainty in computation. In probabilistic numerics, tasks in numerical analysis such as finding numerical solutions for integration, linear algebra, optimization and simulation and differential equations are seen as problems of statistical, probabilistic, or Bayesian inference.