Split Up (expert system)

Split Up is an intelligent decision support system that predicts the distribution of marital property following divorce in Australia. It is designed to assist judges, registrars of the Family Court of Australia, mediators and lawyers. Split Up operates as a hybrid system, combining rule-based reasoning with neural networks. [1] Rule-based reasoning operates within strict parameters, in the form:

IF <condition(s)> THEN <action>. [2]:196,202
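
A rule of this form can be sketched as an ordinary conditional. The rule below is purely illustrative; the condition and action names are invented, not drawn from the actual Split Up rule base:

```python
def common_pool_rule(facts):
    """IF the asset was acquired during the marriage AND it is not
    otherwise excluded THEN include it in the common pool (illustrative)."""
    if facts["acquired_during_marriage"] and not facts["excluded"]:
        return "include in common pool"
    return "exclude from common pool"

print(common_pool_rule({"acquired_during_marriage": True, "excluded": False}))
# → include in common pool
```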

Neural networks, by contrast, are considered better suited to generating decisions in uncertain domains, since they can be taught to weigh the factors considered by judicial decision makers from case data. However, they do not provide an explanation for the conclusions they reach. To overcome this limitation, Split Up uses the argument structures proposed by Toulmin as the basis for representations from which explanations can be generated. [3]:186

Application

In Australian family law, a judge determining the distribution of property will:

  1. identify the assets of the marriage included in the common pool
  2. establish what percentage of the common pool each party will receive
  3. determine a final property order in line with the decisions made in 1. and 2.

Split Up implements steps 1 and 2: the common pool determination and the prediction of a percentage split.

The common pool determination

Since the determination of the common pool of marital property is rule based, it is implemented using directed graphs. [4]:269
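
As a rough illustration of rule-based inference over a directed graph, the sketch below derives higher-level values from lower-level ones in dependency order. The factor names and thresholds are hypothetical, not taken from Split Up:

```python
# Each node's value is computed by a rule from its parent nodes' values;
# evaluating the rules in dependency order walks the directed graph.
rules = {
    "common_pool": lambda v: v["house"] + v["savings"] - v["debts"],
    "wealth_level": lambda v: "high" if v["common_pool"] > 500_000 else "modest",
}
values = {"house": 400_000, "savings": 50_000, "debts": 30_000}
for node, rule in rules.items():  # insertion order respects dependencies here
    values[node] = rule(values)
print(values["wealth_level"])
# → modest
```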

However, the percentage split between the parties is discretionary, in that a judge has a wide discretion to look at each party's contributions to the marriage under section 79(4) of the Family Law Act 1975. Broadly, the contributions can be financial or non-financial. The party who can demonstrate a larger contribution to the marital relationship will receive a larger proportion of the assets. The court may further look at each party's financial resources and future needs under section 75(2) of the Family Law Act 1975. These needs can include factors such as the inability to gain employment, the continued care of a child under 18 years of age or medical expenses.

This means that different judges may, and will, reach different conclusions on the same facts, since each judge assigns different weights to each relevant factor. Split Up determines the percentage split using a combination of rule-based reasoning and neural networks.

The percentage split determination

In order to determine how judges weigh the different factors, 103 written judgements of commonplace cases were used to establish a database comprising 94 relevant factors for percentage split determination. [4]:273

The factors relevant to a percentage split determination, including those bearing on past contributions, are arranged in a hierarchy.

The hierarchy provides a structure that is used to decompose the task of predicting an outcome into 35 sub-tasks. Outputs of sub-tasks further down the hierarchy are used as inputs to sub-tasks higher up. Each sub-task is treated as a separate, smaller data mining exercise. Twenty-one solid arcs represent inferences performed with rule sets. For example, the level of wealth of a marriage is determined by a rule that uses the common pool value.
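
The decomposition can be pictured as a dependency table in which each sub-task names its inputs (outputs of lower sub-tasks) and the inference method on its incoming arcs. The fragment below is hypothetical and covers only three of the 35 sub-tasks:

```python
# Hypothetical fragment of the sub-task hierarchy: solid arcs = rules,
# dashed arcs = neural networks.
subtasks = {
    "wealth_level":     {"inputs": ["common_pool"], "method": "rules"},
    "contributions":    {"inputs": ["financial", "non_financial"], "method": "network"},
    "percentage_split": {"inputs": ["contributions", "future_needs", "wealth_level"],
                         "method": "network"},
}

def ready(name, computed):
    """A sub-task can fire only once all of its inputs have been computed."""
    return all(i in computed for i in subtasks[name]["inputs"])

print(ready("percentage_split", {"contributions", "future_needs", "wealth_level"}))
# → True
```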

By contrast, the fourteen dashed arcs represent inferences performed with neural networks. These receive their name from the fact that they resemble a nervous system in the brain. They consist of many self-adjusting processing elements cooperating in a densely interconnected network. Each processing element generates a single output that is transmitted to other processing elements. The output signal of a processing element depends on its inputs: each input is gated by a weighting factor that determines the amount of influence the input will have on the output. The strength of the weighting factors is adjusted autonomously by the processing element as the data is processed. [2]:196

In Split Up, the neural network is a statistical technique for learning the weights of each of the relevant attributes used in a percentage split determination of marital property.

Hence the inputs to the neural network are contributions, future needs and wealth, and the output is the predicted percentage split.

On each arc there is a statistical weight. Using backpropagation, the neural network learns the pattern needed to make the prediction. It is trained by repeatedly exposing it to examples of the problem and learning the significance (weights) of the input nodes. [2]:196
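
As a minimal stand-in for this training process, the sketch below fits weights for the three inputs by gradient descent on a few synthetic cases. Split Up's actual networks are multilayer and trained by backpropagation on data drawn from the judgements, so the data, learning rate and model size here are all illustrative:

```python
# Synthetic cases: (contributions, future_needs, wealth) -> percentage to one party.
data = [((0.7, 0.5, 0.4), 0.62), ((0.3, 0.8, 0.6), 0.48), ((0.5, 0.5, 0.5), 0.50)]
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.1

for _ in range(2000):                      # repeated exposure to the examples
    for x, target in data:
        pred = sum(wi * xi for wi, xi in zip(w, x)) + b
        err = pred - target                # gradient of squared error (up to a factor)
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]   # adjust the weights
        b -= lr * err

pred = sum(wi * xi for wi, xi in zip(w, (0.7, 0.5, 0.4))) + b
print(round(pred, 2))                      # close to the 0.62 seen in training
```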

The neural network used by Split Up is said to generalise well if its output is correct (or nearly correct) for examples not seen during training, which is what qualifies it as an intelligent system. [4]:274

Toulmin Argument Structure

Since the manner in which these weights are learned is primarily statistical, domain knowledge of legal rules and principles is not modelled directly. However, in a domain as discretionary as the distribution of property following divorce, explanations for a legal conclusion are at least as important as the conclusion itself. Hence the creators of Split Up used Toulmin argument structures to provide independent explanations of the conclusions reached. [3]:189

These operate on the basis that every argument makes an assertion based on some data. The assertion stands as the claim of the argument. Since knowing the data and the claim does not necessarily mean that the claim follows from the data, a mechanism is required to justify the claim in light of the data. This justification is known as the warrant, and the backing of an argument supports the validity of the warrant. In the legal domain, the backing is typically a reference to a statute or a precedent.

Here, a neural network (or a set of rules) produces a conclusion from the data of an argument, and the data, warrant and backing are reproduced to generate an explanation.
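
A Toulmin argument can be represented as a simple record whose fields are echoed back as an explanation. The example below is a hypothetical sketch; the wording, figures and field layout are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class ToulminArgument:
    data: str      # facts the argument starts from
    claim: str     # assertion produced by the rules or the network
    warrant: str   # justification linking data to claim
    backing: str   # statute or precedent supporting the warrant

    def explanation(self):
        """Reproduce data, warrant and backing alongside the claim."""
        return (f"{self.claim}, because {self.data} "
                f"({self.warrant}; see {self.backing}).")

arg = ToulminArgument(
    data="the wife made the larger non-financial contribution",
    claim="the wife should receive 60% of the common pool",
    warrant="greater contributions attract a larger share",
    backing="s 79(4) Family Law Act 1975",
)
print(arg.explanation())
```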

It is noteworthy, though, that an argument's warrant is reproduced as an explanation regardless of the claim values used. This lack of claim-sensitivity must be overcome by the different users, i.e. the judge, the representatives for the wife and the representatives for the husband, each of whom is encouraged to use the system to prepare their case, but not to rely exclusively on its outcome.


References

  1. Stranieri, A. and Zeleznikow, J., Split_Up: The use of an argument based knowledge representation to meet expectations of different users for discretionary decision making, p. 1: "Research has shown that rule-based reasoning on its own is not ideal in discretionary fields of law."
  2. Stranieri, A. and Zeleznikow, J., Split Up: An Intelligent Decision Support System Which Provides Advice Upon Property Division Following Divorce, International Journal of Law and Information Technology, Vol. 6, No. 2, 1998, 190–213.
  3. Stranieri, A. and Zeleznikow, J. (1995) The split-up system: integrating neural networks and rule-based reasoning in the legal domain.
  4. Nolan, J.R. and Zeleznikow, J., Using soft computing to build real world intelligent decision support systems in uncertain domains, Decision Support Systems, 31 (2001), 263–285.