Computational criminology is an interdisciplinary field which uses computing science methods to formally define criminology concepts, improve our understanding of complex phenomena, and generate solutions for related problems.
A range of computing science methods are applied in the field.
Computational criminology is interdisciplinary in the sense that criminologists and computing scientists work together to ensure that computational models properly match their theoretical and real-world counterparts. Computational approaches are applied across a number of areas of criminology.
Computational forensics (CF) is a quantitative approach to the methodology of the forensic sciences. It involves computer-based modeling, computer simulation, analysis, and recognition in studying and solving problems posed in various forensic disciplines. CF integrates expertise from computational science and forensic sciences.
A broad range of objects, substances, and processes are investigated, mainly on the basis of pattern evidence such as toolmarks, fingerprints, shoeprints, and documents, [1] but also physiological and behavioral patterns, DNA, digital evidence, and crime scenes.
Computational methods find a place in the forensic sciences in several ways. [2] [3] [4] [5] [6]
Algorithms implemented are from the fields of signal and image processing, computer vision, [7] computer graphics, data visualization, statistical pattern recognition, data mining, machine learning, and robotics.
Computer forensics (also referred to as "digital forensics" or "forensic information technology") is one specific discipline that can use computational science to study digital evidence, whereas computational forensics examines diverse types of evidence.
Forensic animation is a branch of forensic science in which audio-visual reconstructions of incidents or accidents are created to aid investigators. Examples include the use of computer animation, stills, and other audio-visual aids. The use of computer animation in courtrooms is becoming increasingly common.
The first use of forensic animation was in Connors v. United States, in which both sides used computer re-creations and animations in a case surrounding the crash of Delta Flight 191 on August 2, 1985. [8] The crash resulted in the deaths of 137 people and extensive property damage. In the resulting lawsuit, a method was required to explain complicated information and situations to the jury. As part of the plaintiffs' presentation, a 45-minute computer-generated presentation was created to explain the intricacies of the evidence, and thus began forensic animation. [9]
The first reported use of computer animation in a U.S. criminal trial was in the 1991 Marin County, California homicide trial of James Mitchell (of the pornography entrepreneurs the Mitchell Brothers). [10] The prosecution used the animation to explain the complex details of the shooting incident to the jury. It showed the positions of James Mitchell, Artie Mitchell (the victim), the bullet impact points, and the paths taken by the bullets as they entered Artie's body. The animation was admitted over objection by the defense, and the case resulted in a conviction. The use of the animation was upheld on appeal, and its success led to the use of forensic animation in many other trials. In India, Prof. T. D. Dogra of AIIMS New Delhi used animation for the first time in 2008 to explain firearm injuries to the courts and investigating agencies in two important cases: a murder case and a terrorist encounter killing (the Batla House encounter case). [11]
In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial-and-error problem solvers with a metaheuristic or stochastic optimization character.
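The population-based trial-and-error character described above can be sketched in a few lines. The following is a minimal, illustrative evolutionary optimizer (truncation selection plus Gaussian mutation); the function and parameter names are inventions for this sketch, not part of any standard library.

```python
import random

def evolve(fitness, n_genes=4, pop_size=20, generations=200, seed=0):
    """Minimal evolutionary optimizer: a population of candidate
    solutions is repeatedly selected on fitness and mutated.
    All names and constants here are illustrative choices."""
    rng = random.Random(seed)
    # Initial population: random real-valued genomes.
    pop = [[rng.uniform(-5, 5) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)              # selection: best candidates first
        survivors = pop[: pop_size // 2]   # truncation selection
        # Each survivor produces one mutated child (Gaussian perturbation).
        children = [[g + rng.gauss(0, 0.1) for g in parent]
                    for parent in survivors]
        pop = survivors + children
    return min(pop, key=fitness)

# Example: minimize the sphere function; the optimum is the all-zero genome.
best = evolve(lambda xs: sum(x * x for x in xs))
print(sum(x * x for x in best))  # a small value close to 0
```

Because the best genome always survives unmutated, the search is elitist: the best fitness found never gets worse from one generation to the next.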
Neuromorphic computing is an approach to computing inspired by the structure and function of the human brain. A neuromorphic computer or chip is any device that uses physical artificial neurons to perform computations. More recently, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. Recent research has even demonstrated ways to mimic aspects of the human nervous system using liquid chemical systems.
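A simple example of a software model of a neural system, of the kind mentioned above, is the leaky integrate-and-fire neuron. The sketch below is illustrative only; the function name and constants are choices made for this example.

```python
def simulate_lif(currents, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron model (illustrative sketch):
    the membrane potential decays by a factor `leak` each time step,
    integrates the input current, and emits a spike (then resets)
    when it crosses `threshold`."""
    v = 0.0
    spikes = []
    for t, i in enumerate(currents):
        v = leak * v + i          # leaky integration of the input
        if v >= threshold:        # spike when the potential crosses threshold
            spikes.append(t)
            v = 0.0               # reset after the spike
    return spikes

# A constant drive of 0.3 per step makes the neuron fire periodically.
print(simulate_lif([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

The model communicates through discrete spike times rather than continuous values, which is the property neuromorphic hardware exploits.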
Computer forensics is a branch of digital forensic science pertaining to evidence found in computers and digital storage media. The goal of computer forensics is to examine digital media in a forensically sound manner with the aim of identifying, preserving, recovering, analyzing and presenting facts and opinions about the digital information.
The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.
Hardware acceleration is the use of computer hardware designed to perform specific functions more efficiently when compared to software running on a general-purpose central processing unit (CPU). Any transformation of data that can be calculated in software running on a generic CPU can also be calculated in custom-made hardware, or in some mix of both.
Demosaicing, also known as color reconstruction, is a digital image processing algorithm used to reconstruct a full color image from the incomplete color samples output from an image sensor overlaid with a color filter array (CFA) such as a Bayer filter. It is also known as CFA interpolation or debayering.
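The interpolation idea can be illustrated with a toy bilinear demosaicer for an RGGB Bayer mosaic: each pixel keeps its measured sample for its own filter color and estimates the other two colors by averaging same-color neighbors. This is a minimal sketch, not a production debayering routine; the names and the 3×3 neighborhood choice are assumptions of this example.

```python
def demosaic_bilinear(raw, pattern=("RG", "GB")):
    """Toy bilinear demosaicing for an RGGB Bayer mosaic (illustrative).
    `raw` is a 2D list of sensor values; returns a 2D list of (R, G, B)."""
    h, w = len(raw), len(raw[0])

    def color_at(y, x):
        # The Bayer filter colour of pixel (y, x), tiling the 2x2 pattern.
        return pattern[y % 2][x % 2]

    def estimate(y, x, c):
        if color_at(y, x) == c:
            return float(raw[y][x])      # measured sample: keep as-is
        # Otherwise average all in-bounds 3x3 neighbours of colour c.
        vals = [raw[j][i]
                for j in range(max(0, y - 1), min(h, y + 2))
                for i in range(max(0, x - 1), min(w, x + 2))
                if color_at(j, i) == c]
        return sum(vals) / len(vals)

    return [[(estimate(y, x, "R"), estimate(y, x, "G"), estimate(y, x, "B"))
             for x in range(w)] for y in range(h)]

# A flat grey scene reconstructs to the same grey in all three channels.
flat = [[100] * 4 for _ in range(4)]
rgb = demosaic_bilinear(flat)
print(rgb[0][0])  # → (100.0, 100.0, 100.0)
```

Real demosaicing algorithms go well beyond this, using edge-directed interpolation to avoid color fringing along sharp transitions.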
Honeywell, Inc. v. Sperry Rand Corp., et al., 180 U.S.P.Q. 673, was a landmark U.S. federal court case that in October 1973 invalidated the 1964 patent for the ENIAC, the world's first general-purpose electronic digital computer. The decision held, in part:
1. that the ENIAC inventors had derived the subject matter of the electronic digital computer from the Atanasoff–Berry computer (ABC), prototyped in 1939 by John Atanasoff and Clifford Berry;
2. that Atanasoff should have legal recognition as the inventor of the first electronic digital computer; and
3. that the invention of the electronic digital computer ought to be placed in the public domain.
Les Hatton is a British-born computer scientist and mathematician most notable for his work on failures and vulnerabilities in software controlled systems.
The following outline is provided as an overview of and topical guide to forensic science.
Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is transformation of information. In some cases, the term "informatics" may also be used with different meanings, e.g. in the context of social computing, or in context of library science.
Sargur Narasimhamurthy Srihari was an Indian and American computer scientist and educator who made contributions to the field of pattern recognition. The principal impact of his work has been in handwritten address reading systems and in computer forensics. He was a SUNY Distinguished Professor in the School of Engineering and Applied Sciences at the University at Buffalo, Buffalo, New York, USA.
Audio forensics is the field of forensic science relating to the acquisition, analysis, and evaluation of sound recordings that may ultimately be presented as admissible evidence in a court of law or some other official venue.
The MNIST database is a large database of handwritten digits that is commonly used for training various image processing systems. The database is also widely used for training and testing in the field of machine learning. It was created by "re-mixing" the samples from NIST's original datasets. The creators felt that since NIST's training dataset was taken from American Census Bureau employees, while the testing dataset was taken from American high school students, it was not well-suited for machine learning experiments. Furthermore, the black and white images from NIST were normalized to fit into a 28x28 pixel bounding box and anti-aliased, which introduced grayscale levels.
Dynamic texture is texture with motion, as found in videos of sea waves, fire, smoke, wavy trees, etc. A dynamic texture has a spatially repetitive pattern with a time-varying visual appearance. Modeling and analyzing dynamic texture is a topic of image processing and pattern recognition in computer vision.
Soft computing is an umbrella term for types of algorithms that produce approximate solutions to problems that are too complex to solve exactly. Typically, traditional hard-computing algorithms rely heavily on concrete data and mathematical models to produce solutions. The term soft computing was coined in the late 20th century, a period in which revolutionary research in three fields greatly shaped it. Fuzzy logic is a computational paradigm that handles uncertainty in data by using levels of truth rather than the rigid 0s and 1s of binary logic. Neural networks are computational models influenced by the functioning of the human brain. Finally, evolutionary computation describes groups of algorithms that mimic natural processes such as evolution and natural selection.
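The "levels of truth" idea in fuzzy logic can be made concrete with membership functions. The sketch below is illustrative: the triangular shape, the temperature ranges, and the names `warm` and `mild` are assumptions of this example, and min/max are the classic choices for fuzzy AND/OR.

```python
def triangular(a, b, c):
    """Return a triangular membership function that is 0 outside (a, c)
    and rises linearly to 1 at the peak b (illustrative sketch)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical fuzzy sets over temperature in degrees Celsius.
warm = triangular(15.0, 22.0, 30.0)
mild = triangular(5.0, 12.0, 20.0)

print(warm(22.0))  # → 1.0   (fully "warm")
print(warm(18.5))  # → 0.5   (partially "warm")

# Classic fuzzy connectives: AND as min, OR as max.
x = 18.5
print(min(warm(x), mild(x)))  # → 0.1875 (degree to which x is warm AND mild)
```

Instead of the crisp question "is 18.5° warm?", the fuzzy formulation yields a graded answer, which is what lets soft-computing systems reason with imprecise inputs.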
Ali Dehghantanha is an academic-entrepreneur in cybersecurity and cyber threat intelligence. He is a Professor of Cybersecurity and a Canada Research Chair in Cybersecurity and Threat Intelligence.
IoT forensics, or IoT forensic science, is a branch of digital forensics that applies digital forensics processes and procedures to the recovery of digital evidence originating from one or more IoT devices, for the purpose of preserving, identifying, extracting, or documenting that evidence with the intention of reconstructing IoT-related events. These events may reside across one or more configurable computing resources located in close proximity to where the event took place.
Dmitri Maslov is a Canadian-American computer scientist known for his work on quantum circuit synthesis and optimization, quantum advantage, and benchmarking quantum computers. Currently, he is the Chief Software Architect at IBM Quantum. Maslov was formerly a program director for Quantum Information Science at the National Science Foundation. He was named a Fellow of the Institute of Electrical and Electronics Engineers in 2021 "for contributions to quantum circuit synthesis and optimization, and compiling for quantum computers."
Nikola Kirilov Kasabov also known as Nikola Kirilov Kassabov is a Bulgarian and New Zealand computer scientist, academic and author. He is a professor emeritus of Knowledge Engineering at Auckland University of Technology, Founding Director of the Knowledge Engineering and Discovery Research Institute (KEDRI), George Moore Chair of Data Analytics at Ulster University, as well as visiting professor at both the Institute for Information and Communication Technologies (IICT) at the Bulgarian Academy of Sciences and Dalian University in China. He is also the Founder and Director of Knowledge Engineering Consulting.