Thinning (morphology)

Thinning is the transformation of a digital image into a simplified but topologically equivalent image. It is a type of topological skeleton, but one computed using mathematical morphology operators.
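In practice, thinning is available off the shelf in standard image-processing libraries. A minimal sketch, assuming scikit-image and NumPy are installed (the library's thin function implements a thinning algorithm, not necessarily the structuring-element scheme worked out below):

```python
import numpy as np
from skimage.morphology import thin

# A small binary image: a solid 5x3 block of foreground pixels.
image = np.zeros((7, 7), dtype=bool)
image[1:6, 2:5] = True

# Thin it down to a topologically equivalent, one-pixel-wide shape.
skeleton = thin(image)
```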

Example


Let $E = \mathbb{Z}^2$, and consider the eight composite structuring elements, composed by:

$C_1 = \{(0,0), (-1,-1), (0,-1), (1,-1)\}$ and $D_1 = \{(-1,1), (0,1), (1,1)\}$,
$C_2 = \{(-1,0), (0,0), (-1,-1), (0,-1)\}$ and $D_2 = \{(0,1), (1,1), (1,0)\}$,

and the three rotations of each by $90^\circ$, $180^\circ$, and $270^\circ$. The corresponding composite structuring elements are denoted $B_1, \ldots, B_8$.
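To make the pairs concrete, each composite element $B_i = (C_i, D_i)$ can be written as a single $3 \times 3$ mask in which 1 marks a pixel of $C_i$ (must be foreground), 0 a pixel of $D_i$ (must be background), and $-1$ a "don't care" position. A sketch in Python with NumPy; the encoding and the name ELEMENTS are ours, chosen for illustration:

```python
import numpy as np

# 1 = must be foreground (C_i), 0 = must be background (D_i),
# -1 = don't care.  Rows run from y = 1 (top) to y = -1 (bottom).
B1 = np.array([[ 0,  0,  0],
               [-1,  1, -1],
               [ 1,  1,  1]])

B2 = np.array([[-1,  0,  0],
               [ 1,  1,  0],
               [ 1,  1, -1]])

# B3..B8 are the rotations of B1 and B2 by 90, 180 and 270 degrees.
ELEMENTS = [np.rot90(B, k) for k in range(4) for B in (B1, B2)]
```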

For any $i$ between 1 and 8, and any binary image $X$, define

$X \otimes B_i = X \setminus (X \odot B_i)$,

where $\setminus$ denotes the set-theoretical difference and $\odot$ denotes the hit-or-miss transform.
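With the masks above, one thinning step $X \otimes B_i$ can be computed with SciPy's hit-or-miss transform. A sketch under the same encoding (thin_step is a hypothetical helper name):

```python
from scipy import ndimage

def thin_step(X, B):
    """Compute X ⊗ B = X \\ (X ⊙ B) for one composite element B."""
    hits   = (B == 1)  # C_i: pixels that must lie in the foreground
    misses = (B == 0)  # D_i: pixels that must lie in the background
    matched = ndimage.binary_hit_or_miss(X, structure1=hits,
                                         structure2=misses)
    return X & ~matched  # set difference: delete every matched pixel
```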

The thinning of an image $A$ is obtained by cyclically iterating until convergence:

$A \otimes B_1 \otimes B_2 \otimes \cdots \otimes B_8 \otimes B_1 \otimes B_2 \otimes \cdots$.
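The full operation then cycles through $B_1, \ldots, B_8$ until a complete pass deletes no pixel. A sketch building on the helpers above (thin_morph is a hypothetical name, chosen to avoid clashing with scikit-image's thin):

```python
import numpy as np

def thin_morph(X, elements=None):
    """Cyclically apply X ⊗ B1 ⊗ ... ⊗ B8 until convergence."""
    X = X.astype(bool)
    elements = ELEMENTS if elements is None else elements
    while True:
        before = X.copy()
        for B in elements:             # one full cycle B1..B8
            X = thin_step(X, B)
        if np.array_equal(X, before):  # an idle cycle: converged
            return X
```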

Thickening

Thickening is the dual of thinning; it is used to grow selected regions of foreground pixels. In most cases in image processing, thickening is performed by thinning the background: [1]

$\operatorname{thicken}(X, B) = \left( X^c \otimes B \right)^c = \left( X^c \setminus (X^c \odot B) \right)^c$,

where $\setminus$ denotes the set-theoretical difference, $\odot$ denotes the hit-or-miss transform, $B$ is the structuring element, $X$ is the image being operated on, and $X^c$ is its complement (the background).
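Following this duality, a thickening sketch just thins the complement and complements back. Note that on a finite array the complement turns the border into foreground, so in practice the image should be padded with background first (thicken_morph is a hypothetical helper built on thin_morph above):

```python
def thicken_morph(X, elements=None):
    """Thicken X by thinning the background: (X^c ⊗ B1 ⊗ ...)^c."""
    X = X.astype(bool)
    return ~thin_morph(~X, elements)
```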


References

  1. Gonzalez, Rafael C.; Woods, Richard E. (2002). Digital Image Processing (2nd ed.). Upper Saddle River, N.J.: Prentice Hall. ISBN 0-201-18075-8. OCLC 48944550.