L-semi-inner product

In mathematics, there are two different notions of a semi-inner product. The first, and more common, is that of an inner product that is not required to be strictly positive. This article deals with the second, called an L-semi-inner product or a semi-inner product in the sense of Lumer, which is an inner product not required to be conjugate symmetric. It was formulated by Günter Lumer for the purpose of extending Hilbert space arguments to Banach spaces in functional analysis.[1] Fundamental properties were later explored by Giles.[2]

Definition

We mention again that the definition presented here is different from that of the "semi-inner product" in standard functional analysis textbooks, [3] where a "semi-inner product" satisfies all the properties of inner products (including conjugate symmetry) except that it is not required to be strictly positive.

A semi-inner-product, L-semi-inner-product, or semi-inner product in the sense of Lumer for a linear vector space $V$ over the field $\mathbb{C}$ of complex numbers is a function from $V \times V$ to $\mathbb{C}$, usually denoted by $[\cdot,\cdot]$, such that for all $x, y, z \in V$:

  1. Nonnegative-definiteness: $[x,x] \geq 0$
  2. Linearity in the 1st argument, meaning:
    1. Additivity in the 1st argument: $[x + y, z] = [x,z] + [y,z]$
    2. Homogeneity in the 1st argument: $[\lambda x, y] = \lambda [x,y]$ for all $\lambda \in \mathbb{C}$
  3. Conjugate homogeneity in the 2nd argument: $[x, \lambda y] = \overline{\lambda}\,[x,y]$ for all $\lambda \in \mathbb{C}$
  4. Cauchy–Schwarz inequality: $|[x,y]| \leq [x,x]^{1/2}\,[y,y]^{1/2}$
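These axioms can be checked numerically for the standard semi-inner-product consistent with the $\ell^p$ norm on $\mathbb{C}^n$, namely $[x,y] = \sum_j x_j \overline{y_j} |y_j|^{p-2} / \|y\|_p^{p-2}$ for $y \neq 0$. The following is a minimal sketch in NumPy (here $p = 4$; the tolerance constants are just for floating-point slack):

```python
import numpy as np

def sip(x, y, p=4):
    """Semi-inner-product [x, y] consistent with the l^p norm on C^n
    (formula valid for y != 0; a standard choice, not the only one)."""
    y_norm = np.linalg.norm(y, ord=p)
    return np.sum(x * np.conj(y) * np.abs(y) ** (p - 2)) / y_norm ** (p - 2)

rng = np.random.default_rng(0)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)
z = rng.standard_normal(5) + 1j * rng.standard_normal(5)
lam = 1.5 - 0.5j

# 1. Nonnegative-definiteness: [x, x] is real and nonnegative
assert abs(sip(x, x).imag) < 1e-12 and sip(x, x).real >= 0

# 2. Linearity in the first argument (additivity and homogeneity)
assert np.isclose(sip(x + z, y), sip(x, y) + sip(z, y))
assert np.isclose(sip(lam * x, y), lam * sip(x, y))

# 3. Conjugate homogeneity in the second argument: [x, lam*y] = conj(lam)*[x, y]
assert np.isclose(sip(x, lam * y), np.conj(lam) * sip(x, y))

# 4. Cauchy-Schwarz: |[x, y]|^2 <= [x, x] * [y, y]
assert abs(sip(x, y)) ** 2 <= sip(x, x).real * sip(y, y).real + 1e-12
```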

Difference from inner products

A semi-inner-product differs from an inner product in that it is in general not conjugate symmetric, that is,

$[x,y] \neq \overline{[y,x]}$

generally. This is equivalent to saying that [4]

$[x, y + z] \neq [x,y] + [x,z].$

In other words, semi-inner-products are in general nonlinear in the second argument.
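The failure of conjugate symmetry, and the equivalent failure of additivity in the second argument, is easy to exhibit with the $\ell^p$ semi-inner-product $[x,y] = \sum_j x_j \overline{y_j} |y_j|^{p-2} / \|y\|_p^{p-2}$; a small sketch with $p = 4$ and real vectors:

```python
import numpy as np

def sip(x, y, p=4):
    """l^p semi-inner-product on C^n (y != 0)."""
    y_norm = np.linalg.norm(y, ord=p)
    return np.sum(x * np.conj(y) * np.abs(y) ** (p - 2)) / y_norm ** (p - 2)

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
z = np.array([0.5, -1.0])

# Not conjugate symmetric: [x, y] != conj([y, x]) in general
assert not np.isclose(sip(x, y), np.conj(sip(y, x)))

# Equivalently, not additive in the second argument:
# [x, y + z] != [x, y] + [x, z]
assert not np.isclose(sip(x, y + z), sip(x, y) + sip(x, z))
```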

Semi-inner-products for normed spaces

If $[\cdot,\cdot]$ is a semi-inner-product for a linear vector space $V$, then

$\|x\| := [x,x]^{1/2}, \qquad x \in V,$

defines a norm on $V$. Conversely, if $V$ is a normed vector space with the norm $\|\cdot\|$, then there always exists a (not necessarily unique) semi-inner-product on $V$ that is consistent with the norm on $V$ in the sense that[citation needed]

$\|x\| = [x,x]^{1/2} \quad \text{for all } x \in V.$
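For the $\ell^p$ semi-inner-product $[x,y] = \sum_j x_j \overline{y_j} |y_j|^{p-2} / \|y\|_p^{p-2}$ ($y \neq 0$), this consistency can be checked numerically; the sketch below also confirms that for $p = 2$ the formula collapses to the ordinary Hilbert space inner product:

```python
import numpy as np

def sip(x, y, p):
    """l^p semi-inner-product on C^n (y != 0); a standard choice, not unique."""
    y_norm = np.linalg.norm(y, ord=p)
    return np.sum(x * np.conj(y) * np.abs(y) ** (p - 2)) / y_norm ** (p - 2)

rng = np.random.default_rng(1)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Consistency with the norm: [x, x] = ||x||_p^2 for each p
for p in (1.5, 2, 3, 4):
    assert np.isclose(sip(x, x, p), np.linalg.norm(x, ord=p) ** 2)

# For p = 2 the semi-inner-product is the ordinary inner product <x, y>
assert np.isclose(sip(x, y, 2), np.vdot(y, x))
```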

Examples

The Euclidean space $\mathbb{C}^n$ with the $\ell^p$ norm, $1 \leq p < +\infty$,

$\|x\|_p := \Big( \sum_{j=1}^n |x_j|^p \Big)^{1/p}$

has the consistent semi-inner-product

$[x,y] := \frac{\sum_{j=1}^n x_j \overline{y_j}\,|y_j|^{p-2}}{\|y\|_p^{p-2}}, \qquad y \neq 0,\ 1 < p < +\infty,$

$[x,y] := \|y\|_1 \sum_{j=1}^n x_j \operatorname{sgn}(\overline{y_j}), \qquad p = 1,$

where

$\operatorname{sgn}(t) := \begin{cases} \dfrac{t}{|t|}, & t \neq 0, \\ 0, & t = 0. \end{cases}$

In general, the space $L^p(\Omega, \mu)$ of $p$-integrable functions on a measure space $(\Omega, \mu)$, where $1 \leq p < +\infty$, with the norm

$\|f\|_p := \Big( \int_\Omega |f(t)|^p \, d\mu(t) \Big)^{1/p},$

possesses the consistent semi-inner-product

$[f,g] := \frac{\int_\Omega f(t)\,\overline{g(t)}\,|g(t)|^{p-2}\,d\mu(t)}{\|g\|_p^{p-2}}, \qquad g \neq 0,\ 1 < p < +\infty,$

$[f,g] := \|g\|_1 \int_\Omega f(t)\,\operatorname{sgn}(\overline{g(t)})\,d\mu(t), \qquad p = 1.$
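The $L^p$ semi-inner-product $[f,g] = \int f\,\overline{g}\,|g|^{p-2}\,d\mu / \|g\|_p^{p-2}$ can be illustrated with a crude Riemann-sum discretisation on $[0,1]$; this is a sketch with arbitrarily chosen test functions, not a rigorous computation:

```python
import numpy as np

p = 3
t = np.linspace(0.0, 1.0, 20001)
dt = t[1] - t[0]

def lp_norm(f):
    """||f||_p on [0, 1], approximated by a Riemann sum."""
    return (np.sum(np.abs(f) ** p) * dt) ** (1.0 / p)

def sip(f, g):
    """Discretised semi-inner-product [f, g] on L^p([0, 1]); g != 0."""
    return np.sum(f * np.conj(g) * np.abs(g) ** (p - 2)) * dt / lp_norm(g) ** (p - 2)

f = np.sin(2 * np.pi * t)   # arbitrary test functions
g = np.exp(t)

# Consistency with the norm: [f, f] = ||f||_p^2
assert np.isclose(sip(f, f), lp_norm(f) ** 2)

# Cauchy-Schwarz: |[f, g]| <= ||f||_p * ||g||_p
assert np.abs(sip(f, g)) <= lp_norm(f) * lp_norm(g) + 1e-9
```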

Applications

  1. Following the idea of Lumer, semi-inner-products have been widely applied to study bounded linear operators on Banach spaces.[5][6][7]
  2. In 2007, Der and Lee applied semi-inner-products to develop large-margin classification in Banach spaces.[8]
  3. More recently, semi-inner-products have served as the main tool in establishing the concept of reproducing kernel Banach spaces for machine learning.[9]
  4. Semi-inner-products can also be used to establish a theory of frames and Riesz bases for Banach spaces.[10]

References

  1. G. Lumer, Semi-inner-product spaces, Transactions of the American Mathematical Society 100 (1961), 29–43. doi:10.2307/1993352. MR 0133024.
  2. J. R. Giles, Classes of semi-inner-product spaces, Transactions of the American Mathematical Society 129 (1967), 436–446.
  3. J. B. Conway. A Course in Functional Analysis. 2nd Edition, Springer-Verlag, New York, 1990, page 1.
  4. S. V. Phadke and N. K. Thakare, When an s.i.p. space is a Hilbert space?, The Mathematics Student 42 (1974), 193–194.
  5. S. Dragomir, Semi-inner Products and Applications, Nova Science Publishers, Hauppauge, New York, 2004.
  6. D. O. Koehler, A note on some operator theory in certain semi-inner-product spaces, Proceedings of the American Mathematical Society 30 (1971), 363–366.
  7. E. Torrance, Strictly convex spaces via semi-inner-product space orthogonality, Proceedings of the American Mathematical Society 26 (1970), 108–110.
  8. R. Der and D. Lee, Large-margin classification in Banach spaces, JMLR Workshop and Conference Proceedings 2: AISTATS (2007), 91–98.
  9. Haizhang Zhang, Yuesheng Xu and Jun Zhang, Reproducing kernel Banach spaces for machine learning, Journal of Machine Learning Research 10 (2009), 2741–2775.
  10. Haizhang Zhang and Jun Zhang, Frames, Riesz bases, and sampling expansions in Banach spaces via semi-inner products, Applied and Computational Harmonic Analysis 31 (1) (2011), 1–25.