Free-choice profiling

Free-choice profiling is a method for assessing the qualities of a thing by having a large number of subjects experience (view, taste, read, etc.) it and then describe it in their own words, as opposed to answering a fixed set of "yes-no-maybe" questions. All of the descriptions are then analyzed to determine a "consensus configuration" of qualities, usually through Generalized Procrustes analysis (GPA) or multiple factor analysis (MFA).[1]
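
Concretely, GPA aligns each panelist's product configuration to a running average by translation, scaling, and rotation, then takes the mean of the aligned configurations as the consensus. The following is a minimal sketch in Python with NumPy and SciPy; the panel data are invented, and each panelist's scores are assumed to have already been reduced to a common number of dimensions (in practice panelists use different numbers of descriptors, which GPA accommodates, e.g. by zero-padding):

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

# Invented data: 4 panelists each position the same 6 products in 2 dims.
rng = np.random.default_rng(0)
configs = [rng.random((6, 2)) for _ in range(4)]

def gpa(configs, n_iter=20):
    # Remove translation and scale differences between panelists.
    X = [c - c.mean(axis=0) for c in configs]
    X = [c / np.linalg.norm(c) for c in X]
    consensus = X[0]
    for _ in range(n_iter):
        # Rotate each configuration onto the current consensus ...
        X = [c @ orthogonal_procrustes(c, consensus)[0] for c in X]
        # ... then update the consensus as the mean aligned configuration.
        consensus = np.mean(X, axis=0)
    return consensus, X

consensus, aligned = gpa(configs)
print(consensus.round(3))  # the consensus "sample map" of the 6 products
```

Because the alignment uses only centering, scaling, and rigid rotations, each panelist's relative product distances are preserved; the consensus therefore captures agreement in perceptual structure even though the panelists' vocabularies differ.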

Free-choice profiling first emerged in 1984, but the original published model has since been modified by researchers into variations better suited to their particular uses.[2] For example, flash profiling, a technique employed by Jean-Marc Sieffermann, borrows the free-choice strategy of individual panelist vocabulary generation.[3] The method allows panelists to freely develop their own descriptors and scales.[4] One study found that free-choice profiling can provide more accurate sample maps than other sensory-characterization methodologies such as projective mapping and free sorting.[5]

Dr Françoise Wemelsfelder is a well-known scientist who has researched this field extensively, notably applying free-choice profiling to the assessment of animal behaviour and welfare.

Notes and references

  1. Pagès, Jérôme (2014). Multiple Factor Analysis by Example Using R. The R Series. London: Chapman & Hall/CRC. 272 pp. Includes a chapter comparing GPA and MFA on sensory data.
  2. Carpenter, Roland P.; Lyon, David H.; Hasdell, Terry A. (2000). Guidelines for Sensory Analysis in Food Product Development and Quality Control. Gaithersburg, MD: Aspen Publishers, Inc. p. 48. ISBN 0834216426.
  3. Lawless, Harry T.; Heymann, Hildegarde (2010). Sensory Evaluation of Food: Principles and Practices. New York: Springer Science & Business Media. p. 252. ISBN 9781441964878.
  4. Nollet, Leo M. L. (2012). Handbook of Meat, Poultry and Seafood Quality. John Wiley & Sons. ISBN 9781118352458.
  5. Varela, Paula; Ares, Gastón (2014). Novel Techniques in Sensory Characterization and Consumer Profiling. Boca Raton, FL: CRC Press. p. 373. ISBN 9781466566309.

Related Research Articles

Analytical chemistry: Study of the separation, identification, and quantification of the chemical components of materials

Analytical chemistry studies and uses instruments and methods to separate, identify, and quantify matter. In practice, separation, identification or quantification may constitute the entire analysis or be combined with another method. Separation isolates analytes. Qualitative analysis identifies analytes, while quantitative analysis determines the numerical amount or concentration. Analytical chemistry is the science of obtaining, processing, and communicating information about the composition and structure of matter. In other words, it is the art and science of determining what matter is and how much of it exists. It is one of the most popular fields of work for ACS chemists.

Meta-analysis: Statistical method that summarizes data from multiple sources

A meta-analysis is a statistical analysis that combines the results of multiple scientific studies. Meta-analyses can be performed when there are multiple scientific studies addressing the same question, with each individual study reporting measurements that are expected to have some degree of error. The aim then is to use approaches from statistics to derive a pooled estimate closest to the unknown common truth based on how this error is perceived.
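
As a concrete, hedged illustration of that pooling step, a simple fixed-effect meta-analysis weights each study's estimate by the inverse of its variance (random-effects models are a common alternative); the numbers below are invented:

```python
import numpy as np

# Invented effect estimates and standard errors from five studies.
effects = np.array([0.30, 0.45, 0.12, 0.38, 0.25])
se = np.array([0.10, 0.15, 0.08, 0.20, 0.12])

# Inverse-variance weighting: more precise studies count for more.
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

print(f"pooled estimate: {pooled:.3f} (SE {pooled_se:.3f})")
```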

Factor analysis is a statistical method used to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called factors. For example, it is possible that variations in six observed variables mainly reflect the variations in two unobserved (underlying) variables. Factor analysis searches for such joint variations in response to unobserved latent variables. The observed variables are modelled as linear combinations of the potential factors, plus "error" terms.
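
A brief sketch of that six-variables, two-factors situation on simulated data, using scikit-learn's FactorAnalysis (the data and dimensions are invented for illustration):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate 6 observed variables driven by 2 unobserved factors plus noise.
rng = np.random.default_rng(1)
factors = rng.normal(size=(500, 2))        # latent, never observed directly
loadings = rng.normal(size=(2, 6))         # how factors mix into variables
X = factors @ loadings + 0.3 * rng.normal(size=(500, 6))

fa = FactorAnalysis(n_components=2).fit(X)
print(fa.components_.round(2))  # estimated loadings, one row per factor
```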

Chemometrics is the science of extracting information from chemical systems by data-driven means. Chemometrics is inherently interdisciplinary, using methods frequently employed in core data-analytic disciplines such as multivariate statistics, applied mathematics, and computer science, in order to address problems in chemistry, biochemistry, medicine, biology and chemical engineering. In this way, it mirrors other interdisciplinary fields, such as psychometrics and econometrics.

Structural equation modeling: Form of causal modeling that fits networks of constructs to data

Structural equation modeling (SEM) is a label for a diverse set of methods used by scientists in both experimental and observational research across the sciences, business, and other fields. It is used most in the social and behavioral sciences. A definition of SEM is difficult without reference to highly technical language, but a good starting place is the name itself.

Sensory analysis is a scientific discipline that applies principles of experimental design and statistical analysis to the use of human senses for the purpose of evaluating consumer products. The discipline requires panels of human assessors, on whom the products are tested, and the recording of their responses. By applying statistical techniques to the results it is possible to draw inferences and insights about the products under test. Most large consumer goods companies have departments dedicated to sensory analysis. Sensory analysis is commonly broken down into three sub-sections.

Spatial analysis: Formal techniques which study entities using their topological, geometric, or geographic properties

Spatial analysis or spatial statistics includes any of the formal techniques which study entities using their topological, geometric, or geographic properties. Spatial analysis includes a variety of techniques, many still in their early development, using different analytic approaches and applied in fields ranging from astronomy, with its studies of the placement of galaxies in the cosmos, to chip fabrication engineering, with its use of "place and route" algorithms to build complex wiring structures. In a more restricted sense, spatial analysis is the technique applied to structures at the human scale, most notably in the analysis of geographic data or transcriptomics data.

Characterization (materials science): Study of material structure and properties

Characterization, when used in materials science, refers to the broad and general process by which a material's structure and properties are probed and measured. It is a fundamental process in the field of materials science, without which no scientific understanding of engineering materials could be ascertained. The scope of the term often differs; some definitions limit the term's use to techniques which study the microscopic structure and properties of materials, while others use the term to refer to any materials analysis process including macroscopic techniques such as mechanical testing, thermal analysis and density calculation. The scale of the structures observed in materials characterization ranges from angstroms, such as in the imaging of individual atoms and chemical bonds, up to centimeters, such as in the imaging of coarse grain structures in metals.

Generalized Procrustes analysis (GPA) is a method of statistical analysis that can be used to compare the shapes of objects, or the results of surveys, interviews, or panels. It was developed for analysing the results of free-choice profiling, a survey technique which allows respondents to describe a range of products in their own words or language. GPA is one way to make sense of free-choice profiling data; alternatives include multiple factor analysis (MFA) and the STATIS method. The method was first published by J. C. Gower in 1975.

Particle size analysis

Particle size analysis, particle size measurement, or simply particle sizing, is the collective name for the technical procedures, or laboratory techniques, that determine the size range and/or the average (mean) size of the particles in a powder or liquid sample.
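
For illustration, the summary statistics named above are simple to compute once individual particle diameters have been measured; the values here are invented:

```python
import numpy as np

# Invented particle diameters (micrometres) from a sizing instrument.
d = np.array([1.2, 3.4, 2.1, 5.6, 4.3, 2.8, 3.9, 1.7, 4.8, 3.1])

print(f"size range: {d.min():.1f}-{d.max():.1f} um")
print(f"mean size: {d.mean():.2f} um")
print(f"median size (D50): {np.median(d):.2f} um")
```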

Odor: Volatile chemical compounds perceived by the sense of smell

An odor or odour is caused by one or more volatilized chemical compounds, generally present in low concentrations, that humans and animals can perceive by their sense of smell. An odor is also called a "smell" or a "scent", which can refer to either a pleasant or an unpleasant odor.

Olfactometer

An olfactometer is an instrument used to detect and measure odor dilution. Olfactometers are used in conjunction with human subjects in laboratory settings, most often in market research, to quantify and qualify human olfaction. Olfactometers are used to gauge the odor detection threshold of substances. To measure intensity, olfactometers introduce an odorous gas as a baseline against which other odors are compared.
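
As a rough sketch of how a panel's detection threshold might be summarized, loosely modeled on the geometric-mean approach of standards such as ASTM E679 (the dilution data below are invented):

```python
import numpy as np

# Invented best-estimate dilution factors at which each of 8 panelists
# first detected the odorant (a larger factor means a more dilute sample).
best_estimate_dilutions = np.array([64, 128, 32, 256, 64, 128, 64, 32])

# Summarize the group detection threshold as the geometric mean,
# since dilution steps are multiplicative rather than additive.
threshold = np.exp(np.log(best_estimate_dilutions).mean())
print(f"group detection threshold: about 1:{threshold:.0f} dilution")
```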

Food rheology

Food rheology is the study of the rheological properties of food, that is, the consistency and flow of food under tightly specified conditions. The consistency, degree of fluidity, and other mechanical properties are important in understanding how long food can be stored, how stable it will remain, and in determining food texture. The acceptability of food products to the consumer is often determined by food texture, such as how spreadable and creamy a food product is. Food rheology is important in quality control during food manufacture and processing. Food rheology terms have been noted since ancient times. In ancient Egypt, bakers judged the consistency of dough by rolling it in their hands.

In statistics, multiple correspondence analysis (MCA) is a data analysis technique for nominal categorical data, used to detect and represent underlying structures in a data set. It does this by representing data as points in a low-dimensional Euclidean space. The procedure thus appears to be the counterpart of principal component analysis for categorical data. MCA can be viewed as an extension of simple correspondence analysis (CA) in that it is applicable to a large set of categorical variables.
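
A compact sketch of the MCA computation on invented nominal data: one-hot encode the categories into an indicator matrix, then run correspondence analysis on it via an SVD of the standardized residuals (normalization conventions vary between implementations):

```python
import numpy as np
import pandas as pd

# Invented nominal data: 6 respondents answering 2 categorical questions.
df = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "red"],
    "size":  ["S", "L", "M", "S", "M", "L"],
})

Z = pd.get_dummies(df).to_numpy(dtype=float)   # indicator (one-hot) matrix
P = Z / Z.sum()
r, c = P.sum(axis=1), P.sum(axis=0)            # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, s, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of the respondents on the first two dimensions.
rows = (U * s) / np.sqrt(r)[:, None]
print(rows[:, :2].round(3))
```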

Pascalization, bridgmanization, high pressure processing (HPP) or high hydrostatic pressure (HHP) processing is a method of preserving and sterilizing food, in which a product is processed under very high pressure, leading to the inactivation of certain microorganisms and enzymes in the food. HPP has a limited effect on covalent bonds within the food product, thus maintaining both the sensory and nutritional aspects of the product. The technique was named after Blaise Pascal, a French scientist of the 17th century whose work included detailing the effects of pressure on fluids. During pascalization, more than 50,000 pounds per square inch may be applied for around fifteen minutes, leading to the inactivation of yeast, mold, and bacteria. Pascalization is also known as bridgmanization, named for physicist Percy Williams Bridgman.

Sensory design aims to establish an overall diagnosis of the sensory perceptions of a product, and define appropriate means to design or redesign it on that basis. It involves an observation of the diverse and varying situations in which a given product or object is used in order to measure the users' overall opinion of the product, its positive and negative aspects in terms of tactility, appearance, sound and so on.

Le Bail analysis is a whole diffraction pattern profile fitting technique used to characterize the properties of crystalline materials, such as structure. It was invented by Armel Le Bail around 1988.

Multiple factor analysis (MFA) is a factorial method devoted to the study of tables in which a group of individuals is described by a set of variables structured in groups. It may be seen as an extension of principal component analysis (PCA) when the variables are quantitative and of multiple correspondence analysis (MCA) when they are qualitative.
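
A minimal sketch of the MFA weighting idea, assuming quantitative variables and invented data (real implementations such as the R package FactoMineR handle the normalization conventions and qualitative groups more carefully):

```python
import numpy as np

# Invented data: 8 individuals described by two groups of variables.
rng = np.random.default_rng(2)
groups = [rng.normal(size=(8, 3)), rng.normal(size=(8, 5))]

# Weight each centered group by its first singular value so that no
# single group can dominate the first global dimension.
centered = [g - g.mean(axis=0) for g in groups]
weighted = [g / np.linalg.svd(g, compute_uv=False)[0] for g in centered]

# Global PCA (via SVD) on the concatenated, weighted table.
X = np.hstack(weighted)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                   # individuals' global coordinates
print(scores[:, :2].round(3))    # first two global dimensions
```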

Flavor lexicons or flavour lexicons are used by professional taste testers to develop and detail the sensory perception experienced from food. A lexicon is a word bank developed by professional taste testers to provide an objective, nuanced, and cross-cultural vocabulary for describing food.

Gas chromatography-olfactometry (GC-O) is a technique that integrates the separation of volatile compounds using a gas chromatograph with the detection of odour using an olfactometer. It was first applied in 1964 by Fuller and co-workers. While GC separates and quantifies volatile compounds from an extract, human olfaction detects the odour activity of each eluting compound. In this olfactometric detection, a human assessor may qualitatively determine whether a compound has odour activity or describe the odour perceived, or quantitatively evaluate the intensity of the odour or the duration of the odour activity. The olfactometric detection of compounds allows the assessment of the relationship between a quantified substance and the human perception of its odour, without the instrumental detection limits present in other kinds of detectors.