Quality of experience (QoE) is a measure of the delight or annoyance of a customer's experiences with a service (e.g., web browsing, phone call, TV broadcast). [1] QoE focuses on the entire service experience; it is a holistic concept, similar to the field of user experience, but with its roots in telecommunication. [2] QoE is an emerging multidisciplinary field based on social psychology, cognitive science, economics, and engineering science, focused on understanding overall human quality requirements.
In 2013, within the context of the COST Action QUALINET, QoE was defined as: [1]
The degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility and/or enjoyment of the application or service in the light of the user’s personality and current state.
This definition was adopted in 2016 by the International Telecommunication Union in Recommendation ITU-T P.10/G.100. [3] Various definitions of QoE had previously existed in the domain; the definition above has since found wide acceptance in the community.
QoE has historically emerged from Quality of Service (QoS), which attempts to objectively measure service parameters (such as packet loss rates or average throughput). QoS measurements usually relate to the media or network itself rather than to the customer. QoE, in contrast, is a purely subjective measure of the overall quality of the service from the user's perspective, capturing people's aesthetic and hedonic needs. [4]
QoE looks at a vendor's or purveyor's offering from the standpoint of the customer or end user, and asks, "What mix of goods, services, and support do you think will provide you with the perception that the total product is providing you with the experience you desired and/or expected?" It then asks, "Is this what the vendor/purveyor has actually provided?" If not, "What changes need to be made to enhance your total experience?" In short, QoE provides an assessment of human expectations, feelings, perceptions, cognition and satisfaction with respect to a particular product, service or application. [5]
QoE is a blueprint of all human subjective and objective quality needs and experiences arising from the interaction of a person with technology and with business entities in a particular context. [4] Although QoE is perceived as subjective, it is an important measure that matters to the customers of a service. Being able to measure it in a controlled manner helps operators understand what may be wrong with their services and how to improve them.
QoE aims to take into consideration every factor that contributes to a user's perceived quality of a system or service. These so-called "influence factors" have been identified and classified by Reiter et al. into human, system, and context factors. [6]
Studies in the field of QoE have typically focused on system factors, primarily due to the field's origin in the QoS and network engineering domains. Context factors are often kept constant through the use of dedicated test laboratories.
QoE is strongly related to but different from the field of User Experience (UX), which also focuses on users' experiences with services. Historically, QoE has emerged from telecommunication research, while UX has its roots in Human–Computer Interaction. [2] Both fields can be considered multi-disciplinary. In contrast to UX, the goal of improving QoE for users was more strongly motivated by economic needs. [7]
Wechsung and De Moor identify the following key differences between the fields: [2]
| | QoE | UX |
| --- | --- | --- |
| Origins | Telecommunication | Human–Computer Interaction |
| Driving Force | Technology-centered | Human-centered |
| Theoretical Basis | Measurement and instrumentation approaches; historical lack of theoretical frameworks | Non-instrumental research; theoretical background in hedonic psychology |
| Measurement and Evaluation | Predominantly quantitative research; empirical–positivist research | Predominantly qualitative methods; interpretative and constructivist research |
| Experience and Perceptions | Focus on "quality formation" and perception of quality | Focus on the "experience" concept |
As a measure of end-to-end performance at the service level from the user's perspective, QoE is an important metric for the design of systems and engineering processes. This is particularly relevant for video services because, due to their high traffic demands, poor network performance may strongly affect the user's experience. [8] [9] When designing systems, the expected QoE is therefore often taken into account, both as a system output metric and as an optimization goal.
To measure this level of QoE, human ratings can be used. The mean opinion score (MOS) is a widely used measure for assessing the quality of media signals. It is a limited form of QoE measurement, relating to a specific media type, in a controlled environment and without explicitly taking into account user expectations. The MOS as an indicator of experienced quality has been used for audio and speech communication, as well as for the assessment of the quality of Internet video, television and other multimedia signals, [10] and web browsing. [11] Due to the inherent limitations of expressing QoE as a single scalar value, the usefulness of the MOS is often debated. [12]
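For illustration, the following is a minimal sketch of how a MOS and a simple confidence interval might be computed from subjective ratings collected on a five-point absolute category rating (ACR) scale; the ratings array is a hypothetical example.

```python
import numpy as np

# Hypothetical ratings from a subjective test on a 5-point ACR scale
# (1 = bad, 2 = poor, 3 = fair, 4 = good, 5 = excellent).
ratings = np.array([4, 5, 3, 4, 4, 2, 5, 4, 3, 4])

# The MOS is the arithmetic mean of the individual opinion scores.
mos = ratings.mean()

# A 95% confidence interval (normal approximation) indicates how reliable
# the MOS estimate is for this number of ratings.
ci95 = 1.96 * ratings.std(ddof=1) / np.sqrt(len(ratings))

print(f"MOS = {mos:.2f} +/- {ci95:.2f}")
```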
Subjective quality evaluation requires a lot of human resources, making it a time-consuming process. Objective evaluation methods can provide quality results faster, but require dedicated computing resources. Since such instrumental video quality algorithms are often developed based on a limited set of subjective data, their QoE prediction accuracy may be low when compared to human ratings.
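Prediction accuracy of such objective models is commonly reported as the correlation between model outputs and subjective MOS values; the following minimal sketch (with hypothetical score arrays) illustrates that comparison.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical subjective MOS values and the corresponding predictions
# of an objective (instrumental) quality model for the same test clips.
subjective_mos = np.array([4.2, 3.1, 2.5, 4.8, 3.9, 1.7])
predicted_mos  = np.array([4.0, 3.4, 2.9, 4.5, 3.6, 2.2])

# Pearson correlation measures linear agreement; Spearman correlation
# measures monotonic (rank-order) agreement with the human ratings.
plcc, _ = pearsonr(subjective_mos, predicted_mos)
srocc, _ = spearmanr(subjective_mos, predicted_mos)

print(f"PLCC = {plcc:.3f}, SROCC = {srocc:.3f}")
```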
QoE metrics are often measured at the end devices and can conceptually be seen as the quality remaining after the distortions introduced during the preparation of the content and its delivery through the network, until it reaches the decoder at the end device. Several elements in the media preparation and delivery chain may introduce distortion, causing degradation of the content, and these elements can be considered "QoE-relevant" for the offered services. The causes of degradation apply to any multimedia service, not exclusively to video or speech. Typical degradations occur at the encoding system (compression degradation), the transport network, the access network (e.g., packet loss or packet delay), the home network (e.g., Wi-Fi performance) and the end device (e.g., decoding performance).
Several QoE-centric network management and bandwidth management solutions have been proposed, which aim to improve the QoE delivered to the end-users. [13] [14] [15] [16]
When managing a network, QoE fairness may be taken into account in order to keep users sufficiently satisfied (i.e., high QoE) in a fair manner. From a QoE perspective, network resources and multimedia services should be managed so as to guarantee specific QoE levels instead of classical QoS parameters, which are unable to reflect the actually delivered QoE. Purely QoE-centric management is challenged by the nature of the Internet itself, as the Internet protocols and architecture were not originally designed to support today's complex and highly demanding multimedia services.
As an example for an implementation of QoE management, network nodes can become QoE-aware by estimating the status of the multimedia service as perceived by the end-users. [17] This information can then be used to improve the delivery of the multimedia service over the network and proactively improve the users' QoE. [18] This can be achieved, for example, via traffic shaping. QoE management gives the service provider and network operator the capability to minimize storage and network resources by allocating only the resources that are sufficient to maintain a specific level of user satisfaction.
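As a simplified illustration of traffic shaping (a generic sketch, not a description of any particular QoE-management system), a token-bucket shaper limits the rate of one flow so that capacity remains available for more QoE-sensitive traffic; the rate and burst parameters below are hypothetical.

```python
import time

class TokenBucketShaper:
    """Simplified token-bucket traffic shaper: a flow may only send
    when enough tokens (bytes of budget) have accumulated."""

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps / 8.0      # token refill rate in bytes/s
        self.capacity = burst_bytes     # maximum burst size in bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens proportionally to the elapsed time.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True                 # packet conforms, send immediately
        return False                    # over budget: delay or drop the packet

# Hypothetical example: shape a background flow to 2 Mbit/s with a 64 kB burst.
shaper = TokenBucketShaper(rate_bps=2_000_000, burst_bytes=64_000)
print(shaper.allow(1500))  # typical Ethernet-sized packet
```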
As it may involve limiting resources for some users or services in order to increase the overall network performance and QoE, the practice of QoE management requires that net neutrality regulations are considered. [19]
Quality of service (QoS) is the description or measurement of the overall performance of a service, such as a telephony or computer network, or a cloud computing service, particularly the performance seen by the users of the network. To quantitatively measure quality of service, several related aspects of the network service are often considered, such as packet loss, bit rate, throughput, transmission delay, availability, jitter, etc.
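For illustration, a minimal sketch of how some of these QoS parameters (packet loss ratio, average one-way delay, and a simple jitter approximation) might be computed from hypothetical per-packet measurements:

```python
# Hypothetical per-packet records: (sequence number, one-way delay in ms),
# with None marking packets that never arrived.
packets = [(1, 20.1), (2, 22.3), (3, None), (4, 25.0), (5, 21.4)]

received = [delay for _, delay in packets if delay is not None]

# Packet loss ratio: fraction of sent packets that were never received.
loss_ratio = 1 - len(received) / len(packets)

# Average one-way transmission delay over the received packets.
avg_delay_ms = sum(received) / len(received)

# Jitter approximated as the mean absolute delay variation between
# consecutively received packets.
jitter_ms = sum(abs(a - b) for a, b in zip(received, received[1:])) / (len(received) - 1)

print(loss_ratio, avg_delay_ms, jitter_ms)
```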
In telecommunication, provisioning involves the process of preparing and equipping a network to allow it to provide new services to its users. In National Security/Emergency Preparedness telecommunications services, "provisioning" equates to "initiation" and includes altering the state of an existing priority service or capability.
Telephony is the field of technology involving the development, application, and deployment of telecommunication services for the purpose of electronic transmission of voice, fax, or data, between distant parties. The history of telephony is intimately linked to the invention and development of the telephone.
In information and communication technologies, context awareness refers to a capability to take into account the situation of entities, which may be users or devices but are not limited to those. Location is only the most obvious element of this situation. Context awareness thus generalizes location awareness, which is narrowly defined for mobile devices. Whereas location may determine how certain processes around a contributing device operate, context may be applied more flexibly with mobile users, especially users of smartphones. The term originated in ubiquitous computing (also called pervasive computing), which sought to link changes in the environment with computer systems that are otherwise static. It has also been applied to business theory in relation to contextual application design and business process management issues.
Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction focuses more on the ergonomics and usability of computing artifacts, and information science focuses on practices surrounding the collection, manipulation, and use of information.
Mean opinion score (MOS) is a measure used in the domain of Quality of Experience and telecommunications engineering, representing overall quality of a stimulus or system. It is the arithmetic mean over all individual "values on a predefined scale that a subject assigns to his opinion of the performance of a system quality". Such ratings are usually gathered in a subjective quality evaluation test, but they can also be algorithmically estimated.
Video quality is a characteristic of a video passed through a video transmission or processing system that describes perceived video degradation. Video processing systems may introduce some amount of distortion or artifacts in the video signal that negatively impact the user's perception of the system. For many stakeholders in video production and distribution, ensuring video quality is an important task.
Subjective video quality is video quality as experienced by humans. It is concerned with how video is perceived by a viewer and designates their opinion on a particular video sequence. It is related to the field of Quality of Experience. Measuring subjective video quality is necessary because objective quality assessment algorithms such as PSNR have been shown to correlate poorly with subjective ratings. Subjective ratings may also be used as ground truth to develop new algorithms.
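For reference, PSNR is defined from the mean squared error (MSE) between a reference frame I and a distorted frame K, which is why it captures pixel-wise differences rather than perceived quality:

```latex
\mathrm{PSNR} = 10 \log_{10}\!\left(\frac{\mathit{MAX}^2}{\mathrm{MSE}}\right),
\qquad
\mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(I(i,j)-K(i,j)\bigr)^2
```

where MAX is the maximum possible pixel value (255 for 8-bit video) and M, N are the frame dimensions.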
Fairness measures or metrics are used in network engineering to determine whether users or applications are receiving a fair share of system resources. There are several mathematical and conceptual definitions of fairness.
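One widely used example is Jain's fairness index, which for an allocation x₁, …, xₙ of a resource to n users is defined as:

```latex
\mathcal{J}(x_1, \ldots, x_n) = \frac{\left(\sum_{i=1}^{n} x_i\right)^2}{n \sum_{i=1}^{n} x_i^2}
```

It equals 1 when all users receive the same share and approaches 1/n when a single user receives everything.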
Sentiment analysis is the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study affective states and subjective information. Sentiment analysis is widely applied to voice-of-the-customer materials such as reviews and survey responses, online and social media, and healthcare materials, for applications that range from marketing to customer service to clinical medicine. With the rise of deep language models such as RoBERTa, more difficult data domains can also be analyzed, e.g., news texts, where authors typically express their opinion or sentiment less explicitly.
Latency refers to a short period of delay between when an audio signal enters a system and when it emerges. Potential contributors to latency in an audio system include analog-to-digital conversion, buffering, digital signal processing, transmission time, digital-to-analog conversion and the speed of sound in the transmission medium.
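As an illustrative piece of arithmetic, the latency contributed by buffering alone follows directly from the buffer size and the sample rate:

```latex
t_{\text{buffer}} = \frac{N_{\text{samples}}}{f_s},
\qquad
\text{e.g. } \frac{256}{48\,000\ \text{Hz}} \approx 5.3\ \text{ms}
```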
Mobile broadband is the marketing term for wireless Internet access via mobile networks. Access to the network can be made through a portable modem, wireless modem, or a tablet/smartphone or other mobile device. The first wireless Internet access became available in 1991 as part of the second generation (2G) of mobile phone technology. Higher speeds became available in 2001 and 2006 as part of the third (3G) and fourth (4G) generations. In 2011, 90% of the world's population lived in areas with 2G coverage, while 45% lived in areas with 2G and 3G coverage. Mobile broadband uses the spectrum of 225 MHz to 3700 MHz.
Edholm's law, proposed by and named after Phil Edholm, refers to the observation that the three categories of telecommunication, namely wireless (mobile), nomadic and wired networks (fixed), are in lockstep and gradually converging. Edholm's law also holds that data rates for these telecommunications categories increase on similar exponential curves, with the slower rates trailing the faster ones by a predictable time lag. Edholm's law predicts that the bandwidth and data rates double every 18 months, which has proven to be true since the 1970s. The trend is evident in the cases of Internet, cellular (mobile), wireless LAN and wireless personal area networks.
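Expressed as a formula, a doubling every 18 months corresponds to exponential growth of the form:

```latex
R(t) \approx R_0 \cdot 2^{\,t / 1.5}
```

where R₀ is the data rate at a reference point in time and t is the elapsed time in years.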
Perceptual Evaluation of Video Quality (PEVQ) is an end-to-end (E2E) measurement algorithm to score the picture quality of a video presentation by means of a 5-point mean opinion score (MOS). It is, therefore, a video quality model. PEVQ was benchmarked by the Video Quality Experts Group (VQEG) in the course of the Multimedia Test Phase 2007–2008. Based on the performance results, in which the accuracy of PEVQ was tested against ratings obtained by human viewers, PEVQ became part of the new International Standard.
In computing, bandwidth is the maximum rate of data transfer across a given path. Bandwidth may be characterized as network bandwidth, data bandwidth, or digital bandwidth.
Adaptive bitrate streaming is a technique used in streaming multimedia over computer networks, in which the quality (bitrate) of the media stream is adjusted in real time to match the available network throughput and the capabilities of the playback device.
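A minimal sketch of a purely throughput-based bitrate selection rule follows; the bitrate ladder and safety margin are hypothetical, and real players also take buffer levels and other signals into account.

```python
# Hypothetical bitrate ladder in bit/s, e.g. from a DASH or HLS manifest.
LADDER = [400_000, 1_000_000, 2_500_000, 5_000_000, 8_000_000]

def select_bitrate(measured_throughput_bps: float, safety: float = 0.8) -> int:
    """Pick the highest rung that fits within a safety margin of the
    measured throughput; fall back to the lowest rung otherwise."""
    budget = measured_throughput_bps * safety
    candidates = [rate for rate in LADDER if rate <= budget]
    return max(candidates) if candidates else LADDER[0]

print(select_bitrate(3_200_000))  # -> 2500000 with the 0.8 margin
```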
Software-defined networking (SDN) is an approach to network management that enables dynamic and programmatically efficient network configuration to improve network performance and monitoring in a manner more akin to cloud computing than to traditional network management. SDN is meant to improve the static architecture of traditional networks and may be employed to centralize network intelligence in one network component by disassociating the forwarding process of network packets from the routing process. The control plane consists of one or more controllers, which are considered the brains of the SDN network, where the whole intelligence is incorporated. However, centralization has certain drawbacks related to security, scalability and elasticity.
Traffic classification is an automated process which categorises computer network traffic according to various parameters into a number of traffic classes. Each resulting traffic class can be treated differently in order to differentiate the service implied for the data generator or consumer.
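As a deliberately simplistic illustration (real classifiers use far richer features such as flow statistics or deep packet inspection), traffic could be mapped to classes by destination port alone; the port-to-class mapping below is a hypothetical example.

```python
# Hypothetical mapping of well-known destination ports to traffic classes.
PORT_CLASSES = {
    443: "web",          # HTTPS
    53: "dns",
    5060: "voip",        # SIP signalling
    1935: "streaming",   # RTMP
}

def classify(dst_port: int) -> str:
    """Assign a traffic class by destination port, defaulting to best effort."""
    return PORT_CLASSES.get(dst_port, "best-effort")

print(classify(443), classify(8080))  # -> web best-effort
```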
Fog computing or fog networking, also known as fogging, is an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication locally, before routing the traffic over the Internet backbone.
5G network slicing is a network architecture that enables the multiplexing of virtualized and independent logical networks on the same physical network infrastructure. Each network slice is an isolated end-to-end network tailored to fulfill diverse requirements requested by a particular application.