In software engineering, coupling is the degree of interdependence between software modules, a measure of how closely connected two routines or modules are, [1] and the strength of the relationships between modules. [2] Coupling is not binary but multi-dimensional. [3]
Coupling is usually contrasted with cohesion. Low coupling often correlates with high cohesion, and vice versa. Low coupling is often thought to be a sign of a well-structured computer system and a good design, and when combined with high cohesion, supports the general goals of high readability and maintainability. [citation needed]
The software quality metrics of coupling and cohesion were invented by Larry Constantine in the late 1960s as part of structured design, based on characteristics of “good” programming practices that reduced maintenance and modification costs. Structured design, including cohesion and coupling, was published in the article Stevens, Myers & Constantine (1974) [4] and the book Yourdon & Constantine (1979), [5] and the two terms subsequently became standard.
Coupling can be "low" (also "loose" and "weak") or "high" (also "tight" and "strong"). Some types of coupling, in order of highest to lowest coupling, are as follows:
A module here refers to a subroutine of any kind, i.e., a set of one or more statements having a name and preferably its own set of variable names.

- Content coupling (highest): one module uses or alters the internal workings of another, e.g., by accessing or modifying its local data, so changing how the second module produces its data may require changing the first.
- Common coupling: several modules share the same global data; changing the shared resource entails changing every module that uses it.
- External coupling: modules share an externally imposed data format, communication protocol, or device interface.
- Control coupling: one module controls the flow of another by passing it information on what to do, such as a what-to-do flag.
- Stamp coupling: modules share a composite data structure but each uses only part of it, possibly different parts.
- Data coupling (lowest): modules share data only through parameters, each of which is an elementary piece of data that is actually used.
For example, consider a UserProfile component that is designed to return the entire user profile in response to requests, even when consumers only require a specific attribute. This practice exemplifies stamp coupling and can lead to significant bandwidth issues, especially at scale. When any attribute within the UserProfile component changes, all consumers that interact with it may need to be retested, even if they do not use the modified attribute. [6]

In recent work, various other coupling concepts have been investigated and used as indicators for different modularization principles used in practice. [7]
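A minimal sketch of the UserProfile example above (all names are illustrative, not taken from any particular system): the stamp-coupled consumer receives the whole structure even though it reads one field, while the data-coupled consumer depends only on the value it actually uses.

```typescript
// Hypothetical profile record returned by the UserProfile component.
interface UserProfile {
  id: string;
  displayName: string;
  email: string;
  avatarUrl: string;
  // ...many more attributes
}

// Stamp coupling: the consumer receives the whole structure even
// though it only needs the display name. Any change to UserProfile
// may force this consumer to be retested.
function renderGreetingStamp(profile: UserProfile): string {
  return `Hello, ${profile.displayName}!`;
}

// Data coupling: the consumer depends only on the one value it
// uses, so changes to unrelated attributes cannot affect it.
function renderGreetingData(displayName: string): string {
  return `Hello, ${displayName}!`;
}
```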
The goal of defining and measuring dynamic coupling is to provide a run-time evaluation of a software system. It has been argued that static coupling metrics lose precision when dealing with an intensive use of dynamic binding or inheritance. [8] Dynamic coupling measures have been introduced in an attempt to solve this issue.
Semantic (or conceptual) coupling metrics consider the conceptual similarities between software entities using, for example, comments and identifiers, and relying on techniques such as latent semantic indexing (LSI).
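The following sketch illustrates the general idea with a much simpler stand-in for LSI: term-frequency vectors built from two modules' comments and identifiers, compared by cosine similarity. All tokens and function names here are illustrative assumptions.

```typescript
// Term-frequency vector over the words found in a module's
// comments and identifiers.
function termFrequencies(tokens: string[]): Map<string, number> {
  const tf = new Map<string, number>();
  for (const t of tokens) {
    const w = t.toLowerCase();
    tf.set(w, (tf.get(w) ?? 0) + 1);
  }
  return tf;
}

// Cosine similarity between two term-frequency vectors; values near
// 1 suggest the modules talk about the same concepts.
function cosineSimilarity(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0;
  for (const [word, fa] of a) dot += fa * (b.get(word) ?? 0);
  const norm = (v: Map<string, number>) =>
    Math.sqrt([...v.values()].reduce((s, f) => s + f * f, 0));
  return dot / (norm(a) * norm(b));
}

const invoice = termFrequencies(["invoice", "total", "tax", "customer"]);
const billing = termFrequencies(["customer", "invoice", "payment", "total"]);
console.log(cosineSimilarity(invoice, billing)); // high conceptual similarity
```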
Logical coupling (or evolutionary coupling or change coupling) analysis exploits the release history of a software system to find change patterns among modules or classes: e.g., entities that are likely to be changed together or sequences of changes (a change in a class A is always followed by a change in a class B).
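A minimal sketch of the mining step, assuming commit data has already been extracted from the version-control log into lists of changed files (the file names and the helper are hypothetical): pairs of files that frequently appear in the same commit are candidates for logical coupling.

```typescript
// Each commit is represented by the set of files it changed
// (e.g., parsed from a version-control log beforehand).
type Commit = string[];

// Count how often each pair of files changes in the same commit;
// frequently co-changing pairs suggest logical coupling.
function coChangeCounts(history: Commit[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const files of history) {
    const sorted = [...new Set(files)].sort();
    for (let i = 0; i < sorted.length; i++) {
      for (let j = i + 1; j < sorted.length; j++) {
        const key = `${sorted[i]} | ${sorted[j]}`;
        counts.set(key, (counts.get(key) ?? 0) + 1);
      }
    }
  }
  return counts;
}

const history: Commit[] = [
  ["A.ts", "B.ts"],
  ["A.ts", "B.ts", "C.ts"],
  ["C.ts"],
];
// "A.ts | B.ts" => 2, suggesting A and B evolve together.
console.log(coChangeCounts(history));
```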
According to Gregor Hohpe, coupling is multi-dimensional: components can depend on one another along several largely independent dimensions, such as technology, location, data format and type, semantics, conversation, order, and timing. [3]
Tightly coupled systems tend to exhibit the following developmental characteristics, which are often seen as disadvantages:

- A change in one module usually forces a ripple effect of changes in other modules.
- Assembly of modules might require more effort and/or time due to the increased inter-module dependency.
- A particular module might be harder to reuse and/or test because dependent modules must be included.
Whether loosely or tightly coupled, a system's performance is often reduced by message and parameter creation, transmission, translation (e.g., marshaling), and interpretation. A simple message, such as a reference to a string, array, or data structure, requires less overhead than a complicated message such as a SOAP message. Longer messages require more CPU and memory to produce. To optimize runtime performance, message length must be minimized and message meaning must be maximized.
One approach to decreasing coupling is functional design, which seeks to limit the responsibilities of modules along functionality. Coupling increases between two classes A and B if, for example (see the sketch after this list):

- A has an attribute that refers to (is of type) B;
- A calls on services of an object of type B;
- A has a method that references B, via its return type or a parameter;
- A is a subclass of (or implements) class B.
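A minimal sketch showing all four cases at once, using illustrative classes A and B:

```typescript
class B {
  service(): void { /* ... */ }
}

// Each member below adds a dependency from A to B.
class A extends B {            // A is a subclass of B
  private helper: B;           // attribute of type B

  constructor(helper: B) {
    super();
    this.helper = helper;
  }

  use(): void {
    this.helper.service();     // A calls a service of a B object
  }

  make(): B {                  // method referencing B via its return type
    return new B();
  }
}
```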
Low coupling refers to a relationship in which one module interacts with another module through a simple and stable interface and does not need to be concerned with the other module's internal implementation (see Information Hiding).
Systems such as CORBA or COM allow objects to communicate without knowing anything about each other's implementation. Both systems even allow objects to communicate with objects written in other languages.
Coupling describes the degree and nature of dependency between software components, focusing on what they share (e.g., data, control flow, technology) and how tightly they are bound. It evaluates two key dimensions: strength, which measures how difficult it is to change the dependency, and scope (or visibility), which indicates how widely the dependency is exposed across modules or boundaries. Traditional coupling types typically include content coupling, common coupling, control coupling, stamp coupling, external coupling, and data coupling. [9] [10] [11]
Connascence, introduced by Meilir Page-Jones, provides a systematic framework for analyzing and measuring coupling dependencies. It evaluates dependencies along three dimensions: strength, which measures the effort required to refactor or modify the dependency; locality, which considers how physically or logically close dependent components are in the codebase; and degree, which measures how many components are affected by the dependency. Connascence can be categorized into static forms, detectable at compile time (such as dependencies on method signatures), and dynamic forms, detectable only at runtime (such as connascence of timing, value, or algorithm). [9] [10] [11]
Each coupling flavor can exhibit multiple types of connascence, a specific type, or, in rare cases, none at all, depending on how the dependency is implemented. Common types of connascence include connascence of name, type, position, and meaning. Certain coupling types naturally align with specific connascence types; for example, data coupling often involves connascence of name or type. However, not every combination of coupling and connascence is practically meaningful. Dependencies relying on parameter order in a method signature demonstrate connascence of position, which is fragile and difficult to refactor because reordering parameters breaks the interface. In contrast, connascence of name, which relies on field or parameter names, is generally more resilient to change. Connascence types themselves exhibit a natural hierarchy of strength, with connascence of name typically considered weaker than connascence of meaning. [9] [10] [11]
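As an illustration, consider two hypothetical createUser variants: the positional signature exhibits connascence of position, while the named-argument signature reduces the dependency to connascence of name.

```typescript
// Connascence of position: callers must know the parameter order.
// Swapping the two string parameters silently breaks every caller.
function createUserPositional(name: string, email: string): void {
  // ...
}
createUserPositional("Ada", "ada@example.com");

// Connascence of name: callers depend on field names instead of
// order, so the interface tolerates reordering and extension.
function createUserNamed(args: { name: string; email: string }): void {
  // ...
}
createUserNamed({ email: "ada@example.com", name: "Ada" });
```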
Dependencies spanning module boundaries or distributed systems typically have higher coordination costs, increasing the difficulty of refactoring and propagating changes across distant boundaries. Modern practices, such as dependency injection and interface-based programming, are often employed to reduce coupling strength and improve the maintainability of dependencies. [9] [10] [11]
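A minimal sketch of constructor-based dependency injection against an interface (MessageSender, SmtpSender, and Notifier are illustrative names): the client depends only on a narrow, stable abstraction, so implementations can be swapped without touching it.

```typescript
// The client depends only on this narrow interface, not on any
// concrete implementation.
interface MessageSender {
  send(to: string, body: string): void;
}

class SmtpSender implements MessageSender {
  send(to: string, body: string): void {
    console.log(`SMTP mail to ${to}: ${body}`);
  }
}

// The dependency is injected through the constructor, so replacing
// the implementation (e.g., with a test double) requires no change
// to this class.
class Notifier {
  constructor(private sender: MessageSender) {}

  notify(user: string): void {
    this.sender.send(user, "Your report is ready.");
  }
}

new Notifier(new SmtpSender()).notify("ada@example.com");
```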
While coupling identifies what is shared between components, connascence evaluates how those dependencies behave, how changes propagate, and how difficult they are to refactor. Strength, locality, and degree are interrelated; dependencies that are strong, widely exposed, and span distant boundaries are significantly harder to refactor and maintain. Together, coupling provides a high-level overview of dependency relationships, while connascence offers a granular framework for analyzing dependency strength, locality, degree, and resilience to change, supporting the design of maintainable and robust systems. [9] [10] [11]
Coupling and cohesion are terms which occur together very frequently. Coupling refers to the interdependencies between modules, while cohesion describes how related the functions within a single module are. Low cohesion implies that a given module performs tasks which are not very related to each other and hence can create problems as the module becomes large.
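A brief illustration with hypothetical classes: the first module groups unrelated responsibilities (low cohesion), while the second serves a single unifying purpose.

```typescript
// Low cohesion: unrelated responsibilities grouped in one module,
// each likely to change for a different reason.
class MiscUtils {
  formatDate(d: Date): string { return d.toISOString(); }
  parsePrice(s: string): number { return Number(s.replace("$", "")); }
  sendEmail(to: string, body: string): void { /* ... */ }
}

// Higher cohesion: the module serves one purpose.
class PriceParser {
  parse(s: string): number { return Number(s.replace("$", "")); }
}
```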
Coupling in Software Engineering [12] describes a quantitative version of these concepts: a module coupling metric computed from the following counts.
For data and control flow coupling:
- di: number of input data parameters
- ci: number of input control parameters
- do: number of output data parameters
- co: number of output control parameters

For global coupling:
- gd: number of global variables used as data
- gc: number of global variables used as control

For environmental coupling:
- w: number of modules called (fan-out)
- r: number of modules calling the module under consideration (fan-in)

Coupling(C) = 1 − 1/(di + 2×ci + do + 2×co + gd + 2×gc + w + r)

Coupling(C) makes the value larger the more coupled the module is; it ranges from approximately 0.67 (low coupling) to 1.0 (highly coupled). For example, if a module has only a single input and a single output data parameter:

C = 1 − 1/(1 + 0 + 1 + 0 + 0 + 0 + 1 + 0) = 1 − 1/3 ≈ 0.67

If a module has 5 input and output data parameters, an equal number of control parameters, and accesses 10 items of global data, with a fan-in of 3 and a fan-out of 4:

C = 1 − 1/(5 + 2×5 + 5 + 2×5 + 10 + 0 + 4 + 3) = 1 − 1/47 ≈ 0.98
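The calculation can be expressed directly in code. The following sketch (the ModuleProfile shape is an assumption introduced here for illustration) reproduces both worked examples above:

```typescript
// Inputs to the module coupling metric described above.
interface ModuleProfile {
  di: number;  // input data parameters
  ci: number;  // input control parameters
  do_: number; // output data parameters ("do" is a reserved word)
  co: number;  // output control parameters
  gd: number;  // global variables used as data
  gc: number;  // global variables used as control
  w: number;   // fan-out: modules called
  r: number;   // fan-in: modules calling this one
}

function coupling(m: ModuleProfile): number {
  const denom =
    m.di + 2 * m.ci + m.do_ + 2 * m.co + m.gd + 2 * m.gc + m.w + m.r;
  return 1 - 1 / denom;
}

// First worked example from the text: ≈ 0.67
console.log(coupling({ di: 1, ci: 0, do_: 1, co: 0, gd: 0, gc: 0, w: 1, r: 0 }));
// Second worked example: ≈ 0.98
console.log(coupling({ di: 5, ci: 5, do_: 5, co: 5, gd: 10, gc: 0, w: 4, r: 3 }));
```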