Coupling (computer programming)

In software engineering, coupling is the degree of interdependence between software modules, a measure of how closely connected two routines or modules are, [1] and the strength of the relationships between modules. [2] Coupling is not binary but multi-dimensional. [3]

[Figure: Coupling and cohesion]

Coupling is usually contrasted with cohesion. Low coupling often correlates with high cohesion, and vice versa. Low coupling is often thought to be a sign of a well-structured computer system and a good design, and when combined with high cohesion, supports the general goals of high readability and maintainability.

History

The software quality metrics of coupling and cohesion were invented by Larry Constantine in the late 1960s as part of structured design, based on characteristics of "good" programming practices that reduced maintenance and modification costs. Structured design, including cohesion and coupling, was published in the article by Stevens, Myers & Constantine (1974) [4] and the book by Yourdon & Constantine (1979), [5] and these subsequently became standard terms.

Types of coupling

[Figure: Conceptual model of coupling]

Coupling can be "low" (also "loose" and "weak") or "high" (also "tight" and "strong"). Some types of coupling, in order of highest to lowest coupling, are as follows:

Procedural programming

A module here refers to a subroutine of any kind, i.e. a set of one or more statements having a name and preferably its own set of variable names.

Content coupling (high)
Content coupling is said to occur when one module uses the code of another module, for instance by branching into it. This violates information hiding – a basic software design concept.
Common coupling
Common coupling is said to occur when several modules have access to the same global data. This can lead to uncontrolled error propagation and unforeseen side effects when changes are made.
External coupling
External coupling occurs when two modules share an externally imposed data format, communication protocol, or device interface. It essentially concerns communication with external tools and devices.
Control coupling
Control coupling is one module controlling the flow of another, by passing it information on what to do (e.g., passing a what-to-do flag).
Stamp coupling (data-structured coupling)
Stamp coupling occurs when modules share a composite data structure and use only parts of it, possibly different parts (e.g., passing a whole record to a function that needs only one field of it).
In this situation, a modification in a field that a module does not need may lead to changing the way the module reads the record.
Data coupling
Data coupling occurs when modules share data through, for example, parameters. Each datum is an elementary piece, and these are the only data shared (e.g., passing an integer to a function that computes a square root).
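For illustration, the following sketch contrasts data, stamp, and control coupling; the record type and function names are invented for this example and are not part of any standard terminology.

    from dataclasses import dataclass

    @dataclass
    class EmployeeRecord:              # a composite data structure
        name: str
        monthly_salary: float
        department: str

    # Data coupling: only the elementary value that is needed is passed.
    def annual_salary(monthly_salary: float) -> float:
        return monthly_salary * 12

    # Stamp coupling: the whole record is passed although one field is used,
    # so the function now also depends on the record's layout.
    def annual_salary_from_record(employee: EmployeeRecord) -> float:
        return employee.monthly_salary * 12

    # Control coupling: a what-to-do flag from the caller steers the callee's logic.
    def format_salary(amount: float, as_html: bool) -> str:
        return f"<b>{amount:.2f}</b>" if as_html else f"{amount:.2f}"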

Object-oriented programming

Subclass coupling
Describes the relationship between a child and its parent. The child is connected to its parent, but the parent is not connected to the child.
Temporal coupling
Temporal coupling occurs when two actions are bundled together into one module just because they happen to occur at the same time.
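As a sketch of the idea (the task functions below are invented), two unrelated start-up actions bundled into one routine are temporally coupled; registering them independently removes the coupling.

    def rotate_log_files() -> None:
        print("rotating log files")        # maintenance concern

    def warm_connection_pool() -> None:
        print("warming connection pool")   # performance concern

    # Temporal coupling: the two actions share no data or purpose; they are
    # grouped only because both happen to run at start-up time.
    def on_startup() -> None:
        rotate_log_files()
        warm_connection_pool()

    # Looser alternative: each task is registered on its own, so one can be
    # moved or removed without editing the other.
    STARTUP_TASKS = [rotate_log_files, warm_connection_pool]

    def run_startup_tasks() -> None:
        for task in STARTUP_TASKS:
            task()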

In recent work, various other coupling concepts have been investigated and used as indicators for different modularization principles used in practice. [6]

Dynamic coupling

The goal of defining and measuring this type of coupling is to provide a run-time evaluation of a software system. It has been argued that static coupling metrics lose precision when dealing with an intensive use of dynamic binding or inheritance. [7] To address this issue, dynamic coupling measures have been proposed.
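A minimal sketch of the idea, assuming a simple decorator-based tracer (the classes and names below are invented): at run time each observed call records which class invoked which other class, and the collected pairs form the raw data for a dynamic coupling measure.

    import functools
    import inspect
    from collections import Counter

    call_pairs: Counter = Counter()   # (caller class, callee class) -> call count

    def trace_calls(method):
        """Record which class calls which other class at run time."""
        @functools.wraps(method)
        def wrapper(self, *args, **kwargs):
            caller_frame = inspect.currentframe().f_back
            caller_self = caller_frame.f_locals.get("self")
            if caller_self is not None and type(caller_self) is not type(self):
                call_pairs[(type(caller_self).__name__, type(self).__name__)] += 1
            return method(self, *args, **kwargs)
        return wrapper

    class Repository:
        @trace_calls
        def load(self):
            return {"id": 1}

    class Service:
        def __init__(self):
            self.repo = Repository()
        def handle(self):
            return self.repo.load()   # counted as Service -> Repository

    Service().handle()
    print(call_pairs)   # Counter({('Service', 'Repository'): 1})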

Semantic coupling

This kind of coupling metric considers the conceptual similarity between software entities, estimated, for example, from comments and identifiers using techniques such as latent semantic indexing (LSI).
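As a greatly simplified sketch (bag-of-words cosine similarity instead of full latent semantic indexing; the token lists are invented), conceptually similar vocabularies in two modules' identifiers and comments yield a higher similarity score.

    import math
    from collections import Counter

    def cosine_similarity(tokens_a, tokens_b):
        """Cosine similarity between two bags of identifier/comment tokens."""
        a, b = Counter(tokens_a), Counter(tokens_b)
        dot = sum(a[t] * b[t] for t in set(a) & set(b))
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    # Tokens extracted (hypothetically) from two modules' identifiers and comments.
    billing_tokens = ["invoice", "customer", "amount", "tax", "invoice"]
    payment_tokens = ["payment", "customer", "amount", "currency"]

    print(cosine_similarity(billing_tokens, payment_tokens))  # ~0.38: some conceptual overlap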

Logical coupling

Logical coupling (or evolutionary coupling or change coupling) analysis exploits the release history of a software system to find change patterns among modules or classes: e.g., entities that are likely to be changed together or sequences of changes (a change in a class A is always followed by a change in a class B).
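The basic signal can be computed from version-control history, as in the following sketch (the commit data is invented for illustration): files that repeatedly change in the same commits are change coupled, even if no static dependency exists between them.

    from collections import Counter
    from itertools import combinations

    # Each entry is the set of files touched by one (hypothetical) commit.
    commits = [
        {"order.py", "invoice.py"},
        {"order.py", "invoice.py", "README.md"},
        {"invoice.py"},
        {"order.py", "shipping.py"},
    ]

    co_changes: Counter = Counter()
    for files in commits:
        for pair in combinations(sorted(files), 2):
            co_changes[pair] += 1

    for (a, b), count in co_changes.most_common(3):
        print(f"{a} <-> {b}: changed together in {count} commit(s)")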

Dimensions of coupling

According to Gregor Hohpe, coupling is not a single property but has multiple independent dimensions. [3]

Disadvantages of tight coupling

Tightly coupled systems tend to exhibit the following developmental characteristics, which are often seen as disadvantages:

  1. A change in one module usually forces a ripple effect of changes in other modules.
  2. Assembly of modules might require more effort and/or time due to the increased inter-module dependency.
  3. A particular module might be harder to reuse and/or test because dependent modules must be included.

Performance issues

Whether loosely or tightly coupled, a system's performance is often reduced by the creation, transmission, translation (e.g. marshaling) and interpretation of messages and parameters. A simple message, such as a reference to a string, array or data structure, requires less overhead to create and interpret than a complicated message such as a SOAP message.

Message Transmission Overhead and Performance
Since a message must be transmitted in full to retain its complete meaning, message transmission must be optimized. Longer messages require more CPU and memory to transmit and receive. Also, when necessary, receivers must reassemble a message into its original state to completely receive it. Hence, to optimize runtime performance, message length must be minimized and message meaning must be maximized.
Message Translation Overhead and Performance
Message protocols and messages themselves often contain extra information (i.e., packet, structure, definition and language information). Hence, the receiver often needs to translate a message into a more refined form by removing extra characters and structure information and/or by converting values from one type to another. Any sort of translation increases CPU and/or memory overhead. To optimize runtime performance, message form and content must be reduced and refined to maximize its meaning and reduce translation.
Message Interpretation Overhead and Performance
All messages must be interpreted by the receiver. Simple messages such as integers might not require additional processing to be interpreted. However, complex messages such as SOAP messages require a parser and a string transformer to extract their intended meaning. To optimize runtime performance, messages must be refined and reduced to minimize interpretation overhead.
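For illustration (the XML payload below is invented and stands in for a verbose format such as SOAP), a primitive message can be used directly, while a structured message must first be parsed and converted before its value is usable.

    import xml.etree.ElementTree as ET

    # A primitive message needs essentially no interpretation.
    simple_message = 42
    doubled = simple_message * 2

    # A structured message carries packet/structure information that must be
    # parsed and converted before the payload value can be used.
    soap_like_message = """
    <Envelope>
      <Body>
        <GetValueResponse><value>42</value></GetValueResponse>
      </Body>
    </Envelope>
    """
    root = ET.fromstring(soap_like_message)
    value = int(root.find("./Body/GetValueResponse/value").text)
    doubled_from_xml = value * 2

    print(doubled, doubled_from_xml)   # 84 84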

Solutions

One approach to decreasing coupling is functional design, which seeks to limit the responsibilities of modules along functionality. Coupling increases between two classes A and B if, for example, A has an attribute or method parameter of type B, A calls the services of an object of type B, or A is a subclass of (or implements) B.

Low coupling refers to a relationship in which one module interacts with another module through a simple and stable interface and does not need to be concerned with the other module's internal implementation (see Information Hiding).

Systems such as CORBA or COM allow objects to communicate with each other without having to know anything about the other object's implementation. Both systems even allow objects to communicate with objects written in other languages.
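A minimal sketch of that style in Python (the class and function names are invented): the reporting function depends only on a small, stable interface and never sees the storage class's internals, so the implementation can change freely.

    from abc import ABC, abstractmethod

    class OrderStore(ABC):
        """Small, stable interface: the only thing callers need to know."""
        @abstractmethod
        def totals(self) -> list[float]: ...

    class SqlOrderStore(OrderStore):
        # Internal details (connection handling, SQL, caching) stay hidden.
        def totals(self) -> list[float]:
            return [19.99, 5.00, 42.50]   # stand-in for a real query

    def revenue_report(store: OrderStore) -> float:
        # Loosely coupled: works with any OrderStore implementation.
        return sum(store.totals())

    print(revenue_report(SqlOrderStore()))   # 67.49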

Coupling versus cohesion

Coupling and cohesion are terms which occur together very frequently. Coupling refers to the interdependencies between modules, while cohesion describes how related the functions within a single module are. Low cohesion implies that a given module performs tasks which are not very related to each other and hence can create problems as the module becomes large.

Module coupling

Software Engineering: A Practitioner's Approach [8] describes a module coupling metric computed from the following quantities.

For data and control flow coupling:

  - d_i: number of input data parameters
  - c_i: number of input control parameters
  - d_o: number of output data parameters
  - c_o: number of output control parameters

For global coupling:

  - g_d: number of global variables used as data
  - g_c: number of global variables used as control

For environmental coupling:

  - w: number of modules called (fan-out)
  - r: number of modules calling the module under consideration (fan-in)

Coupling(C) = 1 - 1/(d_i + 2·c_i + d_o + 2·c_o + g_d + 2·g_c + w + r)

Coupling(C) makes the value larger the more coupled the module is. This number ranges from approximately 0.67 (low coupling) to 1.0 (highly coupled).

For example, if a module has only a single input and a single output data parameter and interacts with one other module,

C = 1 - 1/(1 + 0 + 1 + 0 + 0 + 0 + 1 + 0) = 1 - 1/3 ≈ 0.67

If a module has 5 input and output data parameters, an equal number of control parameters, and accesses 10 items of global data, with a fan-in of 3 and a fan-out of 4,

C = 1 - 1/(5 + 2·5 + 5 + 2·5 + 10 + 0 + 4 + 3) = 1 - 1/47 ≈ 0.98
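The two worked examples can be checked with a short calculation; the sketch below (written for this article) is a direct transcription of the formula above.

    def module_coupling(di, ci, do, co, gd, gc, w, r):
        """Module coupling: C = 1 - 1/(di + 2*ci + do + 2*co + gd + 2*gc + w + r)."""
        return 1 - 1 / (di + 2 * ci + do + 2 * co + gd + 2 * gc + w + r)

    # One input and one output data parameter, one interaction with another module.
    print(round(module_coupling(di=1, ci=0, do=1, co=0, gd=0, gc=0, w=1, r=0), 2))  # 0.67

    # 5 data parameters in and out, 5 control parameters in and out,
    # 10 global data items, fan-in of 3 and fan-out of 4.
    print(round(module_coupling(di=5, ci=5, do=5, co=5, gd=10, gc=0, w=4, r=3), 2))  # 0.98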

See also

Cohesion (computer science)
Connascence
Loose coupling
Modular programming
Decomposition (computer science)
Single-responsibility principle
Structured analysis
Object-oriented analysis and design
Data-flow diagram
Software metric
Programming complexity
Larry Constantine
Edward Yourdon

References

  1. ISO/IEC/IEEE 24765:2010, Systems and software engineering — Vocabulary.
  2. ISO/IEC TR 19759:2005, Software Engineering — Guide to the Software Engineering Body of Knowledge (SWEBOK).
  3. Hohpe, Gregor. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions. Addison-Wesley Professional. ISBN 978-0321200686.
  4. Stevens, Wayne P.; Myers, Glenford J.; Constantine, Larry LeRoy (June 1974). "Structured design". IBM Systems Journal. 13 (2): 115–139. doi:10.1147/sj.132.0115.
  5. Yourdon, Edward; Constantine, Larry LeRoy (1979) [1975]. Structured Design: Fundamentals of a Discipline of Computer Program and Systems Design. Yourdon Press. ISBN 978-0-13-854471-3.
  6. Beck, Fabian; Diehl, Stephan (September 2011). "On the Congruence of Modularity and Code Coupling". Proceedings of the 19th ACM SIGSOFT Symposium and the 13th European Conference on Foundations of Software Engineering (SIGSOFT/FSE '11). Szeged, Hungary. p. 354. doi:10.1145/2025113.2025162. ISBN 9781450304436. S2CID 2413103.
  7. Arisholm, Erik; Briand, Lionel C.; Føyen, Audun (August 2004). "Dynamic coupling measurement for object-oriented software". IEEE Transactions on Software Engineering. 30 (8): 491–506. doi:10.1109/TSE.2004.41. hdl:10852/9090. S2CID 3074827.
  8. Pressman, Roger S. (1982). Software Engineering: A Practitioner's Approach (4th ed.). McGraw-Hill. ISBN 0-07-052182-4.
