Program comprehension

Program comprehension (also program understanding or [source] code comprehension) is a domain of computer science concerned with the ways software engineers maintain existing source code. The cognitive and other processes involved are identified and studied.[1] The results are used to develop tools and training.[2]

Software maintenance tasks fall into five categories: adaptive maintenance, corrective maintenance, perfective maintenance, code reuse, and code leverage.

Theories of program comprehension

Titles of works on program comprehension include

Computer scientists pioneering program comprehension include Ruven Brooks, Ted J. Biggerstaff, and Anneliese von Mayrhauser.

Related Research Articles

Computing – Activity involving calculations or computing machinery

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study of and experimentation with algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology and software engineering.

In computer programming and software design, code refactoring is the process of restructuring existing computer code—changing the factoring—without changing its external behavior. Refactoring is intended to improve the design, structure, and/or implementation of the software, while preserving its functionality. Potential advantages of refactoring may include improved code readability and reduced complexity; these can improve the source code's maintainability and create a simpler, cleaner, or more expressive internal architecture or object model to improve extensibility. Another potential goal for refactoring is improved performance; software engineers face an ongoing challenge to write programs that perform faster or use less memory.
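
As an illustration, the following minimal sketch shows an "extract function" refactoring in Python. The function names and the discount rule are invented for the example; the point is that external behavior is identical before and after.

    # Before: one function mixes validation, calculation, and formatting.
    def invoice_total_before(prices, discount):
        if not 0 <= discount <= 1:
            raise ValueError("discount must be between 0 and 1")
        total = sum(prices) * (1 - discount)
        return f"Total: {total:.2f}"

    # After: the calculation is extracted into its own function, so it can
    # be reused and tested in isolation; external behavior is unchanged.
    def apply_discount(prices, discount):
        if not 0 <= discount <= 1:
            raise ValueError("discount must be between 0 and 1")
        return sum(prices) * (1 - discount)

    def invoice_total_after(prices, discount):
        return f"Total: {apply_discount(prices, discount):.2f}"

    # The refactoring preserves observable behavior.
    assert invoice_total_before([10, 20], 0.1) == invoice_total_after([10, 20], 0.1)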

Software engineering is an engineering-based approach to software development. A software engineer is a person who applies the engineering design process to design, develop, test, maintain, and evaluate computer software. The term programmer is sometimes used as a synonym, but may emphasize software implementation over design and can also lack connotations of engineering education or skills.

In computer science, static program analysis is the analysis of computer programs performed without executing them, in contrast with dynamic program analysis, which is performed on programs during their execution.
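
For example, the following minimal sketch uses Python's standard-library ast module to inspect source code without executing it. The sample program and the bare-except check are illustrative assumptions, not a prescribed analysis.

    import ast
    import textwrap

    # A small program to analyze; it is parsed but never executed.
    source = textwrap.dedent("""
        def risky():
            try:
                return 1 / 0
            except:
                pass
    """)

    tree = ast.parse(source)
    for node in ast.walk(tree):
        # A bare "except:" clause carries no exception type.
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            print(f"line {node.lineno}: bare except clause")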

Software maintenance in software engineering is the modification of a software product after delivery to correct faults, to improve performance or other attributes.

James Cordy – Canadian computer scientist and educator

James Reginald Cordy is a Canadian computer scientist and educator who is Professor Emeritus in the School of Computing at Queen's University. As a researcher he is most recently active in the fields of source code analysis and manipulation, software reverse and re-engineering, and pattern analysis and machine intelligence. He has a long record of previous work in programming languages, compiler technology, and software architecture.

Mental model – Explanation of someone's thought process about how something works in the real world

A mental model in psychology is an internal representation of external reality, hypothesized to play a major role in cognition, reasoning and decision-making. The term was coined in 1943 by Kenneth Craik, who suggested that the mind constructs "small-scale models" of reality that it uses to anticipate events.

Software visualization or software visualisation refers to the visualization of information about software systems, such as the architecture of their source code or metrics of their runtime behavior, together with their development process, by means of static, interactive, or animated 2-D or 3-D visual representations of their structure, execution, behavior, and evolution.

Application discovery and understanding (ADU) is the process of automatically analyzing artifacts of a software application and determining metadata structures associated with the application in the form of lists of data elements and business rules. The relationships discovered between the application and a central metadata registry are then stored in the metadata registry itself.

Rigi is an interactive graph editor tool for software reverse engineering using the white-box method, i.e., it requires access to source code, and it is thus mainly aimed at program comprehension. Rigi is distributed by its main author, Hausi A. Müller, and the Rigi research group at the University of Victoria.

Software evolution is the continual development of a piece of software after its initial release to address changing stakeholder and/or market requirements. Software evolution is important because organizations invest large amounts of money in their software and are completely dependent on it. Software evolution helps software adapt to changing business requirements, fix defects, and integrate with other changing systems in a software system environment.

The Bauhaus project is a software research collaboration among the University of Stuttgart, the University of Bremen, and a commercial spin-off company, Axivion, formerly called Bauhaus Software Technologies. The Bauhaus project serves the fields of software maintenance and software reengineering.

DRAKON – Algorithm mapping tool

DRAKON is a free and open-source algorithmic visual programming and modeling language, developed in 1986 as part of the Soviet Buran space program in response to the need to increase software development productivity. The visual language provides a uniform way to represent processes in flowcharts.

In software development, effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and noisy input. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses, pricing processes and bidding rounds.
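
As one concrete illustration, the basic COCOMO model, a classic effort-estimation formula, estimates effort for an "organic" project as 2.4 × KLOC^1.05 person-months. The sketch below applies it to an invented input of 12 KLOC; real estimates would calibrate the coefficients to local data.

    # Basic COCOMO, organic mode: effort = a * KLOC ** b person-months.
    def basic_cocomo_effort(kloc, a=2.4, b=1.05):
        """Estimated development effort in person-months."""
        return a * kloc ** b

    # Hypothetical 12 KLOC project; prints roughly 32.6 person-months.
    print(f"{basic_cocomo_effort(12):.1f} person-months")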

Reverse engineering is a process or method by which one attempts to understand, through deductive reasoning, how a previously made device, process, system, or piece of software accomplishes a task with very little insight into exactly how it does so. Depending on the system under consideration and the technologies employed, the knowledge gained during reverse engineering can help with repurposing obsolete objects, doing security analysis, or learning how something works.

Software archaeology or source code archeology is the study of poorly documented or undocumented legacy software implementations, as part of software maintenance. Software archaeology, named by analogy with archaeology, includes the reverse engineering of software modules, and the application of a variety of tools and processes for extracting and understanding program structure and recovering design information. Software archaeology may reveal dysfunctional team processes which have produced poorly designed or even unused software modules, and in some cases deliberately obfuscatory code may be found. The term has been in use for decades.

Software diagnosis refers to concepts, techniques, and tools for obtaining findings, conclusions, and evaluations about software systems and their implementation, composition, behaviour, and evolution. It serves as a means to monitor, steer, observe, and optimize software development, software maintenance, and software re-engineering, in the sense of a business intelligence approach specific to software systems. It is generally based on the automatic extraction, analysis, and visualization of corresponding information sources of the software system, although it can also be performed manually rather than automatically.
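
As a minimal illustration of such automatic extraction, the sketch below computes one crude diagnostic metric, non-blank lines of code per Python file under a directory. The directory path is an assumption, and real diagnosis tools extract far richer information.

    from pathlib import Path

    def loc_per_file(root):
        """Map each .py file under root to its count of non-blank lines."""
        counts = {}
        for path in Path(root).rglob("*.py"):
            text = path.read_text(encoding="utf-8", errors="ignore")
            counts[path] = sum(1 for line in text.splitlines() if line.strip())
        return counts

    # Print the five largest files as a rough "where is the bulk" signal.
    for path, loc in sorted(loc_per_file(".").items(), key=lambda kv: -kv[1])[:5]:
        print(f"{loc:6d}  {path}")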

Software Intelligence is insight into the inner workings and structural condition of software assets, produced by software designed to analyze database structure, software frameworks, and source code in order to better understand and control complex software systems in information technology environments. As with Business Intelligence (BI), Software Intelligence is produced by a set of software tools and techniques for mining data and the software's inner structure. Results are produced automatically and feed a knowledge base of technical documentation that is made available to business and software stakeholders, who use it to make informed decisions, measure the efficiency of software development organizations, communicate about software health, and prevent software catastrophes.

This glossary of computer science is a list of definitions of terms and concepts used in computer science, its sub-disciplines, and related fields, including terms relevant to software, data science, and computer programming.

References

  1. Letovsky, Stanley (1987-12-01). "Cognitive processes in program comprehension". Journal of Systems and Software. 7 (4): 325–339. doi:10.1016/0164-1212(87)90032-X. ISSN 0164-1212.
  2. Storey, Margaret-Anne (2005-05-15). "Theories, methods and tools in program comprehension: Past, present and future". 13th International Workshop on Program Comprehension (IWPC'05). USA: IEEE Computer Society. pp. 181–191. doi:10.1109/WPC.2005.38. ISBN 978-0-7695-2254-8.
