Brownfield development is a term commonly used in the information technology industry to describe problem spaces needing the development and deployment of new software systems in the immediate presence of existing (legacy) software applications/systems. This implies that any new software architecture must take into account and coexist with live software already in situ.
In contemporary civil engineering, brownfield land means a property, the expansion, redevelopment, or reuse of which may be complicated by the presence or potential presence of a hazardous substance, pollutant, or contaminant.
Brownfield development adds a number of improvements to conventional software engineering practices. These traditionally assume a "clean sheet of paper", tabula rasa or "greenfield land" target environment throughout the design and implementation phases of software development. Brownfield extends such traditions by insisting that the context (local landscape) of the system being created be factored into any development exercise. This requires a detailed knowledge of the systems, services and data in the immediate vicinity of the solution under construction.
Reliably re-engineering existing business and IT environments into modern, competitive, integrated architectures is non-trivial. The complexity of business and IT environments has been accumulating almost unchecked for forty years, making changes ever more expensive.
As a result, an increasing proportion of the effort of developing new business capabilities is spent on understanding and integrating with the existing complex system and business landscape rather than on delivering value. It has been observed that up to 75% of overall project effort is now spent on software integration and migration rather than on new functionality.
The IT industry as a whole has a poor success rate at delivering such large-scale change for its clients. The CHAOS survey from the Standish Group has tracked an overall improvement in IT project delivery success over the last twenty years, but even in 2006 large IT projects still failed more often than they succeeded. Engineering change in such environments has many parallels with the concerns of the construction industry in redeveloping industrial or contaminated sites: they are full of hazards and unexpected complexities, and tend to be risky and expensive to redevelop. The accumulated complexity of IT environments has made them “Brownfield” sites.
It is not the complexity of the new function or any new system characteristics that is the root of large project failures – it is the understanding and communication of the overall requirement (as identified in The Mythical Man-Month). To succeed, the requirements need to include a precise and thorough understanding of the constraints of the existing business and IT environment. Current “Greenfield” tooling and methods use early, informal and often imprecise abstractions that essentially ignore such complexity. Early, poorly informed abstractions are usually wrong and are often detected late in construction, resulting in delays, expensive rework and even failed developments. A Brownfield-oriented approach embraces existing complexity and uses it to reliably accelerate the overall solution engineering process, including enabling phased, incremental change wherever possible.
Brownfield takes the standard OMG model/pattern-driven approach and turns it on its head. Rather than taking the conventional approach of starting with a Conceptual model and driving down to Platform Specific Models and code generation, Brownfield starts by harvesting code and other existing artifacts and uses patterns to formally abstract upwards towards the Architecture and Business tier.
Standard Greenfield techniques are then used in combination to define the preferred business target. This “meet in the middle” technique is familiar from other development methods, but the extensive use of formal abstraction and the use of patterns for both discovery and generation is novel.
The underlying conceptual architecture of all Brownfield tools is known as VITA, which stands for Views, Inventory, Transformation and Artifacts. In a VITA architecture, the problem definition of the target space can be maintained as separate (though related) native "headfuls" of knowledge known as Views. The core advantage of a View is that it can be based on almost any formal tool. Brownfield does not impose a single tool or language on a problem space – a core tenet is that the headfuls continue to be maintained in their native forms and tools.
Native Views are then brought together and linked into a single Inventory. The Inventory is then used with a series of Transformation capabilities to produce the Artifacts that the solution needs.
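As a purely illustrative sketch, the VITA flow of linking native Views into a single Inventory and then transforming it into Artifacts might look like the following Python fragment. No actual Brownfield tool is shown; the view contents, function names and matching rule are all invented for illustration.

```python
# Illustrative sketch of the VITA flow: Views -> Inventory -> Transformation
# -> Artifacts. All names and structures here are hypothetical.

# Two "headfuls" of knowledge, each a stand-in for its native form and tool.
uml_view = {"entities": {"Customer": ["id", "name"]}}          # from a UML model
ddl_view = {"tables": {"CUSTOMER": ["ID", "NAME", "REGION"]}}  # from database DDL

def build_inventory(views):
    """Link separate native Views into a single Inventory by matching names."""
    inventory = {}
    for view in views:
        for kind in ("entities", "tables"):
            for name, fields in view.get(kind, {}).items():
                entry = inventory.setdefault(name.lower(),
                                             {"sources": [], "fields": set()})
                entry["sources"].append(kind)
                entry["fields"].update(f.lower() for f in fields)
    return inventory

def generate_artifact(inventory):
    """Transformation: emit a simple report Artifact from the Inventory."""
    lines = []
    for name, entry in sorted(inventory.items()):
        lines.append(f"{name}: fields={sorted(entry['fields'])} "
                     f"seen in {entry['sources']}")
    return "\n".join(lines)

inventory = build_inventory([uml_view, ddl_view])
print(generate_artifact(inventory))
# customer: fields=['id', 'name', 'region'] seen in ['entities', 'tables']
```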
Views can currently be imported from a wide variety of sources including UML, XML sources, DDL, spreadsheets, etc. The Analysis and Renovation Catalyst tool from IBM has taken this capability even further via the use of formal grammars and Abstract Syntax Trees to enable almost any program to be parsed and tokenized into a View for inclusion into the Inventory.
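The parse-and-tokenize step can be loosely illustrated with Python's built-in ast module; this is a stand-in for the grammar-driven parsing described above, not the Analysis and Renovation Catalyst's actual mechanism.

```python
import ast

source = """
def transfer(account_from, account_to, amount):
    account_from.balance -= amount
    account_to.balance += amount
"""

# Parse the program into an abstract syntax tree, then walk it to harvest
# a simple View: the functions defined and the names each one references.
tree = ast.parse(source)
view = {}
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        names = sorted({n.id for n in ast.walk(node)
                        if isinstance(n, ast.Name)})
        view[node.name] = names

print(view)  # {'transfer': ['account_from', 'account_to', 'amount']}
```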
The rapid discover, re-engineer, generate and test cycle used in this approach means that solutions can be refined iteratively in terms of their logical and physical definitions as more of the constraints become known and the solution architecture is refined.
Iterative Brownfield development can allow the gradual refinement of logical and physical architectures and incremental testing for the whole approach, resulting in development acceleration, improved solution quality and cheaper defect removal. Brownfield can also be used to generate solution documentation, ensuring it is always up to date and consistent across different viewpoints.
The Inventory that is created through Brownfield processes may be highly complex: an interconnected, multi-dimensional semantic network. The knowledge in the Inventory can be very fine-grained, highly detailed and interrelated, which makes it hard to understand and can create barriers to communication. Brownfield solves this problem by abstracting concepts, using known patterns in its Inventories to make a best-guess extraction and inference of higher-level relationships.
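This pattern-based inference of higher-level relationships can be sketched as follows; the triples in the Inventory and the "shares data with" pattern are invented for illustration.

```python
# Hypothetical fine-grained Inventory: (subject, relation, object) triples.
inventory = [
    ("OrderUI",      "calls",  "OrderService"),
    ("OrderService", "reads",  "ORDERS_TABLE"),
    ("OrderService", "writes", "ORDERS_TABLE"),
    ("BillingBatch", "reads",  "ORDERS_TABLE"),
]

def abstract(triples):
    """Apply a simple pattern: if A reads or writes a table that B also
    touches, infer the higher-level relationship 'A shares data with B'."""
    touches = {}
    for subj, rel, obj in triples:
        if rel in ("reads", "writes"):
            touches.setdefault(obj, set()).add(subj)
    inferred = set()
    for table, components in touches.items():
        comps = sorted(components)
        for i, a in enumerate(comps):
            for b in comps[i + 1:]:
                inferred.add((a, "shares data with", b))
    return inferred

print(abstract(inventory))
# {('BillingBatch', 'shares data with', 'OrderService')}
```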
Formal abstractions enable the complexity of the Inventory to be translated into simpler, but inherently accurate, representations for easier consumption by those that need to understand the problem space. These abstracted Inventory models can be used to automatically render multi-layered architecture representations in tools such as Second Life.
Such visualizations enable complex information to be shared and experienced by multiple individuals from around the globe in real time. This enhances both understanding and a sense of a single team.
In computing, a legacy system is an old method, technology, computer system, or application program, "of, relating to, or being a previous or outdated computer system", yet still in use. Often referencing a system as "legacy" means that it paved the way for the standards that would follow it. This can also imply that the system is out of date or in need of replacement.
Software architecture is the set of structures needed to reason about a software system and the discipline of creating such structures and systems. Each structure comprises software elements, relations among them, and properties of both elements and relations.
In software engineering, a software design pattern is a general, reusable solution to a commonly occurring problem within a given context in software design. It is not a finished design that can be transformed directly into source or machine code. Rather, it is a description or template for how to solve a problem that can be used in many different situations. Design patterns are formalized best practices that the programmer can use to solve common problems when designing an application or system.
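For example, the Observer pattern is a template describing how one object can notify many dependents of a state change. A minimal Python rendering (a sketch of the general structure, not a finished design) might be:

```python
class Subject:
    """Observer pattern: the subject notifies registered observers."""
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

# The pattern is a template: the same structure fits logging, GUIs, caches.
subject = Subject()
subject.attach(lambda event: print(f"logger saw: {event}"))
subject.attach(lambda event: print(f"cache invalidated by: {event}"))
subject.notify("price_changed")
```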
Software design is the process by which an agent creates a specification of a software artifact intended to accomplish goals, using a set of primitive components and subject to constraints. The term is sometimes used broadly to refer to "all the activity involved in conceptualizing, framing, implementing, commissioning, and ultimately modifying" the software, or more specifically "the activity following requirements specification and before programming, as ... [in] a stylized software engineering process."
Software development is the process of conceiving, specifying, designing, programming, documenting, testing, and bug fixing involved in creating and maintaining applications, frameworks, or other software components. Software development involves writing and maintaining the source code, but in a broader sense, it includes all processes from the conception of the desired software through the final manifestation, typically in a planned and structured process often overlapping with software engineering. Software development also includes research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products.
A modeling language is any artificial language that can be used to express information or knowledge or systems in a structure that is defined by a consistent set of rules. The rules are used for interpretation of the meaning of components in the structure.
In computer programming, a software framework is an abstraction in which software, providing generic functionality, can be selectively changed by additional user-written code, thus providing application-specific software. It provides a standard way to build and deploy applications and is a universal, reusable software environment that provides particular functionality as part of a larger software platform to facilitate the development of software applications, products and solutions.
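The defining trait, inversion of control, can be shown in a few lines: the framework owns the control flow and invokes user-written code at fixed extension points. The toy framework below is invented for illustration.

```python
# A toy framework: it owns the control flow and calls user code ("inversion
# of control"), rather than the user code calling into a library.
def run_framework(handlers, requests):
    for request in requests:
        handler = handlers.get(request["kind"], lambda r: "415 unsupported")
        print(handler(request))  # the framework decides when user code runs

# Application-specific behavior is supplied as user-written plug-in code.
handlers = {"greet": lambda r: f"hello, {r['name']}"}
run_framework(handlers, [{"kind": "greet", "name": "world"},
                         {"kind": "other"}])
```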
Software prototyping is the activity of creating prototypes of software applications, i.e., incomplete versions of the software program being developed. It is an activity that can occur in software development and is comparable to prototyping as known from other fields, such as mechanical engineering or manufacturing.
Object-oriented analysis and design (OOAD) is a technical approach for analyzing and designing an application, system, or business by applying object-oriented programming, as well as using visual modeling throughout the software development process to guide stakeholder communication and product quality.
A software factory is a structured collection of related software assets that aids in producing computer software applications or software components according to specific, externally defined end-user requirements through an assembly process. A software factory applies manufacturing techniques and principles to software development to mimic the benefits of traditional manufacturing. Software factories are generally involved with outsourced software creation.
Model-driven engineering (MDE) is a software development methodology that focuses on creating and exploiting domain models, which are conceptual models of all the topics related to a specific problem. Hence, it highlights and aims at abstract representations of the knowledge and activities that govern a particular application domain, rather than the computing concepts.
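A minimal illustration of the model-driven idea: a declarative domain model is written once and executable code is generated from it. The model format and generator here are invented for this sketch.

```python
# A tiny domain model: the conceptual artifact an MDE tool would maintain.
model = {"Customer": {"id": "int", "name": "str"}}

def generate_class(name, fields):
    """Generate Python source for one modeled entity."""
    params = ", ".join(f"{f}: {t}" for f, t in fields.items())
    body = "\n".join(f"        self.{f} = {f}" for f in fields)
    return f"class {name}:\n    def __init__(self, {params}):\n{body}\n"

source = "\n".join(generate_class(n, f) for n, f in model.items())
print(source)
exec(source)                    # materialize the generated class
print(Customer(1, "Ada").name)  # generated code is ordinary Python -> "Ada"
```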
Legacy modernization, also known as software modernization or platform modernization, refers to the conversion, rewriting or porting of a legacy system to modern computer programming languages, architectures, software libraries, protocols or hardware platforms. Legacy transformation aims to retain and extend the value of the legacy investment through migration to new platforms to benefit from the advantage of the new technologies.
Knowledge Discovery Metamodel (KDM) is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments that defines common metadata required for deep semantic integration of Application Lifecycle Management tools. KDM was designed as the OMG's foundation for software modernization, IT portfolio management and software assurance. KDM uses OMG's Meta-Object Facility to define an XMI interchange format between tools that work with existing software, as well as an abstract interface (API) for the next-generation assurance and modernization tools. KDM standardizes existing approaches to knowledge discovery in software engineering artifacts, also known as software mining.
Enterprise engineering is the body of knowledge, principles, and practices used to design all or part of an enterprise. An enterprise is a complex socio-technical system that comprises people, information, and technology that interact with each other and their environment in support of a common mission. One definition is: "an enterprise life-cycle oriented discipline for the identification, design, and implementation of enterprises and their continuous evolution", supported by enterprise modelling. The discipline examines each aspect of the enterprise, including business processes, information flows, material flows, and organizational structure. Enterprise engineering may focus on the design of the enterprise as a whole, or on the design and integration of certain business components.
Enterprise life cycle (ELC) in enterprise architecture is the dynamic, iterative process of changing the enterprise over time by incorporating new business processes, new technology, and new capabilities, as well as maintenance, disposition and disposal of existing elements of the enterprise.
A view model or viewpoints framework in systems engineering, software engineering, and enterprise engineering is a framework which defines a coherent set of views to be used in the construction of a system architecture, software architecture, or enterprise architecture. A view is a representation of the whole system from the perspective of a related set of concerns.
Model Driven Interoperability (MDI) is a methodological framework, which provides a conceptual and technical support to make interoperable enterprises using ontologies and semantic annotations, following model driven development (MDD) principles.
Systems-oriented design (S.O.D.) uses system thinking in order to capture the complexity of systems addressed in design practice. The main mission of S.O.D. is to build the designers' own interpretation and implementation of systems thinking. S.O.D. aims at enabling systems thinking to fully benefit from design thinking and practice, and design thinking and practice to fully benefit from systems thinking. S.O.D. addresses design for human activity systems, and can be applied to any kind of design problem ranging from product design and interaction design, through architecture to decision making processes and policy design.
The Knowledge Based Software Assistant (KBSA) was a research program funded by the United States Air Force. The goal of the program was to apply concepts from artificial intelligence to the problem of designing and implementing computer software. Software would be described by models in very high level languages (essentially equivalent to first order logic) and then transformation rules would transform the specification into efficient code. The Air Force hoped to be able to generate the software to control weapons systems and other command and control systems using this method. As software was becoming ever more critical to USAF weapons systems, it was realized that improving the quality and productivity of the software development process could have significant benefits for the military, as well as for information technology in other major US industries.
In software engineering, a microservice architecture is a variant of the service-oriented architecture structural style. It is an architectural pattern that arranges an application as a collection of loosely coupled, fine-grained services, communicating through lightweight protocols. One of its goals is that teams can develop and deploy their services independently of others. This is achieved by reducing dependencies in the code base, allowing developers to evolve their services with limited restrictions from users, and hiding additional complexity from users. As a consequence, organizations are able to grow their software quickly, as well as use off-the-shelf services more easily. Communication requirements are reduced. These benefits come at the cost of maintaining the decoupling: interfaces need to be designed carefully and treated as a public API. One technique is to have multiple interfaces on the same service, or multiple versions of the same service, so as not to disrupt existing users of the code.
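This versioning technique can be sketched with Python's standard-library HTTP server: two versions of the same interface are served side by side so existing clients keep working. The paths and payloads are invented for illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class UserService(BaseHTTPRequestHandler):
    """One service exposing two interface versions side by side."""
    def do_GET(self):
        if self.path == "/v1/user":        # original contract, kept stable
            body = {"name": "Ada Lovelace"}
        elif self.path == "/v2/user":      # evolved contract for new clients
            body = {"first": "Ada", "last": "Lovelace"}
        else:
            self.send_error(404)
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Blocks serving requests; existing /v1 clients are never disrupted.
    HTTPServer(("localhost", 8080), UserService).serve_forever()
```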