Software intelligence

Software intelligence is insight into the inner workings and structural condition of software assets, produced by software designed to analyze database structures, software frameworks, and source code in order to better understand and control complex software systems in information technology environments. [1] [2] Similarly to business intelligence (BI), software intelligence is produced by a set of software tools and techniques that mine data and the software's inner structure. The results are produced automatically and feed a knowledge base containing technical documentation and blueprints of the inner workings of applications, [3] which is made available to business and software stakeholders so that they can make informed decisions, [4] measure the efficiency of software development organizations, communicate about software health, and prevent software catastrophes. [5]
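As a minimal illustration of this mining idea, the sketch below (in Python, using only the standard-library ast module) extracts simple structural facts about each function in a source file and collects them into a toy knowledge base. The fact schema and the file name are hypothetical assumptions for illustration; real software intelligence platforms analyze many languages, database structures, and frameworks at far greater depth.

    # Illustrative sketch only: a toy "software intelligence" pass that mines
    # Python source with the standard-library ast module and records simple
    # structural facts in a knowledge base. The fact fields are hypothetical.
    import ast

    def mine_source(path: str) -> list[dict]:
        """Extract per-function facts (name, line, size, call targets)."""
        with open(path, encoding="utf-8") as f:
            tree = ast.parse(f.read(), filename=path)
        facts = []
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                # Collect the names of simple function calls made inside the body.
                calls = [
                    n.func.id
                    for n in ast.walk(node)
                    if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)
                ]
                facts.append({
                    "file": path,
                    "function": node.name,
                    "line": node.lineno,
                    "statements": len(node.body),
                    "calls": sorted(set(calls)),
                })
        return facts

    if __name__ == "__main__":
        # The knowledge base is just a list of fact records here; a real system
        # would persist them and derive blueprints, metrics, and health reports.
        knowledge_base = mine_source("example_module.py")  # hypothetical file
        for fact in knowledge_base:
            print(fact)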

History

The term software intelligence has been in use since 1979, when Kirk Paul Lafler, an American engineer, entrepreneur, and consultant, founded Software Intelligence Corporation. At that time, the term mainly related to SAS activities, in which Lafler has been an expert since 1979. [6]

In the early 1980s, Victor R. Basili co-authored several papers detailing a methodology for collecting valid software engineering data and for evaluating software development and its variations. [7] [8] In 2004, several software vendors in the software analysis field started using the term as part of their product naming and marketing strategies.

Then, in 2010, Ahmed E. Hassan and Tao Xie defined software intelligence as a "practice offering software practitioners up-to-date and pertinent information to support their daily decision-making processes", adding that "Software Intelligence should support decision-making processes throughout the lifetime of a software system". They predicted that software intelligence would have a "strong impact on modern software practice" in the coming decades. [9]

Capabilities

Because of the complexity and wide range of components and concerns involved in software, software intelligence is derived from several aspects of software:

Components

The capabilities of software intelligence platforms include an increasing number of components:

User Aspect

Several considerations must be addressed in order to successfully integrate software intelligence systems into a company. Ultimately, a software intelligence system must be accepted and used by its users for it to add value to the organization. As M. Storey noted in 2003, if the system does not add value to the users' mission, they simply do not use it. [20]

At the level of code and system representation, software intelligence systems must provide different levels of abstraction: an abstract view for designing, explaining, and documenting, and a detailed view for understanding and analyzing the software system. [21]
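As a hedged sketch of these two abstraction levels, the Python snippet below rolls detailed, function-level fact records (reusing the hypothetical record shape from the earlier sketch) up into an abstract, file-level summary. The field names are assumptions for illustration, not the schema of any particular tool.

    # Sketch: deriving an abstract (file-level) view from a detailed
    # (function-level) view, assuming the hypothetical fact records above.
    from collections import defaultdict

    def summarize(facts: list[dict]) -> dict:
        """Roll detailed per-function facts up into an abstract per-file view."""
        summary = defaultdict(lambda: {"functions": 0, "statements": 0})
        for fact in facts:
            entry = summary[fact["file"]]
            entry["functions"] += 1
            entry["statements"] += fact["statements"]
        return dict(summary)

    # Example input: two detailed records for one file (values are invented).
    facts = [
        {"file": "billing.py", "function": "charge", "statements": 12},
        {"file": "billing.py", "function": "refund", "statements": 7},
    ]
    print(summarize(facts))
    # {'billing.py': {'functions': 2, 'statements': 19}}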

At the governance level, user acceptance of software intelligence covers several areas related to the inner functioning of the system as well as its output. It encompasses the following requirements:

Applications

Software intelligence has many applications in all businesses connected to the software environment, whether the software is for professionals, for individuals, or embedded. Depending on how its components are combined and used, applications relate to:

Marketplace

Software intelligence is a high-level discipline that has gradually grown to cover the applications listed above. Several markets drive the need for it:

References

  1. Dąbrowski, R. (2012). "On Architecture Warehouses and Software Intelligence". In: Kim T., Lee Y., Fang W. (eds.) Future Generation Information Technology. FGIT 2012. Lecture Notes in Computer Science, vol. 7709. Springer, Berlin, Heidelberg.
  2. Hinchey, Mike; Jain, Amit; Kaushik, Manju; Misra, Sanjay (January 2023). "Guest Editorial: Intelligence for systems and software engineering". Innovations in Systems and Software Engineering. Springer. 19 (1): 1–4. doi:10.1007/s11334-023-00526-1. PMC 9886201. PMID 36744022.
  3. Bartoszuk, C.; Dąbrowski, R.; Stencel, K.; Timoszuk, G. (June 2013). "On quick comprehension and assessment of software". Proceedings of the 14th International Conference on Computer Systems and Technologies, pp. 161–168. doi:10.1145/2516775.2516806.
  4. Buse, Raymond P. L.; Zimmermann, Thomas (June 2012). "Information needs for software development analytics". 2012 34th International Conference on Software Engineering (ICSE). IEEE, pp. 987–996. doi:10.1109/ICSE.2012.6227122.
  5. Hassan, Ahmed E.; Xie, Tao (2010). "Software intelligence: the future of mining software engineering data". Proceedings of the FSE/SDP Workshop on Future of Software Engineering Research (FoSER '10). ACM, New York, NY, USA, pp. 161–166.
  6. "Mr. Kirk Paul Lafler". 21 December 2015.
  7. Basili, Victor R. (1981). "Data collection, validation and analysis". Software Metrics: An Analysis and Evaluation (PDF). MIT Press. p. 143. ISBN 0-262-16083-8.
  8. Basili, Victor R.; Weiss, David M. (November 1984). "A Methodology for Collecting Valid Software Engineering Data". IEEE Transactions on Software Engineering. 10 (6): 728–738. doi:10.1109/TSE.1984.5010301. hdl:1903/7513.
  9. Hassan, Ahmed E.; Xie, Tao (2010). "Software intelligence: the future of mining software engineering data". Proceedings of the FSE/SDP Workshop on Future of Software Engineering Research (FoSER '10). ACM, New York, NY, USA, pp. 161–166. doi:10.1145/1882362.1882397.
  10. Nierstrasz, Oscar; Meijler, Theo Dirk (1995). "Research directions in software composition". ACM Computing Surveys. 27 (2): 262–264. doi:10.1145/210376.210389.
  11. Kanashiro, L.; et al. (2018). "Predicting software flaws with low complexity models based on static analysis data". Journal of Information Systems Engineering & Management. 3 (2): 17. doi:10.20897/jisem.201817.
  12. "ISO 25000:2005" (PDF). Archived (PDF) from the original on 2013-04-14. Retrieved 2013-10-18.
  13. Boehm, Barry W.; Sullivan, Kevin J. (2000). "Software economics: a roadmap". Proceedings of the Conference on the Future of Software Engineering. doi:10.1145/336512.336584.
  14. Novais, Renato; Santos, José Amancio; Mendonça, Manoel (2017). "Experimentally assessing the combination of multiple visualization strategies for software evolution analysis". Journal of Systems and Software. 128: 56–71. ISSN 0164-1212. doi:10.1016/j.jss.2017.03.006.
  15. Rolia, Jerome A.; Sevcik, Kenneth C. (1995). "The method of layers". IEEE Transactions on Software Engineering. 21 (8): 689–700. doi:10.1109/32.403785.
  16. "Software Engineering Rules on code quality". Object Management Group, Inc. 2023. Retrieved 15 December 2023.
  17. Balalaie, Armin; Heydarnoori, Abbas; Jamshidi, Pooyan (May–June 2016). "Microservices architecture enables DevOps: Migration to a cloud-native architecture". IEEE Software. 33 (3): 42–52. doi:10.1109/MS.2016.64.
  18. Feng, Q.; Kazman, R.; Cai, Y.; Mo, R.; Xiao, L. (2016). "Towards an Architecture-Centric Approach to Security Analysis". 2016 13th Working IEEE/IFIP Conference on Software Architecture (WICSA), Venice, pp. 221–230. doi:10.1109/WICSA.2016.41.
  19. Haas, R.; Niedermayr, R.; Juergens, E. (2019). "Teamscale: Tackle Technical Debt and Control the Quality of Your Software". 2019 IEEE/ACM International Conference on Technical Debt (TechDebt), Montreal, QC, Canada, pp. 55–56. doi:10.1109/TechDebt.2019.00016.
  20. Storey, M.-A. (2003). "Designing a Software Exploration Tool Using a Cognitive Framework". In: Zhang K. (ed.) Software Visualization. The Springer International Series in Engineering and Computer Science, vol. 734. Springer, Boston, MA.
  21. Lee, Seonah; Kang, Sungwon (2016). "What situational information would help developers when using a graphical code recommender?". Journal of Systems and Software. 117: 199–217. ISSN 0164-1212. doi:10.1016/j.jss.2016.02.050.
  22. Wallace, Linda G.; Sheetz, Steven D. (2014). "The adoption of software measures: A technology acceptance model (TAM) perspective". Information & Management. 51 (2): 249–259. ISSN 0378-7206. doi:10.1016/j.im.2013.12.003.
  23. Lippert, S. K.; Forman, H. (August 2005). "Utilization of information technology: examining cognitive and experiential factors of post-adoption behavior". IEEE Transactions on Engineering Management. pp. 363–381. Retrieved 8 December 2023.
  24. Banker, R. D.; Kemerer, C. F. (December 1992). "Performance Evaluation Metrics for Information Systems Development: A Principal-Agent Model". Information Systems Research. 3 (4): 379–400. Retrieved 8 December 2023.
  25. Crowne, M. (9 July 2003). "Why software product startups fail and what to do about it. Evolution of software product development in startup companies". IEEE International Engineering Management Conference. pp. 338–343. doi:10.1109/IEMC.2002.1038454. Retrieved 8 December 2023.
  26. Parnas, David Lorge (2011). "Precise Documentation: The Key to Better Software". The Future of Software Engineering, pp. 125–148. doi:10.1007/978-3-642-15187-3_8.
  27. LaValle, S.; Lesser, E.; Shockley, R.; Hopkins, M. S. (21 December 2010). "Big data, analytics and the path from insights to value". MIT Sloan Management Review. pp. 21–32. Retrieved 8 December 2023.
  28. Prašnikar, Janez; Debeljak, Žiga; Ahčan, Aleš (3 December 2010). "Benchmarking as a tool of strategic management". Total Quality Management & Business Excellence. 16 (2): 257–275. doi:10.1080/14783360500054400. Retrieved 8 December 2023.
  29. "Gartner Glossary - Applications Portfolio Analysis (APA)". Gartner, Inc. 2023. Retrieved 7 December 2023.
  30. "Gartner Research - Effective Strategies to Deliver Sustainable Cost Optimization in Application Services". Gartner, Inc. 4 October 2017. Retrieved 7 December 2017.
  31. "About the Automated Function Points Specification Version 1.0". Object Management Group. December 2013. Retrieved 7 December 2023.