Crowdsourcing software development

Crowdsourcing software development or software crowdsourcing is an emerging area of software engineering. It is an open call for participation in any task of software development, including documentation, design, coding and testing. These tasks are normally carried out either by members of a software enterprise or by people contracted by the enterprise. In software crowdsourcing, however, all of these tasks can be assigned to, or taken up by, members of the general public. Individuals and teams may also participate in crowdsourcing contests. [1]

Goals

Software crowdsourcing may have multiple goals. [2] [3]

Quality software: Crowdsourcing organizers need to define specific software quality goals and their evaluation criteria. Quality software often comes from competent contestants who can submit good solutions for rigorous evaluation.

Rapid acquisition: Instead of waiting for software to be developed, crowdsourcing organizers may post a competition hoping that something identical or similar has been developed already. This is to reduce software acquisition time.

Talent identification: A crowdsourcing organizer may be mainly interested in identifying talents as demonstrated by their performance in the competition.

Cost reduction: A crowdsourcing organizer may acquire software at low cost, paying only a small fraction of the development cost as prize money, since part of the reward may be non-monetary recognition.

Solution diversity: As teams will turn in different solutions for the same problem, the diversity in these solutions will be useful for fault-tolerant computing.

Idea creation: One goal is to obtain new ideas from contestants; these ideas may lead to new directions.

Broadening participation: One goal is to recruit as many participants as possible to obtain the best solution or to spread relevant knowledge.

Participant education: Organizers may be interested in teaching participants new knowledge. One example is nonamesite.com, sponsored by DARPA to teach STEM (Science, Technology, Engineering, and Mathematics).

Fund leveraging: The goal is to stimulate other organizations to sponsor similar projects to leverage funds.

Marketing: Crowdsourcing projects can be used for brand recognition among participants.

Ecosystem

Architecture support

A crowdsourcing support system needs to include 1) software development tools: requirements tools, design tools, coding tools, compilers, debuggers, IDEs, performance analysis tools, testing tools, and maintenance tools; 2) project management tools: ranking, reputation, and award systems for products and participants; 3) social network tools that allow participants to communicate with and support each other; and 4) collaboration tools, for example a blackboard platform where participants can see a common area and suggest ideas to improve the solutions presented there.
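As a rough illustration of the blackboard idea, the following Python sketch (not taken from any cited platform; all names are invented) shows a shared area where participants post candidate solutions and others attach visible improvement suggestions.

    # Minimal sketch of the "blackboard" collaboration idea: participants post
    # candidate solutions to a shared area, and others attach suggestions that
    # everyone can see.
    from dataclasses import dataclass, field


    @dataclass
    class Posting:
        author: str
        solution: str
        suggestions: list = field(default_factory=list)  # (author, comment) pairs


    class Blackboard:
        def __init__(self):
            self.postings = []

        def post_solution(self, author, solution):
            self.postings.append(Posting(author, solution))
            return len(self.postings) - 1  # id of the new posting

        def suggest(self, posting_id, author, comment):
            self.postings[posting_id].suggestions.append((author, comment))

        def view(self):
            for i, p in enumerate(self.postings):
                print(f"[{i}] {p.author}: {p.solution}")
                for who, comment in p.suggestions:
                    print(f"      suggestion from {who}: {comment}")


    if __name__ == "__main__":
        board = Blackboard()
        pid = board.post_solution("alice", "sort the work queue with a priority heap")
        board.suggest(pid, "bob", "consider ties: fall back to submission time")
        board.view()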

Social networks

Social networks can provide communication, documentation, blogs, microblogging (e.g., Twitter), wikis, comments, feedback, and indexing.

Organization

Processes

Any phase of software development can be crowdsourced: requirements (functional, user interface, performance), design (algorithm, architecture), coding (modules and components), testing (including security testing, user interface testing, and user experience testing), maintenance, user experience, or any combination of these. [4]

Existing software development processes can be modified to include crowdsourcing: 1) the waterfall model; 2) agile processes; 3) the model-driven approach; 4) the open-source approach; 5) the Software-as-a-Service (SaaS) approach, where service components can be published, discovered, composed, customized, simulated, and tested; and 6) formal methods, which can also be crowdsourced.
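A minimal sketch of the SaaS-style publish/discover/compose idea is shown below. The registry, tags, and services are invented for illustration; a real platform would also handle versioning, customization, simulation, and testing of the published services.

    # Illustrative sketch: service components are published to a registry,
    # discovered by keyword, and composed into a simple pipeline.
    class ServiceRegistry:
        def __init__(self):
            self._services = {}  # name -> (tags, callable)

        def publish(self, name, tags, func):
            self._services[name] = (set(tags), func)

        def discover(self, tag):
            return [name for name, (tags, _) in self._services.items() if tag in tags]

        def compose(self, names):
            funcs = [self._services[n][1] for n in names]

            def pipeline(value):
                for f in funcs:
                    value = f(value)
                return value

            return pipeline


    if __name__ == "__main__":
        registry = ServiceRegistry()
        registry.publish("tokenize", ["text"], lambda s: s.split())
        registry.publish("count", ["text", "metrics"], lambda tokens: len(tokens))
        print(registry.discover("text"))                            # ['tokenize', 'count']
        word_count = registry.compose(["tokenize", "count"])
        print(word_count("crowdsourcing software development"))     # 3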

Crowdsourcing can be competitive or non-competitive. In competitive crowdsourcing, only selected participants win; in highly competitive projects, many contestants compete but few win. In the non-competitive approach, individuals either participate alone or collaborate to create software. Products can be cross-evaluated to ensure their consistency and quality and to identify talent, and the cross-evaluation itself can be carried out by crowdsourcing.

Items developed by crowdsourcing can themselves be evaluated by crowdsourcing to judge the quality of the work produced, and the evaluation can in turn be crowdsourced to determine the quality of the evaluation.
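One simple way such cross-evaluation could work is sketched below: each submission is scored by several reviewers, the consensus (median) becomes the product score, and reviewers are themselves rated by how far they deviate from the consensus. The scoring scheme is purely illustrative, not a scheme described by the cited sources.

    from statistics import median


    def evaluate(scores_by_reviewer):
        """scores_by_reviewer: {reviewer: {submission: score}}"""
        submissions = {s for scores in scores_by_reviewer.values() for s in scores}
        # Consensus quality per submission: median of the reviewer scores.
        consensus = {
            s: median(scores[s] for scores in scores_by_reviewer.values() if s in scores)
            for s in submissions
        }
        # "Evaluating the evaluation": mean absolute deviation from consensus.
        reviewer_error = {
            reviewer: sum(abs(score - consensus[s]) for s, score in scores.items()) / len(scores)
            for reviewer, scores in scores_by_reviewer.items()
        }
        return consensus, reviewer_error


    if __name__ == "__main__":
        scores = {
            "r1": {"subA": 8, "subB": 5},
            "r2": {"subA": 7, "subB": 6},
            "r3": {"subA": 3, "subB": 9},  # outlier reviewer
        }
        consensus, error = evaluate(scores)
        print(consensus)  # product quality per submission
        print(error)      # higher value = reviewer disagrees more with the crowd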

Notable crowdsourcing processes include AppStori and Topcoder processes.

Pre-selection of participants is important for quality software crowdsourcing. In competitive crowdsourcing, a low-ranked participant should not compete against a high-ranked participant.
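A toy sketch of such rating-based pre-selection follows: participants are bucketed into divisions so that low-rated and high-rated members do not compete directly. The thresholds are arbitrary examples, not the divisions used by any actual platform.

    def assign_divisions(ratings, thresholds=(1200, 1800)):
        """ratings: {participant: rating}; returns {division_index: [participants]}"""
        divisions = {i: [] for i in range(len(thresholds) + 1)}
        for participant, rating in ratings.items():
            division = sum(rating >= t for t in thresholds)  # count thresholds cleared
            divisions[division].append(participant)
        return divisions


    if __name__ == "__main__":
        ratings = {"ann": 950, "ben": 1500, "chris": 2100, "dee": 1250}
        print(assign_divisions(ratings))
        # {0: ['ann'], 1: ['ben', 'dee'], 2: ['chris']}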

Platforms

Software crowdsourcing platforms such as Apple Inc.'s App Store, Topcoder, and uTest demonstrate the advantages of crowdsourcing in terms of software ecosystem expansion and product quality improvement. Apple's App Store is an online iOS application market where developers can deliver their creative designs and products directly to smartphone customers. These developers are motivated to contribute innovative designs both for reputation and for payment through the App Store's micro-payment mechanism. Within less than four years, Apple's App Store became a huge mobile application ecosystem with 150,000 active publishers and over 700,000 iOS applications. Around the App Store there are many community-based, collaborative platforms that act as incubators for smartphone applications. For example, AppStori introduces a crowdfunding approach to build an online community for developing promising ideas for new iPhone applications. IdeaScale is another platform for software crowdsourcing. [5]

Another crowdsourcing example—Topcoder—creates a software contest model where programming tasks are posted as contests and the developer of the best solution wins the top prize. Following this model, Topcoder has established an online platform to support its ecosystem and gathered a virtual global workforce with more than 1 million registered members and nearly 50,000 active participants. All these Topcoder members compete against each other in software development tasks such as requirement analysis, algorithm design, coding, and testing.

Sample processes

The Topcoder Software Development Process consists of a number of different phases, and within each phase there can be different competition types:[ citation needed ]

  1. Architecture;
  2. Component Production;
  3. Application Assembly;
  4. Deployment
Topcoder competition types and phases

Each step can be a crowdsourcing competition.
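The sketch below illustrates this phase-by-phase contest idea in Python. It is not Topcoder's actual engine: the artifacts and the length-based scoring are toy stand-ins for real deliverables and review scorecards.

    def run_contest(phase, submissions, score):
        """submissions: {member: artifact}; score: artifact -> number."""
        winner = max(submissions, key=lambda member: score(submissions[member]))
        print(f"{phase}: winner is {winner}")
        return submissions[winner]


    if __name__ == "__main__":
        phases = ["Architecture", "Component Production", "Application Assembly", "Deployment"]
        artifact = "requirements"
        for phase in phases:
            submissions = {
                "member1": artifact + f" -> {phase.lower()} (draft)",
                "member2": artifact + f" -> {phase.lower()} (detailed)",
            }
            # Toy scoring: longer artifact wins; the winning output feeds the next phase.
            artifact = run_contest(phase, submissions, score=len)
        print("final artifact:", artifact)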

BugFinders testing process: [6]

  1. Engage BugFinders;
  2. Define Projects;
  3. Managed by BugFinders;
  4. Review Bugs;
  5. Get Bugs Fixed; and
  6. Release Software.

Theoretical issues

Game theory has been used in the analysis of various software crowdsourcing projects. [2]

Information theory can be a basis for metrics.

Economic models can provide incentives for participation in crowdsourcing efforts.

Reference architecture

Crowdsourcing software development may follow different software engineering methodologies using different process models, techniques, and tools. It also has specific crowdsourcing processes involving unique activities such as bidding on tasks, allocating experts, evaluating quality, and integrating software.[ citation needed ] To support the crowdsourcing process and facilitate community collaboration, a platform is usually built to provide the necessary resources and services. For example, Topcoder follows a traditional software development process with competition rules embedded, while AppStori allows flexible processes in which the crowd may be involved in almost all aspects of software development, including funding, project concepts, design, coding, testing, and evaluation.

The reference architecture hence defines umbrella activities and a structure for crowd-based software development by unifying best practices and research results. In general, the reference architecture addresses the following needs (a configuration sketch follows the list below):[ citation needed ]

  1. Customizable to support typical process models;
  2. Configurable to compose different functional components;
  3. Scalable to facilitate problem solving at varying sizes.
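The following hypothetical configuration sketch shows how such a reference architecture could be customized: the process model, incentive scheme, functional components, and scale are selected per project rather than hard-coded. All component and field names are invented for illustration.

    PLATFORM_CONFIG = {
        "process_model": "competitive",          # or "collaborative", "hybrid"
        "phases": ["specification", "design", "coding", "testing"],
        "incentives": {"type": "prize", "first": 1000, "second": 500},
        "components": [
            "task_board",        # posting and bidding on tasks
            "reputation",        # ranking participants
            "peer_review",       # crowd evaluation of submissions
            "ci_runner",         # automated build and test
        ],
        "scaling": {"workers_per_phase": 50, "parallel_contests": 4},
    }


    def validate(config):
        """Minimal sanity checks before a (hypothetical) deployment step."""
        assert config["process_model"] in {"competitive", "collaborative", "hybrid"}
        assert config["phases"], "at least one phase is required"
        assert config["scaling"]["workers_per_phase"] > 0
        return True


    if __name__ == "__main__":
        print("config valid:", validate(PLATFORM_CONFIG))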

In particular, crowdsourcing is used to develop large and complex software in a virtualized, decentralized manner. Cloud computing is a colloquial expression for a variety of computing concepts that involve a large number of computers connected through a real-time communication network (typically the Internet). Moving crowdsourcing applications to the cloud brings many advantages: developers can focus on project development rather than on the supporting infrastructure, collaboration between geographically distributed teams is fostered, resources can be scaled to the size of the project, and work proceeds in a virtualized, distributed, and collaborative environment.

Reference architecture for software crowdsourcing

The demands on software crowdsourcing systems are ever evolving as new development philosophies and technologies gain favor. The reference architecture presented above is designed for generality in many dimensions, including, for example, different software development methodologies, incentive schemes, and competitive/collaborative approaches. There are several clear research directions that could enhance the architecture, such as data analytics, service-based delivery, and framework generalization. As systems grow, understanding how the platform is used becomes an important consideration; data about users, projects, and the interactions between the two can be explored to investigate performance. These data may also provide helpful insights when developing tasks or selecting participants. Many of the components in the architecture are general purpose and could be delivered as hosted services; hosting them would significantly reduce the barriers to entry. Finally, through deployments of this architecture there is potential to derive a general-purpose framework that could be used for different software crowdsourcing projects or, more widely, for other crowdsourcing applications. The creation of such frameworks has had transformative effects in other domains, for instance the predominant use of BOINC in volunteer computing.

Aspects and metrics

Crowdsourcing in general is a multifaceted research topic. The use of crowdsourcing in software development is associated with a number of key tension points, or facets, which should be considered (see the figure below). At the same time, research can be conducted from the perspective of the three key players in crowdsourcing: the customer, the worker, and the platform. [7]

Research framework for crowdsourcing software development

Task decomposition:

Coordination and communication:

Planning and scheduling:

Quality assurance: A software crowdsourcing process can be described as a game in which one party tries to minimize an objective function while the other party tries to maximize the same objective function, so the two parties effectively compete with each other. For example, a specification team needs to produce a quality specification for the coding team to develop code: the specification team tries to minimize the number of bugs in the specification, while the coding team tries to identify as many bugs as possible in the specification before coding.

The min-max process is important because it serves as a quality assurance mechanism, and often a team needs to perform both roles. For example, the coding team needs to maximize the identification of bugs in the specification, but it also needs to minimize the number of bugs in the code it produces.
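A toy sketch of this min-max view is shown below: the payoff is the number of specification bugs the coding team finds, the specification team chooses a strategy to minimize it, and the coding team chooses a strategy to maximize it. The matrix values are invented for illustration only.

    # payoff[i][j] = bugs found by the coding team when the spec team uses
    # strategy i (rows) and the coding team uses strategy j (columns).
    payoff = [
        [3, 5, 4],   # spec team: quick write-up
        [1, 2, 2],   # spec team: careful write-up plus self-review
        [2, 4, 3],   # spec team: careful write-up, no self-review
    ]


    def minimax(matrix):
        # The spec team assumes the coding team will find the most bugs possible
        # (row maximum) and chooses the row that minimizes that worst case.
        worst_case = [max(row) for row in matrix]
        best_row = min(range(len(matrix)), key=lambda i: worst_case[i])
        return best_row, worst_case[best_row]


    if __name__ == "__main__":
        row, value = minimax(payoff)
        print(f"spec team strategy {row} limits bugs found to {value}")  # strategy 1, value 2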

Bugcrowd showed that participants follow the prisoner's dilemma when identifying bugs in security testing. [8]
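The cited discussion is qualitative; the payoff values below are invented purely to show the prisoner's-dilemma structure for two bug hunters deciding whether to report a duplicate-prone bug immediately or keep refining it.

    # Each entry is (hunter_a_payoff, hunter_b_payoff).
    payoffs = {
        ("report", "report"): (1, 1),  # bounty split on duplicates
        ("report", "hold"):   (3, 0),  # first reporter takes the bounty
        ("hold",   "report"): (0, 3),
        ("hold",   "hold"):   (2, 2),  # both could earn more by waiting, but...
    }


    def best_response(options, their_choice, player):
        index = 0 if player == "a" else 1
        key = (lambda mine: (mine, their_choice)) if player == "a" else (lambda mine: (their_choice, mine))
        return max(options, key=lambda mine: payoffs[key(mine)][index])


    if __name__ == "__main__":
        options = ["report", "hold"]
        for theirs in options:
            print("a's best response to", theirs, "is", best_response(options, theirs, "a"))
        # "report" dominates, so both report early even though (hold, hold) pays more.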

Knowledge and Intellectual Property:

Motivation and Remuneration:

Levels

There are the following levels of crowdsourcing:[ citation needed ]

Level 1: single persons, well-defined modules, small size, limited time span (less than 2 months), quality products, current development processes such as those used by Topcoder and uTest. At this level, coders are ranked, the website contains an online repository of crowdsourcing materials, software can be ranked by participants, and the platform provides communication tools such as wikis, blogs, and comments, as well as software development tools such as IDEs, testing, compilers, simulation, modeling, and program analysis.

Level 2: teams of people (< 10), well-defined systems, medium size, medium time span (3 to 4 months), adaptive development processes with intelligent feedback in a blackboard architecture. At this level, a crowdsourcing website may support adaptive and even concurrent development processes with intelligent feedback via the blackboard architecture; intelligent analysis of coders, software products, and comments; multi-phase software testing and evaluation; big-data analytics; automated wrapping of software into SaaS (Software-as-a-Service) services, annotated with ontologies and cross-referenced to DBpedia and Wikipedia; automated analysis and classification of software services; and ontology annotation and reasoning, such as linking services with compatible inputs and outputs.

Level 3: teams of people (between 10 and 100), well-defined, large systems, long time span (< 2 years), automated cross-verification and cross-comparison among contributions. A crowdsourcing website at this level may provide automated matching of requirements to existing components, including matching of specifications, services, and tests, as well as automated regression testing (a matching sketch follows the level descriptions).

Level 4: multinational collaboration of large and adaptive systems. A crowdsourcing website at this level may contain domain-oriented crowdsourcing with ontology, reasoning, and annotation; automated cross verification and test generation processes; automated configuration of crowdsourcing platform; and may restructure the platform as SaaS with tenant customization.
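The sketch below illustrates the Level 3 idea of automatically matching requirements to existing components. Real systems would match specifications, service interfaces, and tests; here a simple keyword-overlap score stands in for that matching, and all names are invented.

    def match_requirements(requirements, components):
        """requirements: {req_id: text}; components: {component: description}."""
        matches = {}
        for req_id, req_text in requirements.items():
            req_words = set(req_text.lower().split())
            scored = [
                (len(req_words & set(desc.lower().split())), name)
                for name, desc in components.items()
            ]
            score, best = max(scored)
            matches[req_id] = best if score > 0 else None
        return matches


    if __name__ == "__main__":
        requirements = {"R1": "parse uploaded csv files", "R2": "send password reset email"}
        components = {
            "csv_importer": "parse and validate csv files",
            "mailer": "send transactional email messages",
        }
        print(match_requirements(requirements, components))
        # {'R1': 'csv_importer', 'R2': 'mailer'}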

Significant events

Microsoft crowdsourced Windows 8 development. In 2011, Microsoft started blogs to encourage discussion among developers and the general public. [9] In 2013, Microsoft also began crowdsourcing ideas for its Windows 8 mobile devices. [10] In June 2013, Microsoft announced crowdsourced software testing, offering $100K for innovative techniques to identify security bugs and $50K for a solution to the problem identified. [11]

In 2011, the United States Patent and Trademark Office launched a crowdsourcing challenge under the America COMPETES Act on the Topcoder platform to develop image processing algorithms and software that recognize figure and part labels in patent documents, with a prize pool of US$50,000. [12] The contest resulted in 70 teams collectively making 1,797 code submissions. The solution of the contest winner achieved high accuracy in terms of recall and precision for the recognition of figure regions and part labels. [13]

Oracle uses crowdsourcing in their CRM projects. [14]

Conferences and workshops

A software crowdsourcing workshop was held at Dagstuhl, Germany in September 2013. [15]

See also


References

  1. Riedl, Christoph; Woolley, Anita (December 2016). "Teams vs. Crowds: A Field Test of the Relative Contribution of Incentives, Member Ability, and Collaboration to Crowd-Based Problem Solving Performance". Academy of Management Discoveries. in press (4): 382–403. doi:10.5465/amd.2015.0097.
  2. Wu, Wenjun; Tsai, W. T.; Li, Wei (2013). "An Evaluation Framework for Software Crowdsourcing". Frontiers of Computer Science. 7 (5): 694–709. doi:10.1007/s11704-013-2320-2. S2CID 3352701.
  3. Stol, Klaas-Jan; Fitzgerald, Brian (2014). "Two's Company, Three's a Crowd: A Case Study of Crowdsourcing Software Development". 36th International Conference on Software Engineering. ACM. pp. 187–198. doi:10.1145/2568225.2568249. hdl:10344/3982.
  4. Wu, Wenjun; Tsai, W. T.; Li, Wei (2013). "Creative Software Crowdsourcing". International Journal of Creative Computing. 1: 57. doi:10.1504/IJCRC.2013.056925.
  5. "Crowdsourcing Software Gathers Stronger Ideas". IdeaScale. Retrieved 2016-03-19.
  6. BugFinders. "Software Testing in the Real World". Retrieved June 21, 2013.
  7. Stol, K. J.; Fitzgerald, B. (2014). "Researching Crowdsourcing Software Development: Perspectives and Concerns". Proceedings of the 1st International Workshop on Crowd Sourcing in Software Engineering - CSI-SE 2014. p. 7. doi:10.1145/2593728.2593731. hdl:10344/3853. ISBN 9781450328579. S2CID 7531317.
  8. "Crowdsourcing & the Prisoner's Dilemma". Delling Advisory. 11 April 2013. Retrieved 2016-03-19.
  9. Thomas, Stuart (August 16, 2011). "Microsoft launches crowdsourcing blog for Windows 8". Memeburn. Retrieved June 21, 2013.
  10. Simpson, Scott (June 10, 2013). "Crowdsource your next Windows 8 device?". Retrieved June 21, 2013.
  11. Bell, Lee (June 20, 2013). "Microsoft offers a $100,000 bug bounty for cracking Windows 8.1". Archived from the original on June 25, 2013. Retrieved June 20, 2013.
  12. Steffen, Robynn Sturm (16 December 2011). "New center for excellence fuels prize to help modernize tools for patent examination". The White House Blog. Retrieved 30 March 2016.
  13. Riedl, C.; Zanibbi, R.; Hearst, M. A.; Zhu, S.; Menietti, M.; Crusan, J.; Metelsky, I.; Lakhani, K. (20 February 2016). "Detecting Figures and Part Labels in Patents: Competition-Based Development of Image Processing Algorithms". International Journal on Document Analysis and Recognition. 19 (2): 155–172. arXiv:1410.6751. doi:10.1007/s10032-016-0260-8. S2CID 11873638.
  14. Diana, Alison (March 16, 2011). "Oracle Integrates Crowdsourcing Into CRM". InformationWeek. Retrieved June 21, 2013.
  15. Huhns, Michael N.; Li, Wei; Tsai, Wei-Tek (2013). "Schloss Dagstuhl: Seminar Homepage". Dagstuhl Reports. 3 (9): 34–58. doi:10.4230/DagRep.3.9.34. Retrieved 2016-03-19.

Further reading