Conflict and Mediation Event Observations

Conflict and Mediation Event Observations (CAMEO) is a framework for coding event data, typically events that merit news coverage, and is generally applied to the study of political news and violence. [1] [2] It is a more recent alternative to the World Event/Interaction Survey (WEIS) coding system developed by Charles A. McClelland and the Conflict and Peace Data Bank (COPDAB) coding system developed by Edward Azar.
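An event-data framework of this kind reduces a news report to a structured record: who did what to whom, and when. The sketch below shows what a CAMEO-coded event record might look like; the field layout, actor codes, and the sample record are illustrative assumptions rather than any dataset's actual schema (real datasets such as GDELT and ICEWS define their own columns).

```python
# Illustrative sketch of a CAMEO-style event record: the tab-delimited
# layout and the actor codes below are hypothetical examples, not the
# schema of any particular dataset.

from dataclasses import dataclass

@dataclass
class Event:
    date: str          # event date as YYYYMMDD
    source_actor: str  # CAMEO-style actor code, e.g. "USAGOV"
    target_actor: str  # CAMEO-style actor code, e.g. "FRAGOV"
    cameo_code: str    # CAMEO event code, e.g. "042" ("Make a visit")

def parse_record(line: str) -> Event:
    """Split one tab-delimited record into its four fields."""
    date, src, tgt, code = line.split("\t")
    return Event(date, src, tgt, code)

event = parse_record("20140621\tUSAGOV\tFRAGOV\t042")
print(event.cameo_code)  # prints "042"
```

The leading digits of a CAMEO event code identify its root category (here, "04" for consultation-type events), so records can be aggregated at whatever level of detail an analysis needs.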

History

Work on CAMEO began in 2000 at the University of Kansas with financial support from the National Science Foundation. The first paper on the subject, by Deborah J. Gerner, Philip A. Schrodt, Rajaa Abu-Jabr, and Omur Yilmaz, was presented at the March 2002 Annual Meeting of the International Studies Association in New Orleans. [3] In the paper, the authors explained that they created the new CAMEO system, rather than continuing to use the existing WEIS coding system, for a combination of reasons, including previously known weaknesses of WEIS and difficulties that emerge when trying to automate the WEIS coding process. The coding software used for CAMEO, as well as for the automated WEIS implementation that CAMEO was compared with, was the Textual Analysis by Augmented Replacement Instructions (TABARI) software, developed by co-author Philip A. Schrodt in 2000 and in turn based on the Kansas Event Data System (KEDS) developed in 1994. [3]
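Automated coders in this tradition work by matching actor names and verb phrases from large pattern dictionaries against news text. The following is a deliberately minimal sketch of that general approach; the dictionaries, actor codes, and matching logic are invented for illustration and do not reflect TABARI's actual algorithm, which uses full pattern dictionaries and sentence parsing.

```python
# Highly simplified sketch of dictionary-based event coding. The actor
# names, verb phrases, and CAMEO codes below are illustrative examples
# only; real coders such as TABARI use large dictionaries and parsing.

ACTORS = {"washington": "USAGOV", "paris": "FRAGOV"}
VERBS = {"met with": "040", "criticized": "111", "attacked": "190"}

def code_sentence(sentence: str):
    """Return a (source, CAMEO code, target) triple, or None if no match."""
    text = sentence.lower()
    for phrase, code in VERBS.items():
        if phrase in text:
            before, after = text.split(phrase, 1)
            # Source actor must appear before the verb phrase, target after.
            src = next((c for name, c in ACTORS.items() if name in before), None)
            tgt = next((c for name, c in ACTORS.items() if name in after), None)
            if src and tgt:
                return (src, code, tgt)
    return None

print(code_sentence("Washington met with Paris officials on Tuesday."))
# → ('USAGOV', '040', 'FRAGOV')
```

Even this toy version shows why automating a coding scheme shapes its design: categories must be distinguishable from surface patterns in text, one of the difficulties with WEIS that motivated CAMEO.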

The CAMEO manual describes the following key stages of the history of work on the project: [4]

Alternatives

One of the alternatives to CAMEO is Integrated Data for Events Analysis (IDEA), an outgrowth of work by the PANDA project. [5] Predecessors to CAMEO include the World Event/Interaction Survey (WEIS) coding system by Charles A. McClelland and the Conflict and Peace Data Bank (COPDAB) by Edward Azar. [1]

Some key differences between CAMEO and IDEA are: [1]

Reception

Academic reception

CAMEO has been the subject of a number of academic papers comparing it with other coding frameworks. [6] [7]

Datasets that use CAMEO coding

Datasets coded with CAMEO include the Integrated Crisis Early Warning System (ICEWS) [8] and the Global Database of Events, Language, and Tone (GDELT). [9] [10]


References

  1. "CAMEO Event Data Codebook". Computational Event Data System. Retrieved June 21, 2014.
  2. "CAMEO Code Wiki". Retrieved June 21, 2014.
  3. Gerner, Deborah J.; Schrodt, Philip A.; Abu-Jabr, Rajaa; Yilmaz, Omur. "Conflict and Mediation Event Observations (CAMEO): A New Event Data Framework for the Analysis of Foreign Policy Interactions" (PDF).
  4. Schrodt, Philip A. (March 2012). "Conflict and Mediation Event Observations: Event and Actor Codebook, Version 1.1b3" (PDF). Retrieved June 21, 2014.
  5. Bond, Doug; Bond, Joe; Oh, Churl; Jenkins, J. Craig; Taylor, Charles Lewis (2003). "Integrated Data for Events Analysis (IDEA): An Event Typology for Automated Events Data Development". Journal of Peace Research. 40 (6): 733–745. doi:10.1177/00223433030406009.
  6. Yonamine, James E. "Working with Event Data: A Guide to Aggregation Choices".
  7. Schrodt, Philip A.; Van Brackle, David. "Automated Coding of Political Event Data".
  8. "W-ICEWS iData". Lockheed Martin. Retrieved June 21, 2014.
  9. mdwardlab (October 17, 2013). "GDELT and ICEWS, a short comparison". Predictive Heuristics. Archived from the original on July 17, 2014. Retrieved June 21, 2014.
  10. "Data: CAMEO (Documentation section)". Global Database of Events, Language, and Tone. Retrieved June 21, 2014.