Filippo Menczer | |
---|---|
Born | 16 May 1965 (age 59), Rome, Italy |
Alma mater | Sapienza University of Rome; University of California, San Diego |
Scientific career | |
Fields | Cognitive science; Computer science; Physics |
Institutions | Indiana University Bloomington |
Website | cnets |
Filippo Menczer (born 16 May 1965) is an American and Italian academic. He is a University Distinguished Professor and the Luddy Professor of Informatics and Computer Science at the Luddy School of Informatics, Computing, and Engineering, Indiana University. Menczer is the Director of the Observatory on Social Media, [1] a research center where data scientists and journalists study the role of media and technology in society and build tools to analyze and counter disinformation and manipulation on social media. Menczer holds courtesy appointments in Cognitive Science and Physics, is a founding member and advisory council member of the IU Network Science Institute, [2] a former director of the Center for Complex Networks and Systems Research, [3] a senior research fellow of the Kinsey Institute, a fellow of the Center for Computer-Mediated Communication, [4] and a former fellow of the Institute for Scientific Interchange in Turin, Italy. In 2020 he was named a Fellow of the ACM.
Menczer holds a Laurea in physics from the Sapienza University of Rome and a PhD in computer science and cognitive science from the University of California, San Diego. He was previously an assistant professor of management sciences at the University of Iowa and a fellow-at-large of the Santa Fe Institute. At Indiana University Bloomington since 2003, he served as division chair in the Luddy School from 2009 to 2011. Menczer has received Fulbright, Rotary Foundation, and NATO fellowships, and a CAREER Award from the National Science Foundation. He holds editorial positions with the journals Network Science, [5] EPJ Data Science, [6] PeerJ Computer Science, [7] and HKS Misinformation Review. [8] He has served as program or track chair for various conferences, including The Web Conference and the ACM Conference on Hypertext and Social Media. He was general chair of the ACM Web Science 2014 Conference [9] and general co-chair of the NetSci 2017 Conference.
Menczer's research focuses on Web science, social networks, social media, social computation, Web mining, data science, distributed and intelligent Web applications, and the modeling of complex information networks. He introduced topical and adaptive Web crawlers, specialized crawlers that focus their exploration on pages relevant to a given topic and adapt their behavior as the crawl proceeds. [10] [11]
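The general idea behind a topical (focused) crawler can be illustrated with a minimal sketch: the crawl frontier is a priority queue ordered by an estimate of each page's relevance to a set of topic keywords, so the most promising regions of the Web are explored first. The keywords, seed URLs, and scoring heuristic below are hypothetical placeholders, not Menczer's published algorithms.

```python
# A minimal, illustrative topical crawler: the frontier is a priority queue
# ordered by a crude estimate of each page's relevance to a set of topic
# keywords. Seeds, keywords, and the scoring rule are placeholders.
import heapq
import re
import urllib.request
from urllib.parse import urljoin

TOPIC_KEYWORDS = {"social", "network", "misinformation"}   # hypothetical topic

def relevance(text: str) -> float:
    """Fraction of topic keywords that appear in the page text."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(TOPIC_KEYWORDS & words) / len(TOPIC_KEYWORDS)

def crawl(seeds, max_pages=20):
    # Priority queue of (-score, url): higher-relevance pages are fetched first.
    frontier = [(-1.0, url) for url in seeds]
    heapq.heapify(frontier)
    visited = set()
    while frontier and len(visited) < max_pages:
        _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue
        score = relevance(html)
        print(f"{score:.2f}  {url}")
        # Enqueue outlinks, inheriting the parent page's relevance estimate.
        for link in re.findall(r'href="(http[^"]+)"', html):
            heapq.heappush(frontier, (-score, urljoin(url, link)))
    return visited

# Example (hypothetical seed): crawl(["https://example.org"])
```

Adaptive crawlers extend this idea by learning the relevance estimator online as the crawl proceeds, rather than relying on a fixed keyword heuristic.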
Menczer is also known for his work on social phishing, [12] [13] a type of phishing attack that leverages friendship information from social networks and achieved a success rate of over 70% in experiments (with Markus Jakobsson); semantic similarity measures for information and social networks; [14] [15] [16] [17] models of complex information and social networks (with Alessandro Vespignani and others); [18] [19] [20] [21] search engine censorship; [22] [23] and search engine bias. [24] [25]
The group led by Menczer has analyzed and modeled how memes, information, and misinformation spread through social media in domains such as the Occupy movement, [26] [27] the Gezi Park protests, [28] and political elections. [29] Data and tools from Menczer's lab have aided in finding the roots of the Pizzagate conspiracy theory [30] and the disinformation campaign targeting the White Helmets, [31] and in taking down voter-suppression bots on Twitter. [32] Menczer and coauthors have also found a link between online COVID-19 misinformation and vaccination hesitancy. [33]
Analysis by Menczer's team demonstrated the echo-chamber structure of information-diffusion networks on Twitter during the 2010 United States elections. [34] The team found that conservatives almost exclusively retweeted other conservatives while liberals retweeted other liberals. Ten years later, this work received the Test of Time Award at the 15th International AAAI Conference on Web and Social Media (ICWSM). [35] As these patterns of polarization and segregation persist, [36] Menczer's team has developed a model that shows how social influence and unfollowing accelerate the emergence of online echo chambers. [37]
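The mechanism of influence plus unfollowing can be illustrated with a toy opinion-dynamics simulation: agents nudge their opinions toward like-minded neighbors and sometimes drop ties to dissimilar ones. This is a minimal sketch under assumed parameters, not the published model.

```python
# Toy opinion-dynamics sketch (not the published model): agents hold opinions
# in [-1, 1]; at each step an agent either moves its opinion slightly toward a
# like-minded neighbor (social influence) or drops a neighbor whose opinion is
# too far away and follows a random agent instead (unfollowing/rewiring).
import random

def simulate(n=100, k=5, steps=20000, tolerance=0.5, mu=0.1, rewire_prob=0.3):
    opinions = [random.uniform(-1, 1) for _ in range(n)]
    neighbors = [set(random.sample([j for j in range(n) if j != i], k)) for i in range(n)]
    for _ in range(steps):
        i = random.randrange(n)
        if not neighbors[i]:
            continue
        j = random.choice(list(neighbors[i]))
        if abs(opinions[i] - opinions[j]) < tolerance:
            opinions[i] += mu * (opinions[j] - opinions[i])      # influence
        elif random.random() < rewire_prob:
            neighbors[i].remove(j)                               # unfollow dissimilar neighbor
            neighbors[i].add(random.choice([x for x in range(n) if x != i]))
    # Homogeneity of surviving ties: mean opinion distance across edges.
    dists = [abs(opinions[i] - opinions[j]) for i in range(n) for j in neighbors[i]]
    return sum(dists) / len(dists)

print("mean opinion distance across ties:", round(simulate(), 3))
```

Lower mean opinion distance across the surviving ties indicates that the network has segregated into more homogeneous neighborhoods.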
Menczer and colleagues have advanced the understanding of information virality, in particular the prediction of which memes will go viral based on the structure of early diffusion networks [38] [39] and how competition for finite attention helps explain virality patterns. [40] [41] In a 2018 paper in Nature Human Behaviour, Menczer and coauthors used a model to show that when agents in a social network share information under conditions of high information load and/or low attention, the correlation between the quality and the popularity of information in the system decreases. [42] An erroneous analysis in the paper suggested that this effect alone would be sufficient to explain why fake news is as likely to go viral as legitimate news on Facebook. When the authors discovered the error, they retracted the paper. [43]
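The limited-attention mechanism can be sketched with a simple agent-based simulation in which agents see only their most recent messages and reshare from that window with a quality bias; the sketch is intended to illustrate how popularity can decouple from intrinsic quality when load is high relative to attention. It is a hypothetical illustration, not the retracted model.

```python
# Minimal limited-attention sketch (not the authors' actual model): agents keep
# only the most recent `attention` messages in view and reshare from that
# window with probability roughly proportional to message quality.
import random
from statistics import correlation   # requires Python 3.10+

def run(n_agents=200, n_steps=20000, attention=5, p_new=0.5):
    feeds = [[] for _ in range(n_agents)]           # each agent's visible messages
    quality = []                                    # intrinsic quality per message id
    shares = []                                     # popularity (share count) per message id
    friends = [random.sample(range(n_agents), 10) for _ in range(n_agents)]
    for _ in range(n_steps):
        i = random.randrange(n_agents)
        if random.random() < p_new or not feeds[i]:
            m = len(quality)                         # post a brand-new message
            quality.append(random.random())
            shares.append(0)
        else:
            # Reshare from the limited attention window, biased toward quality.
            window = feeds[i][-attention:]
            weights = [0.01 + quality[x] for x in window]   # small floor so all have a chance
            m = random.choices(window, weights=weights)[0]
            shares[m] += 1
        for f in friends[i]:                         # message appears in friends' feeds
            feeds[f].append(m)
    return correlation(quality, shares)

print("quality-popularity correlation:", round(run(), 3))
```

Varying the `attention` window and the posting rate `p_new` changes how strongly popularity tracks quality in this toy system.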
Following influential publications on the detection of astroturfing [44] [45] [46] [47] [48] and social bots, [49] [50] Menczer and his team have studied the complex interplay between cognitive, social, and algorithmic factors that makes social media platforms and their users vulnerable to manipulation, [51] [52] [53] [54] and have focused on developing tools to counter such abuse. [55] [56] Their bot detection tool, Botometer, was used to assess the prevalence of social bots [57] [58] and their sharing activity. [59] Their tool for visualizing the spread of low-credibility content, Hoaxy, [60] [61] [62] [63] was used in conjunction with Botometer to reveal the key role played by social bots in spreading low-credibility content during the 2016 United States presidential election. [64] [65] [66] [67] [68] Menczer's team also studied perceptions of partisan political bots, finding that Republican users are more likely to confuse conservative bots with humans, whereas Democratic users are more likely to confuse conservative human users with bots. [69] Using bot probes on Twitter, Menczer and coauthors demonstrated a conservative political bias on the platform. [70]
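Botometer is a supervised classifier trained on account-level features. The sketch below illustrates that general approach with synthetic data and a generic random forest; it does not use Botometer's actual features, training data, or API.

```python
# Generic illustration of supervised bot detection (not Botometer itself):
# train a random forest on simple account-level features using synthetic
# labeled data, then score an unseen account.
import random
from sklearn.ensemble import RandomForestClassifier

def fake_account(is_bot: bool):
    """Synthetic features: [tweets/day, followers-to-friends ratio, account age in days]."""
    if is_bot:
        return [random.gauss(80, 20), random.gauss(0.1, 0.05), random.gauss(60, 30)]
    return [random.gauss(5, 3), random.gauss(1.5, 0.5), random.gauss(1500, 500)]

X = [fake_account(i % 2 == 0) for i in range(1000)]
y = [1 if i % 2 == 0 else 0 for i in range(1000)]     # 1 = bot, 0 = human

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("bot probability:", clf.predict_proba([[70, 0.08, 45]])[0][1])
```

Real systems use far richer feature sets (content, temporal, and network signals) and human-labeled training accounts rather than synthetic ones.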
As social media platforms have increased their countermeasures against malicious automated accounts, Menczer and coauthors have shown that coordinated campaigns by inauthentic accounts continue to threaten information integrity on social media, and have developed a framework to detect these coordinated networks. [71] They have also demonstrated new forms of social media manipulation by which bad actors can grow influence networks [72] and conceal the high volume of content with which they flood the network. [73]
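One common signal of coordination is unusually similar behavior, for example accounts that repeatedly share the same URLs. The sketch below flags account pairs whose shared-URL sets have high Jaccard similarity; it is a simplified illustration on hypothetical data, not the published detection framework.

```python
# Minimal sketch of coordination detection via behavioral similarity: accounts
# that share many of the same URLs are linked, and highly similar pairs are
# flagged as candidate coordinated accounts. Data below are hypothetical.
from collections import defaultdict
from itertools import combinations

shares = [
    ("a1", "http://example.com/x"), ("a2", "http://example.com/x"),
    ("a1", "http://example.com/y"), ("a2", "http://example.com/y"),
    ("a3", "http://example.com/z"),
]

urls_by_account = defaultdict(set)
for account, url in shares:
    urls_by_account[account].add(url)

def jaccard(a, b):
    """Similarity of two URL sets: size of intersection over size of union."""
    return len(a & b) / len(a | b)

threshold = 0.8
suspicious = [
    (u, v) for u, v in combinations(urls_by_account, 2)
    if jaccard(urls_by_account[u], urls_by_account[v]) >= threshold
]
print("candidate coordinated pairs:", suspicious)
```

In practice such pairwise similarities are assembled into a network whose dense clusters are examined as candidate coordinated campaigns.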
Menczer and colleagues have shown that political audience diversity can be used as an indicator of news source reliability in algorithmic ranking. [74]
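As a hypothetical worked example of this signal, a source's audience diversity can be quantified as the spread of partisanship scores among the users who share it, with more politically diverse audiences treated as a positive ranking signal; the sources and scores below are made up.

```python
# Hypothetical sketch: score each source by the diversity (standard deviation)
# of its audience's partisanship scores and use that as a ranking signal.
from statistics import pstdev

# Made-up partisanship scores in [-1, 1] of users who shared each source.
audience = {
    "source_a": [-0.9, -0.8, -0.85, -0.9],       # homogeneous audience
    "source_b": [-0.6, 0.2, 0.7, -0.1, 0.5],     # diverse audience
}

diversity = {s: pstdev(scores) for s, scores in audience.items()}
ranked = sorted(diversity, key=diversity.get, reverse=True)
print(ranked)   # sources with more politically diverse audiences rank higher
```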
The textbook A First Course in Network Science by Menczer, Fortunato, and Davis was published by Cambridge University Press in 2020 [75] and has been translated into Japanese, Chinese, and Korean.