Brandon Anderson (entrepreneur)

Brandon D. Anderson is an American sociologist and entrepreneur. He founded Raheem.ai, a chatbot that helps the public monitor police interactions. He was a 2018 Echoing Green Black Male Achievement Fellow and was named a 2019 TED Fellow. In August 2024, several of Anderson's claims about his chatbot and his personal history were questioned in an investigative article in The New York Times.

Early life

Anderson was born in Oklahoma. [1] His mother worked as a rental car clerk and his father as a forklift truck driver. [2] [3] He has described his childhood as being "characterised by violence". [4] As a teenager he was kicked out of his grandparents' house and became homeless. [5] Anderson ran away with his best friend, with whom he later fell in love. [5] In 2003, Anderson enlisted in the Army, where he worked as a satellite engineer. [1] [6] In 2007, while Anderson was serving overseas, his partner was allegedly shot and killed by a police officer during a routine traffic stop. [5] [7] An August 2024 article in The New York Times by David Fahrenthold suggested that Anderson had fabricated this origin story. [8] Anderson was discharged from the Army after he disclosed his sexuality. [9]

Education and career

Anderson became a community activist and organiser, earning a degree at Georgetown University in 2015, where he studied sociology and philosophy. [5] [6] He served as a Racial Equity Fellow at the Center for the Study of Social Policy in Washington, D.C. [4] There, Anderson learned that the majority of people do not report negative interactions with police officers because they "do not trust the system". [10]

In 2014 Anderson was awarded funding from Fast Forward and the My Brother's Keeper Challenge to build Raheem.ai, a Facebook Messenger chatbot designed to eliminate barriers to reporting police misconduct. [1] [11] [12] The chatbot allowed the public to evaluate police interactions and offered follow-on support for users. [13] [14] Raheem.ai was inspired by Waze, which, alongside offering navigation information, uses user-generated reports to alert local governments to potholes. [12] The chatbot asked questions about recent interactions with the police, anonymized the data it collected, and shared it in real time on a public dashboard of police performance. [15] [16] [17] Raheem.ai published reports about where police were working well and where they were failing communities. [1] [18] It aimed to reach all fifty states by 2020. [19] With Raheem.ai, Anderson sought to build the first crowdsourced database of police interactions. [20] [21] [22]

In 2016 Anderson delivered a TEDx talk at Georgetown, where he discussed what it means to be vulnerable. [23] He was named one of the National Black Justice Coalition's 100 Black LGBTQ/SGL Emerging Leaders. [24] Anderson was made an Echoing Green Fellow in 2018. [25] [26]

However, Raheem.ai was never able to overcome a fundamental problem: the thousands of separate police agencies in the US each have their own preferred methods of receiving complaints. David Fahrenthold of The New York Times wrote in August 2024: [8]

Mr. Anderson's complaint system — "Yelp for police," he called it — did not work. His website collected more than 2,700 stories from users about their interactions with police — accounts of unjustified traffic stops, physical assaults and harassment. But the work had little impact because Raheem was unable to solve a mind-bending technical problem.

There are 18,000 police departments in America. Some accept complaints online, but many require people to make a phone call or go to a police station. Raheem failed because it never offered a one-stop way for users to file their complaints directly with police.

[...] For now, his nonprofit appears legally active, but functionally dead. Several donors pulled their funding. Three employees were left out of work. [8]

In August 2024, The New York Times also scrutinized Anderson's financial practices while he headed Raheem.ai, including large sums for hotels and clothing billed to the nonprofit. [8] The Office of the Attorney General for the District of Columbia subsequently filed a lawsuit, seeking an injunction, alleging that Anderson and Raheem.ai had misused charitable funds to support Anderson's lavish lifestyle. [27]

In November 2024, D.C. Attorney General Brian L. Schwalb sued Raheem AI, a nonprofit created to improve transparency and accountability in policing, and its founder and executive director Brandon Anderson for violating the District's nonprofit and workers' rights laws. According to the Attorney General's office, Anderson used Raheem AI's charitable funds for his own personal benefit, specifically to support his luxurious lifestyle, while the organization failed to monitor spending or implement basic nonprofit governance requirements. The office also alleged that Anderson and Raheem AI failed to pay the organization's sole District-based employee the wages she had earned and required her to sign an illegal non-compete clause. [28]

References

  1. Kolodny, Lora (2017-09-13). "Raheem.ai: Yelp or Amazon Reviews for police interactions". www.cnbc.com. Retrieved 2019-02-27.
  2. "Brandon D. Anderson". Conference on World Affairs. 2017-12-01. Retrieved 2019-02-27.
  3. "My Origin Story: Brandon Anderson". Generation Titans. Retrieved 2019-02-27.
  4. "Raheem". SIPS Fund. Retrieved 2019-02-27.
  5. "Raheem is a Chatbot for Anonymously Rating Experiences with Police". Fast Forward. 2017-10-23. Retrieved 2019-02-27.
  6. "Brandon Anderson". Halcyon. 2017-06-28. Retrieved 2019-02-27.
  7. "What's Your Revolution 10 24 18: Brandon Anderson, Founder of Raheem AI". Retrieved 2019-02-27.
  8. Fahrenthold, David A. (2024-08-25). "Would a Group Opposed to Police Blow the Whistle on Its Founder?". The New York Times. Retrieved 2024-08-27.
  9. Gray, Christopher. "Brandon Anderson's RAHEEM Has Leveraged Technology And Data To Help Thousands Of Black People Report Police Misconduct". Forbes. Retrieved 2020-10-28.
  10. Mathew, Teresa (2018-06-18). "Positive or Negative: Rate Your Latest Police Encounter". Bloomberg.com. Retrieved 2019-02-27.
  11. Schwartz, Elena (2018-06-13). "Can Artificial Intelligence Hold Police Accountable?". The Crime Report. Retrieved 2019-02-27.
  12. "FEATURE: Young Black Entrepreneur Brandon Anderson creates app to monitor police brutality". AFROPUNK. 2016-04-18. Retrieved 2019-02-27.
  13. "Brandon Anderson". Wonder Women Tech. Retrieved 2019-02-27.
  14. Farley, Shannon (2017-06-22). "Nonprofits, not Silicon Valley startups, are creating AI apps for the greater good". Recode. Retrieved 2019-02-27.
  15. "Raheem Ai - Tech Nonprofit". Fast Forward. Retrieved 2019-02-27.
  16. Fast Forward (2018-03-29). "Brandon Anderson, Founder of Raheem | AGG 2018". Retrieved 2019-02-27.
  17. Peters, Adele (2017-10-02). "This Chatbot Makes It Easy To Document Your Interactions With The Police". Fast Company. Retrieved 2019-02-27.
  18. "Meet the chatbots helping users anonymously report social injustices". VentureBeat. 2018-03-18. Retrieved 2019-02-27.
  19. "Brandon Anderson". Camelback Ventures. Retrieved 2019-02-27.
  20. "The AI Agenda". The Economist Events. Retrieved 2019-02-27.
  21. Airey, Julia (2016-11-04). "Can this new chatbot increase police accountability?". Technical.ly DC. Retrieved 2019-02-27.
  22. "Gay Man's Software Holds Police Accountable". www.intomore.com. 2017-09-15. Retrieved 2019-02-27.
  23. TEDx Talks (2016-03-14). "Make Space | Brandon Anderson | TEDxGeorgetown". Retrieved 2019-02-27.
  24. "100 to Watch | National Black Justice Coalition". www.nbjc.org. Retrieved 2019-02-27.
  25. "Brandon Anderson". www.echoinggreen.org. Retrieved 2019-02-27.
  26. Echoing Green (2018-06-18). "To Live and to Love in a World Free of Police Violence". Retrieved 2019-02-27.
  27. "Attorney General Schwalb Sues Nonprofit". oag.dc.gov. https://oag.dc.gov/release/attorney-general-schwalb-sues-nonprofit
  28. "Attorney General Schwalb Sues Police Accountability Nonprofit & Executive for Misusing Charitable Funds, Violating Labor Laws". oag.dc.gov. 2024-11-25. Retrieved 2024-12-17.