Ritika Dutt

Nationality: Canadian
Alma mater: McGill University
Occupation: Entrepreneur
Known for: Co-founder and CEO of Botler AI
Awards: Forbes 30 Under 30 - Law & Policy (2020)

Ritika Dutt is a Canadian entrepreneur. She is the co-founder and CEO of the legal artificial intelligence company Botler AI. In December 2019, she was named the only Canadian on the Forbes 30 Under 30 2020 list for Law and Policy.[1]

Early life and education

Dutt was born in India and spent her childhood in Hong Kong and her teenage years in Singapore.[2] In 2009, she moved to Montreal, Canada, to attend McGill University, graduating with a bachelor's degree in Economics and Political Science in 2013.[3]

Early career

After graduating from McGill, Dutt headed the marketing department of a Y Combinator startup.[4] She then took over internal operations at Notman House, Montreal's Google for Entrepreneurs tech hub, where she supported the local startup community by promoting innovative ventures and initiatives.[5][6]

Drawing on her background in economics and innovation, Dutt co-founded Botler AI in 2017 to improve access to the legal system through artificial intelligence.[7]

Experience with workplace sexual harassment

Before co-founding Botler, Dutt endured a months-long ordeal with a stalker.[8] The man showed up at her workplace every day, tracked her location through her social media, and followed her to her home.[8] Though fearful, Dutt found herself making excuses, thinking "It's all in my head" or "I don't know if something is really wrong or if I'm too sensitive".[9] She did not know what her rights were, what she should do, or whether the man's actions were illegal.[3] The experience left her feeling trapped, and she struggled to call it what it was: stalking, or criminal harassment under Canadian law.[8][10]

Months later, after the Harvey Weinstein sexual abuse allegations and the ensuing spread of the #MeToo movement, Dutt began researching the relevant legal codes and learned that what had happened to her was a crime.[11] She gained confidence from learning that there was a legal basis for what she had felt and that her discomfort had been justified.[12]

Dutt realized that sexual harassment was a far bigger issue than she had imagined and found herself angered, thinking, "How many people think they can do this and get away with it?"[13][10]

Botler AI

In December 2017, motivated to take action by her personal experience, Dutt led Botler AI to launch a free tool to help survivors of sexual harassment determine whether their rights had been violated.[11] The tool was intended as an impartial resource to empower the average person through information and education, without fear of judgment.[10]

Dutt's premise was that, unlike humans, a bot has no prejudice of race, gender, sexual orientation, or socio-economic background and would never ask "What were you wearing?" or "How many drinks did you have?", and that it therefore offered complainants a neutral, emotion-free and judgment-free tool.[14][15] The artificial intelligence system, which used deep learning, was trained on more than 300,000 court documents from Canada and the United States.[16][11] Natural language processing was used to determine whether an incident described by the user could be classified as sexual harassment.
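
The cited sources do not describe Botler AI's model or code. As a purely illustrative sketch of the general technique named above (a natural language processing classifier over incident descriptions), the following Python snippet uses scikit-learn with invented placeholder texts and labels; it is far simpler than a deep-learning system trained on some 300,000 court documents and does not reflect Botler AI's actual implementation.

    # Illustrative sketch only: a minimal text classifier in the spirit of the
    # approach described above. The example texts and labels are invented
    # placeholders, not Botler AI's data or model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = [
        "A coworker repeatedly made unwanted sexual comments after being asked to stop.",
        "A colleague asked to reschedule a project meeting to next week.",
    ]
    labels = [1, 0]  # 1 = description resembles harassment, 0 = it does not

    # TF-IDF features feeding a logistic-regression classifier.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    incident = "He kept following me home from work even though I told him to stop."
    print(model.predict_proba([incident])[0][1])  # estimated probability of class 1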

Users received a summary of the relevant legal codes for their jurisdiction and a detailed report of the incident that could, if they chose, be handed over to the relevant authorities, from HR to the police.[17][16][12] The goal was not to tell users whether they could win a case in court, but to empower them with confidence grounded in legal doctrine.[12] Dutt stressed, "Once people have the information then it's up to them what they want to do with it... maybe they feel comfortable to approach somebody like HR…or maybe it makes them feel better that it's not just in my head, and I have the right to stand up to my abuser because I have rights in this situation."[18]
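
The jurisdiction-based summary and incident report described above can be pictured with a toy sketch like the one below; the statute summaries, field names, and wording are simplified assumptions for illustration only and are not drawn from Botler AI's product.

    # Illustrative sketch only: a toy jurisdiction lookup and plain-text report,
    # mirroring the user-facing flow described above. The summaries below are
    # simplified placeholders, not legal advice and not Botler AI's wording.
    from datetime import date

    LEGAL_SUMMARIES = {
        "Canada": "Criminal Code s. 264 addresses criminal harassment, including stalking.",
        "United States": "Rules vary by state; workplace harassment may fall under Title VII.",
    }

    def build_report(jurisdiction: str, description: str) -> str:
        """Assemble a report the user could keep or hand to HR or the police."""
        summary = LEGAL_SUMMARIES.get(jurisdiction, "No summary available.")
        return (
            f"Incident report ({date.today().isoformat()})\n"
            f"Jurisdiction: {jurisdiction}\n"
            f"Relevant law: {summary}\n"
            f"Description: {description}\n"
        )

    print(build_report("Canada", "A former colleague followed me home on several occasions."))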

Dutt described the tool as "just the first step" and said Botler planned to expand it to connect users with resources appropriate to their situation, including legal representation.[19][15][16]


References

  1. "Ritika Dutt, 28". Forbes .
  2. "*Ritika on Instagram: "I was born in India, spent my childhood in Hong Kong & my teenage years in Singapore. 10 years ago today, I set foot in Canada for the…"". Instagram. Archived from the original on 2021-12-26. Retrieved 2019-08-24.
  3. 1 2 "Applying AI to the #MeToo landscape". mcgillnews.mcgill.ca. Retrieved 2018-01-30.
  4. "Ritika Dutt". Startupfest. Retrieved 2018-08-24.
  5. Montreal.TV. "Soirée Startup et Croissance à Montréal". Montreal.TV (in French). Retrieved 2018-08-24.
  6. Faggella, Daniel. "The State of AI in Montreal – Startups, Investment, and What it Means for the City". Emerj. Retrieved 2019-07-11.
  7. Startupfest (2018-07-06). "Cutting through the legal jargon — Ritika Dutt on how AI can help navigate the system". Medium. Retrieved 2018-08-24.
  8. 1 2 3 "New technology aims to give victims of sexual violence a more positive reporting experience" . Retrieved 2018-10-30.
  9. "Une IA pour aider les victimes de harcèlement sexuel". Siècle Digital (in French). 2017-12-07. Retrieved 2018-03-19.
  10. 1 2 3 "Victims of Sexual Harassment Have a New Resource: AI". MIT Technology Review. Retrieved 2018-10-12.
  11. 1 2 3 Desmond, John (2018-07-26). "Montreal-Toronto AI Startups Have Wide Range of Focus". AI Trends. Retrieved 2018-08-21.
  12. 1 2 3 "Botler.ai launches sexual harassment detection bot for U.S. and Canada". VentureBeat. 2017-12-06. Retrieved 2018-02-10.
  13. "Sexual Harassment Inc: How the #MeToo movement is sparking a wave of start-ups". Washington Post. Retrieved 2018-10-30.
  14. "Technology gives victims new reporting options". 2018-03-20. Retrieved 2018-03-30.
  15. 1 2 ICI.Radio-Canada.ca, Zone Techno-. "Un " avocat robot " pour aider les victimes d'agressions sexuelles". Radio-Canada.ca (in Canadian French). Retrieved 2019-06-13.
  16. 1 2 3 Mason, Quinn (2017-12-06). "Botler.ai brings justice to sexual harassment victims". Montreal in Technology. Retrieved 2018-09-18.
  17. "How a Montreal-made online tool helps sexual harassment victims navigate the legal system". CBC. 2017-12-27. Retrieved 2018-01-30.
  18. "Botler.AI's new chatbot analyzed 300,000 court documents to help sexual harassment and assault survivors". BetaKit. 2017-12-06. Retrieved 2019-04-23.
  19. "Botler AI Uses Deep Learning to Empower Sexual Harassment Victims". Techvibes. Retrieved 2019-09-23.