Those in Peril

Hardcover edition
Author: Wilbur Smith
Country: South Africa
Language: English
Subject: Somali pirates
Genre: Fiction
Publisher: Pan Macmillan
Publication date: 2011
Media type: Print
Pages: 386 pp.
ISBN: 978-0230529267
Preceded by: Assegai

Those in Peril is a 2011 novel by Wilbur Smith. The book focuses on billionaire Hazel Bannock, owner of the Bannock Oil Corp, and Major Hector Cross, an ex-SAS operative and owner of the security company Cross Bow Security. Cross's company has been contracted to protect Hazel Bannock and her business interests, and the story unfolds when Hazel's daughter is kidnapped by Somali pirates.

Background

Smith described the novel as "another pretty tale of sex, violence and mayhem with a few belly laughs thrown in for good measure." [1]

Smith wanted to write a stand-alone book about a mother being deprived of her adored child, then having to get help from one of his tough guys. "I wanted her to be tough, too," he says, "because I'm a feminist." [2]

When he owned an island in the Seychelles, he and his crew once came across Somali pirates.

They didn't acknowledge us – rare among sailors – but I've never forgotten this guy, tall and very handsome, as hypnotic as a black mamba, just standing there. His eyes were dead. I guess I was lucky they weren't kidnapping that day; I could so easily have been taken, but that image has stayed with me. I knew I would use it one day.... [Modern-day pirates are] using sophisticated radar equipment and offshore accounts. I think the disturbing thing about Somalia is the fact this is a country where there are few economic opportunities, so the pirates are seen as glamorous, and often held in awe by young boys who aspire to their lifestyle. [2]

Reception

In a review for the Express, Christopher Bray described the plot as fairly predictable and some of the descriptions as clichéd, but found the story exciting enough that he defied anyone who picked the book up to put it down. [3]

The reviewer of Publishers Weekly said that "the author's vast legions of fans should embrace the lurid action, the larger-than-life characters, and the heated prose with their usual enthusiasm." [4]

The film rights for Those in Peril were acquired by Reelart Media in 2012, but no film has resulted. [5]

Involvement in AI

The book was included in the Books3 dataset, part of The Pile, on which a number of large language models were trained, including EleutherAI’s GPT-J, [6] Microsoft and Nvidia’s Megatron-Turing NLG, [7] and Meta’s LLaMA. [8] The Rights Alliance later cited the book as an example [9] in DMCA takedown notices seeking the removal of copies of The Pile. [10]


References