Backlink

A backlink is a link from some other website (the referrer) to a given web resource (the referent). [1] A web resource may be (for example) a website, web page, or web directory. [1]

A backlink is a reference comparable to a citation. [2] The quantity, quality, and relevance of backlinks to a web page are among the factors that search engines such as Google evaluate in order to estimate how important the page is. [3] [4] PageRank calculates a score for each web page based on how all web pages link to one another, and is one of the variables that Google Search uses to determine how high a web page should appear in search results. [5] This weighting of backlinks is analogous to citation analysis of books, scholarly papers, and academic journals. [1] [4] A Topical PageRank has also been researched and implemented, which gives more weight to backlinks coming from pages on the same topic as the target page. [6]
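
To make the idea concrete, the following Python sketch computes PageRank-style scores for a toy link graph by power iteration. It is only an illustration of the principle, not Google's implementation; the graph, damping factor, and iteration count are invented for the example.

    # Minimal PageRank sketch over a toy link graph (illustrative only).
    # Each key is a page; its list holds the pages it links to.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }

    damping = 0.85                       # chance of following a link rather than jumping randomly
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):                  # power iteration until scores stabilise
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)   # a page splits its score among its outgoing links
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank

    print(rank)                          # page "C", with the most backlinks, ends up with the highest score

Because "C" is linked to by "A", "B", and "D", it finishes with the highest score, mirroring the intuition that backlinks act as votes of importance.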

Other terms for backlink include incoming link, inbound link, inlink, inward link, and citation. [1]

Wikis

Backlinks are offered in wikis, but usually only within the bounds of the wiki itself and enabled by the database backend. [7] MediaWiki specifically offers the "What links here" tool; some older wikis, notably the first WikiWikiWeb, exposed the backlink functionality through the page title. [8]
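
The "What links here" idea amounts to inverting a table of outgoing links. The short Python sketch below illustrates this with an invented table layout; it is not MediaWiki's actual database schema.

    # Sketch of a wiki backlink ("What links here") lookup.
    # The (source_page, target_page) rows are an invented stand-in for
    # the links table a wiki engine keeps in its database backend.
    from collections import defaultdict

    link_table = [
        ("Backlink", "PageRank"),
        ("Backlink", "Anchor text"),
        ("Link farm", "Backlink"),
        ("Spamdexing", "Backlink"),
    ]

    # Invert outgoing links into a backlink index keyed by target page
    what_links_here = defaultdict(list)
    for source, target in link_table:
        what_links_here[target].append(source)

    print(what_links_here["Backlink"])   # ['Link farm', 'Spamdexing']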

Search engine rankings

Search engines often use the number of backlinks that a website has as one of the most important factors for determining that website's search engine ranking, popularity, and importance. [9] Google's description of its PageRank system (January 1998), for instance, noted that "Google interprets a link from page A to page B as a vote, by page A, for page B." [10] Knowledge of this form of search engine ranking has fueled a portion of the search engine optimization (SEO) industry commonly termed link spam, in which a company attempts to place as many inbound links as possible to its site regardless of the context of the originating site. [11] In January 2017, Google launched the Penguin 4 update, which devalued such link spam practices. [11]

Search engine rankings are regarded as a crucial parameter in online business and for the conversion rate of visitors to a website, particularly in online shopping. [12] Blog commenting, guest blogging, article submission, press release distribution, social media engagement, and forum posting can be used to increase backlinks.

Websites often employ SEO techniques to increase the number of backlinks pointing to their website. Some methods are free for use by everyone, whereas others, like linkbaiting, require quite a bit of planning and marketing to work. [13] There are also paid techniques to increase the number of backlinks to a target site. For example, private blog networks can be used to purchase backlinks. It has been estimated that the average cost of buying a link in 2019 was $291.55, or $391.55 when marketing blogs were excluded from the calculation. [14]

There are several factors that determine the value of a backlink. Backlinks from authoritative sites on a given topic are highly valuable. If both the linking site and the linking page have content geared toward the topic, the backlink is considered relevant and is believed to strongly influence the search engine rankings of the web page granted the backlink. [15] A backlink represents a favorable 'editorial vote' for the receiving webpage from the granting webpage. Another important factor is the anchor text of the backlink. Anchor text is the descriptive label of the hyperlink as it appears on a web page. [16] Search engine bots (i.e., spiders, crawlers, etc.) examine the anchor text to evaluate how relevant it is to the content on the webpage. Backlinks can be generated by submissions, such as directory submissions, forum submissions, social bookmarking, business listings, and blog submissions. Congruency between anchor text and webpage content is heavily weighted in search engine results page (SERP) rankings of a webpage for any given keyword query by a search engine user. [17]
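
Anchor text is read directly from the linking page's markup. As a rough illustration of what a crawler records for each link, the following sketch uses Python's standard html.parser to collect (anchor text, target URL) pairs; the sample HTML is invented.

    # Sketch: collect (anchor_text, href) pairs from a page's HTML,
    # roughly the signal a crawler associates with each outgoing link.
    from html.parser import HTMLParser

    class AnchorCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []      # collected (anchor_text, href) pairs
            self._href = None
            self._text = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href")
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                self.links.append(("".join(self._text).strip(), self._href))
                self._href = None

    html = '<p>Read our <a href="https://example.com/guide">backlink guide</a> today.</p>'
    parser = AnchorCollector()
    parser.feed(html)
    print(parser.links)   # [('backlink guide', 'https://example.com/guide')]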

Changes to the algorithms that produce search engine rankings can place a heightened focus on relevance to a particular topic. While some backlinks might be from sources containing highly valuable metrics, they could also be unrelated to the consumer's query or interest. [18] An example of this would be a link from a popular shoe blog (with valuable metrics) to a site selling vintage pencil sharpeners. While the link appears valuable, it provides little to the consumer in terms of relevance.

Related Research Articles

Meta elements are tags used in HTML and XHTML documents to provide structured metadata about a Web page. They are part of a web page's head section. Multiple Meta elements with different attributes can be used on the same page. Meta elements can be used to specify page description, keywords and any other metadata not provided through the other head elements and attributes.

Spamdexing is the deliberate manipulation of search engine indexes. It involves a number of methods, such as link building and repeating unrelated phrases, to manipulate the relevance or prominence of resources indexed in a manner inconsistent with the purpose of the indexing system.

Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines. SEO targets unpaid traffic rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search, news search, and industry-specific vertical search engines.

On the World Wide Web, a link farm is any group of websites that all hyperlink to other sites in the group for the purpose of increasing SEO rankings. In graph theoretic terms, a link farm is a clique. Although some link farms can be created by hand, most are created through automated programs and services. A link farm is a form of spamming the index of a web search engine. Other link exchange systems are designed to allow individual websites to selectively exchange links with other relevant websites, and are not considered a form of spamdexing.
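
To make the graph-theoretic point concrete, a group of sites is a clique when every site links to every other site in the group. The Python sketch below checks that property for an invented set of sites; the domain names and link sets are purely illustrative.

    # Sketch: test whether a group of sites forms a clique, i.e. every site
    # links to every other site in the group (the pattern of a link farm).
    outlinks = {
        "site-a.example": {"site-b.example", "site-c.example"},
        "site-b.example": {"site-a.example", "site-c.example"},
        "site-c.example": {"site-a.example", "site-b.example"},
    }

    def is_clique(group, outlinks):
        return all(
            other in outlinks.get(site, set())
            for site in group
            for other in group
            if other != site
        )

    print(is_clique(list(outlinks), outlinks))   # True: every site links to every other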

Findability is the ease with which information contained on a website can be found, both from outside the website and by users already on the website. Although findability has relevance outside the World Wide Web, the term is usually used in that context. Most relevant websites do not come up in the top results because designers and engineers do not cater to the way ranking algorithms work currently. Its importance can be determined from the first law of e-commerce, which states "If the user can’t find the product, the user can’t buy the product." As of December 2014, out of 10.3 billion monthly Google searches by Internet users in the United States, an estimated 78% are made to research products and services online.

The anchor text, link label or link text is the visible, clickable text in an HTML hyperlink. The term "anchor" was used in older versions of the HTML specification for what is currently referred to as the a element, or <a>. The HTML specification does not have a specific term for anchor text, but refers to it as "text that the a element wraps around". In XML terms, the anchor text is the content of the element, provided that the content is text.

Keyword stuffing is a search engine optimization (SEO) technique, considered webspam or spamdexing, in which keywords are loaded into a web page's meta tags, visible content, or backlink anchor text in an attempt to gain an unfair rank advantage in search engines. Keyword stuffing may lead to a website being temporarily or permanently banned or penalized on major search engines. The repetition of words in meta tags may explain why many search engines no longer use these tags. Search engines now place more weight on content that is unique, comprehensive, relevant, and helpful, which makes keyword stuffing largely ineffective, though it is still practiced by many webmasters.

Click-through rate (CTR) is the ratio of clicks on a specific link to the number of times a page, email, or advertisement is shown. It is commonly used to measure the success of an online advertising campaign for a particular website, as well as the effectiveness of email campaigns.
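
As a worked example of the ratio (the numbers are invented): a link clicked 40 times out of 2,000 impressions has a CTR of 40 / 2000 = 2%.

    # Click-through rate: clicks divided by impressions (figures invented).
    clicks = 40
    impressions = 2000
    ctr = clicks / impressions
    print(f"CTR = {ctr:.2%}")   # CTR = 2.00%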

Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) primarily through paid advertising. SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages to enhance pay per click (PPC) listings and increase the Call to action (CTA) on the website.

A scraper site is a website that copies content from other websites using web scraping. The content is then mirrored with the goal of creating revenue, usually through advertising and sometimes by selling user data.

The Sandbox effect is a theory about the way Google ranks web pages in its index. It is the subject of much debate—its existence has been written about since 2004, but not confirmed, with several statements to the contrary.

An SEO contest is a prize activity that challenges search engine optimization (SEO) practitioners to achieve high ranking under major search engines such as Google, Yahoo, and MSN using certain keyword(s). This type of contest is controversial because it often leads to massive amounts of link spamming as participants try to boost the rankings of their pages by any means available. The SEO competitors hold the activity without the promotion of a product or service in mind, or they may organize a contest in order to market something on the Internet. Participants can showcase their skills and potentially discover and share new techniques for promoting websites.

A search engine results page (SERP) is a webpage that is displayed by a search engine in response to a query by a user. The main component of a SERP is the listing of results that are returned by the search engine in response to a keyword query.

Keyword research is a practice search engine optimization (SEO) professionals use to find and analyze search terms that users enter into search engines when looking for products, services, or general information. Keywords are related to search queries.

In the field of search engine optimization (SEO), link building describes actions aimed at increasing the number and quality of inbound links to a webpage with the goal of increasing the search engine rankings of that page or website. Briefly, link building is the process of establishing relevant hyperlinks to a website from external sites. Link building can increase the number of high-quality links pointing to a website, in turn increasing the likelihood of the website ranking highly in search engine results. Link building is also a proven marketing tactic for increasing brand awareness.

The domain authority of a website describes its relevance for a specific subject area or industry; Domain Authority is also a search engine ranking score developed by Moz. This relevance has a direct impact on a website's ranking by search engines, which try to assess domain authority through automated analytic algorithms. The influence of domain authority on website listings in the search engine results pages (SERPs) led to the birth of a whole industry of black-hat SEO providers who try to feign an increased level of domain authority. The ranking by major search engines, e.g., Google's PageRank, is agnostic of specific industries or subject areas and assesses a website in the context of the totality of websites on the Internet. The results on the SERP set the PageRank in the context of a specific keyword. In a less competitive subject area, even websites with a low PageRank can achieve high visibility in search engines, as the highest-ranked sites matching specific search words occupy the top positions in the SERPs.

A website audit is a full analysis of all the factors that affect a website's visibility in search engines. This standard method gives complete insight into any website, its overall traffic, and its individual pages. Website audits are completed solely for marketing purposes; the goal is to detect weak points in campaigns that affect web performance.

User intent, otherwise known as query intent or search intent, is the identification and categorization of what a user online intended or wanted to find when they typed their search terms into an online web search engine for the purpose of search engine optimisation or conversion rate optimisation. Examples of user intent are fact-checking, comparison shopping or navigating to other websites.

Local search engine optimization is similar to (national) SEO in that it is also a process affecting the visibility of a website or a web page in a web search engine's unpaid results often referred to as "natural", "organic", or "earned" results. In general, the higher ranked on the search results page and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. Local SEO, however, differs in that it is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when users enter local searches for its products or services. Ranking for local search involves a similar process to general SEO but includes some specific elements to rank a business for local search.

Google Lighthouse is an open-source, automated tool for measuring the quality of web pages. It can be run against any web page, whether public or requiring authentication. Google Lighthouse audits the performance, accessibility, and search engine optimization factors of web pages; unlike Google PageSpeed, it provides more detailed information. It also includes the ability to test progressive web applications for compliance with standards and best practices. Google Lighthouse is developed by Google to help web developers; the tool can be run as a Chrome browser extension or from the command line for batch auditing a list of URLs. As of 15 May 2015, Google's recommendation is to use the online version of PageSpeed Insights.

References

  1. Björneborn, Lennart; Ingwersen, Peter (2004). "Toward a Basic Framework for Webometrics". Journal of the Association for Information Science and Technology. 55 (14): 1218. CiteSeerX 10.1.1.94.1691. doi:10.1002/asi.20077. Archived from the original on 2014-09-10. Retrieved 2020-05-27.
  2. Goh, Dion; Foo, Schubert (2007). Social Information Retrieval Systems: Emerging Technologies and Applications for Searching the Web Effectively: Emerging Technologies and Applications for Searching the Web Effectively. Information Science Reference. p. 132. ISBN   978-1-59904-543-6.
  3. "About Search". Archived from the original on 2011-11-04. Retrieved 2016-04-20.
  4. Lingras, Pawan; Akerkar, Rajendra (10 March 2010). "Web Structure Mining § PageRank Algorithm". Building an Intelligent Web: Theory and Practice. Jones & Bartlett Publishers. p. 294. ISBN 978-1-4496-6322-3.
  5. Olsen, Martin (20 May 2010). "Maximizing PageRank with New Backlinks". In Diaz, Josep; Calamoneri, Tiziana (eds.). Algorithms and Complexity: 7th International Conference, CIAC 2010, Rome, Italy, May 26–28, 2010, Proceedings. Berlin: Springer Science & Business Media. p. 37. ISBN   978-3-642-13072-4. OCLC   873382847.
  6. Nie, Lan; Davison, Brian D.; Qi, Xiaoguang (2006). "Topical link analysis for web search" . Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval. SIGIR '06. New York, NY, US: ACM. pp.  91–98. doi:10.1145/1148170.1148189. ISBN   978-1595933690. S2CID   2877831.
  7. Sáez-Trumper, Diego (2020-04-16). "Open data and COVID-19: Wikipedia as an informational resource during the pandemic". Medium . Retrieved 2020-05-27.
  8. Ebersbach, Anja; Glaser, Markus; Heigl, Richard (2006). Wiki: Web Collaboration. Springer-Verlag Berlin Heidelberg. p. 111. doi:10.1007/3-540-29267-5. ISBN   978-3-540-25995-4.
  9. Jones, Kristopher (2018-01-24). "How to Push Great Content that Isn't Ranking Well". searchenginejournal.com. Retrieved 2020-05-27.
  10. "Google's overview of PageRank" (PDF). Archived from the original (PDF) on 10 March 2020. Retrieved 6 October 2014.
  11. Misra, Parth (2017-01-27). "The Invisible Threat of 'Black Hat' SEO to Your Company's Reputation". Entrepreneur. Retrieved 2020-05-27.
  12. Chasinov, Nick (2019-04-05). "How Entrepreneurs Can Beat Amazon at Organic Search". Entrepreneur . Retrieved 2020-05-27.
  13. Taylor, Gabriela (2013). Give Your Marketing a Digital Edge. Global & Digital. p. 171. ISBN   978-1-909924-30-7.
  14. Bucciachio, Vincent (2019-12-08). "What's the Cost of Buying Links in 2020? We Contacted 1,950 Blogs to Uncover the Truth". sociallyinfused.com. Retrieved 2020-05-27.
  15. Griffin, Fran (2019-06-07). "What does the modern PR professional look like in 2019?". PRWeek . Retrieved 2020-05-27.
  16. Loop, Matthew (2016). Social Media Made Me Rich: Here's How it Can do the Same for You. Morgan James Publishing. p. 129. ISBN   978-1-63047-793-6.
  17. "Anchor Text As A Google Ranking Factor: Everything You Need to Know". Search Engine Journal. 2021-10-03. Retrieved 2023-06-25.
  18. Glazier, Alan (2011). Searchial Marketing:: How Social Media Drives Search Optimization in Web 3.0. Author House. p. 80. ISBN   978-1-4567-3892-1.