In the field of search engine optimization (SEO), link building describes actions aimed at increasing the number and quality of inbound links to a webpage with the goal of increasing the search engine rankings of that page or website. [1] Briefly, link building is the process of establishing relevant hyperlinks (usually called links) to a website from external sites. Link building can increase the number of high-quality links pointing to a website, in turn increasing the likelihood of the website ranking highly in search engine results. Link building is also a proven marketing tactic for increasing brand awareness. [2]
Editorial links are links that are not acquired by paying money, asking, trading, or exchanging. They are attracted by a website's good content and marketing strategies: other website owners give them naturally, without being asked. [3]
Resource links are a category of links, which can be either one-way or two-way, usually referenced as "Resources" or "Information" in navbars, but sometimes, especially in the early, less compartmentalized years of the Web, simply called "links". They are hyperlinks to a website or a specific web page whose content is believed to be beneficial, useful, and relevant to visitors of the site establishing the link.
In recent years, resource links have grown in importance because most major search engines have made it plain that—in Google's words—"quantity, quality, and relevance of links count towards your rating". [4]
Search engines measure a website's value and relevance by analyzing the links to the site from other websites. The resulting "link popularity" is a measure of the number and quality of links to a website, and it is an integral part of a website's ranking in search engines. Search engines examine each link to a particular website to determine its value. Although every link to a website is a vote in its favor, not all votes are counted equally. A link from a website with subject matter similar to the linked site carries more weight than one from an unrelated site, and a link from a well-regarded website (such as a university) has higher quality than one from an unknown or disreputable website. [5]
The text of links helps search engines categorize a website. The engines' insistence that resource links be relevant and beneficial developed because many artificial link building methods were employed solely to spam search engines, i.e. to "fool" the engines' algorithms into awarding the sites employing these unethical tactics undeservedly high rankings.
Google has cautioned site developers to avoid "free-for-all" links, link-popularity schemes, and the submission of a site to thousands of search engines, given that these tactics are typically useless exercises that do not affect the ranking of a site in the results of the major search engines. [6] For many years now, the major search engines have deployed technology designed to "red flag" and potentially penalize sites employing such practices. [7]
Acquired links are links obtained by the website owner through payment or distribution; unlike editorial links, they are not organically earned. Such links include link advertisements, paid linking, article distribution, directory links, and comments on forums, blogs, articles, and other interactive forms of social media. [8]
A reciprocal link is a mutual link between two objects, commonly between two websites, to ensure mutual traffic. For example, Alice and Bob have websites. If Bob's website links to Alice's website and Alice's website links to Bob's website, the websites are reciprocally linked. Website owners often submit their sites to reciprocal link exchange directories in order to achieve higher rankings in the search engines. Reciprocal linking between websites is no longer an important part of the search engine optimization process. In 2005, with their Jagger 2 update, Google stopped giving credit to reciprocal links, as they do not indicate genuine link popularity. [9]
User-generated content such as blog and forum comments with links can drive valuable referral traffic if it is well thought out and relevant to the discussion. [10] However, these links almost always carry the Nofollow or the newer ugc attribute, which signals that search engines should not count them for ranking purposes. [11]
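The effect of these rel attributes can be sketched in code. The following example, using only Python's standard-library html.parser, classifies links the way a crawler might: links marked nofollow or ugc are set aside rather than counted. The HTML snippet and class name are illustrative, not any search engine's actual implementation.

```python
# Sketch: separating comment links by their rel attribute, as described above.
from html.parser import HTMLParser

class RelClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []   # links eligible for ranking credit
        self.ignored = []    # links marked nofollow or ugc

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").split()
        if "nofollow" in rel or "ugc" in rel:
            self.ignored.append(attrs.get("href"))
        else:
            self.followed.append(attrs.get("href"))

comment_html = (
    '<a href="https://example.com/a">editorial link</a>'
    '<a rel="nofollow" href="https://example.com/b">paid link</a>'
    '<a rel="ugc" href="https://example.com/c">forum comment link</a>'
)
parser = RelClassifier()
parser.feed(comment_html)
```

Here only the first link would pass ranking credit; the nofollow and ugc links are recorded but not counted.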
Website directories are lists of links to websites which are sorted into categories. Website owners can submit their site to many of these directories. Some directories accept payment for listing in their directory while others are free.
Social bookmarking is a way of saving and categorizing web pages in a public location on the web. Because bookmarks have anchor text and are shared and stored publicly, they are scanned by search engine crawlers and have search engine optimization value.
Image linking is a way of submitting images, such as infographics, to image directories and linking them back to a specific URL.
Guest blogging, also known as guest posting, is a popular SEO technique that consists of writing a piece of content for another website with the goal of gaining more visibility and, possibly, a link back to the author's website. According to Google, such links are considered unnatural and should generally carry the Nofollow attribute. [12]
In early incarnations, when Google's algorithm relied on incoming links as an indicator of website success, black hat SEOs manipulated website rankings by creating link-building schemes, such as building subsidiary websites to send links to a primary website. With an abundance of incoming links, the primary website outranked many reputable sites. However, sites built up this way risked being devalued by major search engines, particularly when their owners also used other black hat strategies. Black hat link building refers explicitly to the process of acquiring as many links as possible with minimal effort.
The Penguin algorithm was created to eliminate this type of abuse. At the time, Google clarified its definition of a "bad" link: “Any links intended to manipulate a site’s ranking in Google search results may be considered part of a link scheme.”
With Penguin, it was not the quantity of links that improved a site's rankings but their quality. Since then, Google's web spam team has attempted to prevent the manipulation of its search results through link building. Major brands including J.C. Penney, BMW, Forbes, Overstock.com, and many others have received severe penalties to their search rankings for employing spammy and non-user-friendly link building tactics. [13]
On October 5, 2014, Google launched a new algorithm update, Penguin 3.0, to penalize sites that use black hat link building tactics to build unnatural links and manipulate search engines. The update affected about 0.3% of English-language queries worldwide. [14]
Black hat SEO is also referred to as spamdexing, which utilizes other black hat SEO strategies and link building tactics. [15] Some black hat link building strategies include acquiring unqualified links from link farms, participating in link schemes, and using doorway pages. [6] Black hat SEO can also refer to "negative SEO", the practice of deliberately harming another website's performance.
White hat link building strategies are those that add value to end users, abide by Google's terms of service, and produce results that can be sustained for a long time. White hat link building focuses on producing high-quality, relevant links to the website. Although more difficult to acquire, white hat links are widely pursued by website owners because such strategies benefit both their websites' long-term development and the overall online environment.
Meta elements are tags used in HTML and XHTML documents to provide structured metadata about a web page. They are part of a web page's head section. Multiple meta elements with different attributes can be used on the same page. Meta elements can be used to specify the page description, keywords, and any other metadata not provided through the other head elements and attributes.
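The metadata that meta elements carry can be read programmatically. The following sketch uses Python's standard-library html.parser to collect name/content pairs from a page's head; the page content itself is a made-up example.

```python
# Sketch: extracting the metadata provided by meta elements, as described above.
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}   # maps the meta element's name to its content

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

page = """<html><head>
<meta name="description" content="A guide to link building.">
<meta name="keywords" content="SEO, links, ranking">
</head><body></body></html>"""

reader = MetaReader()
reader.feed(page)
```

After feeding the page, reader.meta holds the description and keywords exactly as a crawler would see them.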
Spamdexing is the deliberate manipulation of search engine indexes. It involves a number of methods, such as link building and repeating unrelated phrases, to manipulate the relevance or prominence of resources indexed in a manner inconsistent with the purpose of the indexing system.
A web directory or link directory is an online list or catalog of websites. That is, it is a directory on the World Wide Web of the World Wide Web. Historically, directories typically listed entries on people or businesses, and their contact information; such directories are still in use today. A web directory includes entries about websites, including links to those websites, organized into categories and subcategories. Besides a link, each entry may include the title of the website, and a description of its contents. In most web directories, the entries are about whole websites, rather than individual pages within them. Websites are often limited to inclusion in only a few categories.
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines. SEO targets unpaid traffic rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search, news search, and industry-specific vertical search engines.
On the World Wide Web, a link farm is any group of websites that all hyperlink to other sites in the group for the purpose of increasing SEO rankings. In graph theoretic terms, a link farm is a clique. Although some link farms can be created by hand, most are created through automated programs and services. A link farm is a form of spamming the index of a web search engine. Other link exchange systems are designed to allow individual websites to selectively exchange links with other relevant websites, and are not considered a form of spamdexing.
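The graph-theoretic point above, that a link farm forms a clique, can be checked with a few lines of code. In this sketch a link graph is a dictionary mapping each site to the set of sites it links to; the site names are invented for illustration.

```python
# Sketch: a link farm is a clique, i.e. every site links to every other site.
def is_clique(links):
    """links: dict mapping each site to the set of sites it links to."""
    sites = set(links)
    # Each site must link to all other sites in the group (extra links allowed).
    return all(links[s] >= sites - {s} for s in sites)

# Every pair of farm sites links both ways, so the group is a clique.
farm = {
    "site1": {"site2", "site3"},
    "site2": {"site1", "site3"},
    "site3": {"site1", "site2"},
}

# An ordinary pair of sites with a one-way link is not a clique.
honest = {"blog": {"news"}, "news": set()}
```

Calling is_clique(farm) returns True, while is_clique(honest) returns False, matching the definition in the paragraph above.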
Googlebot is the web crawler software used by Google that collects documents from the web to build a searchable index for the Google Search engine. This name is actually used to refer to two different types of web crawlers: a desktop crawler and a mobile crawler.
Website promotion is the continuing process used by webmasters to improve content and increase exposure of a website to bring more visitors. Many techniques such as search engine optimization and search engine submission are used to increase a site's traffic once content is developed.
A backlink is a link from another website to a given web resource. A web resource may be a website, web page, or web directory.
The anchor text, link label, or link text is the visible, clickable text in an HTML hyperlink. The term "anchor" was used in older versions of the HTML specification for what is currently referred to as the a element, or <a>. The HTML specification does not have a specific term for anchor text, but refers to it as "text that the a element wraps around". In XML terms, the anchor text is the content of the element, provided that the content is text.
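Extracting anchor text, the content an a element wraps around, is straightforward with a parser. This sketch uses Python's standard-library html.parser; the sample sentence is invented for illustration.

```python
# Sketch: collecting anchor text (the content of each <a> element).
from html.parser import HTMLParser

class AnchorTextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.anchors = []   # one string of anchor text per <a> element

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_anchor = False

    def handle_data(self, data):
        if self.in_anchor:
            self.anchors[-1] += data

extractor = AnchorTextExtractor()
extractor.feed('Read the <a href="https://example.com/seo">link building guide</a>.')
```

Only the text inside the a element is collected; the surrounding prose is ignored, which is why anchor text is such a strong categorization signal.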
Keyword stuffing is a search engine optimization (SEO) technique, considered webspam or spamdexing, in which keywords are loaded into a web page's meta tags, visible content, or backlink anchor text in an attempt to gain an unfair rank advantage in search engines. Keyword stuffing may lead to a website being temporarily or permanently banned or penalized on major search engines. The repetition of words in meta tags may explain why many search engines no longer use these tags. Search engines now favor content that is unique, comprehensive, relevant, and helpful, which makes keyword stuffing largely ineffective, although many webmasters still practice it.
Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) primarily through paid advertising. SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages to enhance pay per click (PPC) listings and increase the Call to action (CTA) on the website.
Web content development is the process of researching, writing, gathering, organizing, and editing information for publication on websites. Website content may consist of prose, graphics, pictures, recordings, movies, or other digital assets that could be distributed by a hypertext transfer protocol server, and viewed by a web browser.
The Sandbox effect is a theory about the way Google ranks web pages in its index. It is the subject of much debate—its existence has been written about since 2004, but not confirmed, with several statements to the contrary.
nofollow is a setting on a web page hyperlink that directs search engines not to use the link for page ranking calculations. It is specified in the page as a type of link relation; that is: <a rel="nofollow" ...>. Because search engines often calculate a site's importance according to the number of hyperlinks from other sites, the nofollow setting allows website authors to indicate that the presence of a link is not an endorsement of the target site's importance.
Danny Sullivan is an American technologist, journalist, and entrepreneur. In 1997 he founded Search Engine Watch, one of the earliest online publications about search engine marketing. He also launched Search Engine Strategies, one of the earliest search marketing trade shows. After selling both companies in 2006, he co-founded Search Engine Land, another search marketing publication. In 2017, he joined Google as an adviser in the company's search division.
PageRank (PR) is an algorithm used by Google Search to rank web pages in their search engine results. It is named after both the term "web page" and co-founder Larry Page. PageRank is a way of measuring the importance of website pages. According to Google:
PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.
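The idea quoted above, that a page's importance is estimated from the number and quality of links pointing to it, can be illustrated with a minimal power-iteration sketch. The graph, damping factor, and iteration count below are illustrative textbook choices, not Google's actual parameters.

```python
# Minimal power-iteration sketch of the PageRank idea described above.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                               # share rank among outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Page C receives links from both A and B, so it ends up with the highest
# score; B receives no links at all, so it ends up with the lowest.
scores = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
```

Each iteration redistributes every page's score across its outbound links, so a page linked to by many (or by highly ranked) pages accumulates a higher score, exactly the "rough estimate of importance" the quotation describes.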
Google Penguin was a codename for a Google algorithm update first announced on April 24, 2012. The update was aimed at decreasing the search engine rankings of websites that violate Google's Webmaster Guidelines by using techniques, now classified as grey hat SEM, that artificially increase a webpage's ranking by manipulating the number of links pointing to it. Such tactics are commonly described as link schemes. According to Google's John Mueller, as of 2013 Google announced all updates to the Penguin filter to the public.
The domain authority of a website describes its relevance for a specific subject area or industry. Domain Authority is a search engine ranking score developed by Moz. Search engines try to assess domain authority through automated analytic algorithms, and this relevance has a direct impact on a site's ranking. The relevance of domain authority to website listings in the search engine results pages (SERPs) led to the birth of a whole industry of black hat SEO providers who try to feign an increased level of domain authority. Ranking by major search engines, e.g. Google's PageRank, is agnostic of specific industries or subject areas and assesses a website in the context of the totality of websites on the Internet. The results on the SERP set the PageRank in the context of a specific keyword. In a less competitive subject area, even websites with a low PageRank can achieve high visibility, as the highest-ranked sites matching specific search words occupy the first positions in the SERPs.
A website audit is a full analysis of all the factors that affect a website's visibility in search engines. This standard method gives complete insight into any website's overall traffic and individual pages. Website audits are completed solely for marketing purposes; the goal is to detect weak points in campaigns that affect web performance.
Local search engine optimization is similar to (national) SEO in that it is also a process affecting the visibility of a website or a web page in a web search engine's unpaid results often referred to as "natural", "organic", or "earned" results. In general, the higher ranked on the search results page and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. Local SEO, however, differs in that it is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when users enter local searches for its products or services. Ranking for local search involves a similar process to general SEO but includes some specific elements to rank a business for local search.