


Search engine optimization (SEO) is the process of affecting the visibility of a website or a web page in a web search engine's unpaid results - often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher on the search results page) and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. SEO may target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former is instead more focused on national or international searches.

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying its HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search. In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.





History

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return the information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, any weight for specific words, and all the links the page contains. All of this information is then placed into a scheduler for crawling at a later date.

Website owners recognized the value of high rankings and visibility in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona that SEO is a "process" involving manipulation of keywords and not a "marketing service".

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.

By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.
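As a rough illustration of how simple, and how easily inflated, the term-density signal was, the following sketch computes keyword density as the share of a page's words that match a given keyword. It is an illustrative example only, not a description of any particular engine's actual implementation, and the sample text is made up:

    # Minimal sketch: keyword density as the fraction of words matching a keyword.
    # Illustrative only; even early engines combined several signals.
    def keyword_density(text, keyword):
        words = text.lower().split()
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words)

    # A keyword-stuffed snippet yields an implausibly high density of roughly 0.44.
    print(keyword_density("cheap flights cheap hotels cheap deals on cheap travel", "cheap"))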

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, web chats, and seminars. The major search engines provide information and guidelines to help with website optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and tracks the index status of web pages.

Relationship with Google

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random web surfer.
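This can be made concrete with the commonly published simplified form of the PageRank recurrence. The short sketch below applies it by iteration to a tiny made-up link graph; it is purely illustrative and bears no resemblance to Google's production system:

    # Simplified PageRank recurrence: PR(p) = (1 - d)/N + d * sum, over pages q
    # linking to p, of PR(q) / outdegree(q). Illustrative sketch on a toy graph.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # Page "a" receives links from both other pages, so it ends up with the highest rank.
    print(pagerank({"a": ["b"], "b": ["a"], "c": ["a"]}))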

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions. Patents related to search engines can provide information to better understand search engines. In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.

In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the use of nofollow led to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up on Google more quickly than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which penalizes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query rather than to a few words. With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on them to be "trusted" authors.




Methods

Getting indexed

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links, in addition to its URL submission console. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
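As a sketch of what such an XML Sitemap feed contains, the following builds a minimal sitemap in the sitemaps.org 0.9 format using Python's standard library; the URLs and dates are placeholders, not taken from the article:

    # Minimal sketch: build a sitemaps.org 0.9 XML sitemap for a handful of URLs.
    import xml.etree.ElementTree as ET

    def build_sitemap(entries):
        urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for loc, lastmod in entries:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "lastmod").text = lastmod
        return ET.tostring(urlset, encoding="unicode")

    print(build_sitemap([("https://www.example.com/", "2018-06-19"),
                         ("https://www.example.com/about", "2018-06-01")]))

The resulting file is typically placed at the site root (for example /sitemap.xml), submitted through Search Console, or referenced from robots.txt.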

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Today, most people search on Google using a mobile device. In November 2016, Google announced a major change to the way it crawls websites and began making its index mobile-first, which means the mobile version of a website becomes the starting point for what Google includes in its index.

Preventing crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
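A minimal sketch of how a crawler interprets such rules, using Python's standard urllib.robotparser; the disallowed directories and example URLs below are hypothetical stand-ins for a shopping cart and an internal search page:

    # Minimal sketch: parse robots.txt rules and test whether URLs may be crawled.
    from urllib import robotparser

    rules = """
    User-agent: *
    Disallow: /cart/
    Disallow: /internal-search/
    """.splitlines()

    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("*", "https://www.example.com/products/widget"))  # True: not disallowed
    print(rp.can_fetch("*", "https://www.example.com/cart/checkout"))    # False: under /cart/

The per-page exclusion mentioned above is instead expressed with a robots meta tag (for example, one whose content is "noindex") placed in the page's HTML head.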

Increasing prominence

A variety of methods can increase the prominence of a web page within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
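As an illustration of the on-page elements mentioned above, the sketch below uses Python's standard html.parser to read the title, meta description, and canonical link from a page's HTML head; the sample markup and URL are hypothetical:

    # Minimal sketch: extract title, meta description, and canonical URL from HTML.
    from html.parser import HTMLParser

    class OnPageSignals(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.description = None
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and attrs.get("name") == "description":
                self.description = attrs.get("content")
            elif tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    sample = """<html><head>
    <title>Example Product Page</title>
    <meta name="description" content="A short, relevant summary of the page.">
    <link rel="canonical" href="https://www.example.com/product">
    </head><body>...</body></html>"""

    parser = OnPageSignals()
    parser.feed(sample)
    print(parser.title, parser.description, parser.canonical)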

White hat versus black hat techniques

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or that involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, placed in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 removal by Google of both BMW Germany and Ricoh Germany for the use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's listings.



As a marketing strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay per click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose regards prominence more than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public, which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device. Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which allow companies to measure their website against the search engine results and determine how user-friendly it is.

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes - almost 1.5 per day. It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic. In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.



International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007. As of 2006, Google had an 85-90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, the market share of Google in the UK was close to 90%, according to Hitwise. That market share is achieved in a number of countries.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex, and Seznam are the market leaders.

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top-level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.



Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted".

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.



See also

  • Search neutrality - the opposite of search manipulation
  • Blog network
  • List of search engines
  • Search engine marketing
  • Tracking repeater
  • Website promotion
  • User intent







External links

  • Web Development Promotion at Curlie (based on DMOZ)
  • Google Webmaster Guidelines
  • Yahoo! Webmaster Guidelines
  • Bing Webmaster Guidelines
  • "The Little Secrets of Search", an article in The New York Times (February 12, 2011)
  • Google I/O 2010 - SEO site suggestions from experts on YouTube - Technical tutorial on search engine optimization, provided at Google I/O 2010.

Source of the article: Wikipedia
