SEO | SMO | Instant Approval Articles | Tips

Helpful Information about Search Engine Optimization and Social Media Optimization

Posts Tagged ‘Search Engine Optimization’

What is Google PageRank

Posted by selfsmo on December 4, 2009

Every SEO professional must understand the concept of Google PageRank (or Google PR). Google PageRank is a mathematical technique (based upon a formula) used by Google to determine the relevance or importance of any webpage. The system was developed by Google founders Larry Page and Sergey Brin at Stanford University in 1998. Google PageRank is the single most important of the more than 100 factors that affect your search engine rankings in Google. It is therefore essential to try to increase your Google PageRank as far as possible.

Google PageRank (also called Google PR, and a trademark owned by Google) is a Google technology that rates the significance of a webpage on the WWW. It is expressed as a numeric value from 0 to 10 that measures how important a webpage is: a webpage with a Google PR of 2 is more important than a webpage with a Google PR of just 1.

The foundation of the Google PR system of ranking webpages is the principle of voting. When one website links to another, it casts a vote in favor of that website. As more and more websites link to it, the votes accumulate and its Google PR goes up. Another important factor is who is linking: is it an important page, or a fresh page that has not even been indexed? The vote from an important page counts for more than the vote from an unimportant one. Therefore, Google calculates a page’s importance on the basis of the number of votes cast for the page and who is casting them.

Out-bound links also have a bearing on Google PageRank. Every webpage has some out-bound links, whether they point to other sites or to other pages of the same website. Out-bound links dilute a page’s ability to pass Google PR: the more out-bound links a page has, the less Google PR it passes through each one. It is quite possible for a link from a Google PR 6 page to pass less Google PR than a link from a PR 4 page, if the PR 6 page has far more out-bound links than the PR 4 page. If a webpage has 8 out-bound links, each link passes only one eighth of its available PageRank; if it has only one out-bound link, that link passes 100% of its available PageRank.
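As a quick toy calculation of this division (a simplified model for illustration, not Google’s actual algorithm):

```python
# Simplified model: a page splits its available PR evenly
# across its out-bound links.
available_pr = 6.0

# With 8 out-bound links, each link carries one eighth of the PR:
print(available_pr / 8)  # 0.75

# With a single out-bound link, the full PR flows through it:
print(available_pr / 1)  # 6.0
```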

To summarize, Google PR is based on:

  • The quantity of in-bound links to a page
  • The quality of in-bound links to a page (what is the Google PR of the pages on which the links reside)
  • Google PR only flows from the sender to the receiver; it is not possible to lose Google PR by linking to a low-PR site
  • The ability to pass PR also depends on the number of out-bound links on the page
  • The relevance of the content of the page to the subject matter

Google PR is determined for every page of a website: it may be one value for the homepage and different values for the inner pages, depending upon the factors above. This is a very simplified explanation of Google PageRank. There are over 100 other factors that determine search rankings, but none is as important as PR.

To check the Google PR of any webpage, download and install the Google Toolbar.

The green bar in the toolbar indicates the PR value from 0 to 10. If you are searching for the Google PR of a new site, you may have to wait: it can take up to 3 months for a page to get a PR after being indexed.

Below is a screen shot of the Google Toolbar. The green bar shows the PageRank as 10.

Google Toolbar


On the basis of what we have discussed above (which is in no way an exhaustive explanation of Google PR), we can take up a small and simplified scenario to understand Google PageRank.

Suppose you have 2 options for getting inbound links to your site: one is a PR 7 page with 3 out-bound links, and the other a PR 5 page with 2 out-bound links. Which would you choose? The answer should be the PR 5 page, as it will pass 5/2 = 2.5 PR, while the PR 7 page will pass 7/3 ≈ 2.3 PR. This scenario is depicted in the diagram below.

The PageRank of webpage 3 is 4.8 and can be calculated by adding the total PR flowing to it from webpage 1 and webpage 2, which is:

PR of webpage 3 = (PR of webpage 1)/2 + (PR of webpage 2)/3 = 2.5 + 2.3 = 4.8
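The same arithmetic can be checked with a short script (again using the article’s simplified no-damping model):

```python
def pr_passed(page_rank, outbound_links):
    """PR passed along each link: the page's PR divided by its out-bound links."""
    return page_rank / outbound_links

# Webpage 1: PR 5 with 2 out-bound links; webpage 2: PR 7 with 3.
from_page1 = pr_passed(5, 2)              # 2.5
from_page2 = pr_passed(7, 3)              # ~2.33
print(round(from_page1 + from_page2, 1))  # 4.8
```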

You can use the following rule of thumb to rate webpages based on their Google PR:

0 – Blacklisted or brand-new websites or webpages

1 – Very low PR (not much use getting linked from such pages)

2 – Low PR (not much use getting linked from such pages)

3 – Low-average PR (not a very good linking opportunity, but still worth a links exchange)

4 – Average PR (most established sites fall in this category – exchange links)

5 – Good PR (This could be a good links exchange)

6 – Very good PR (Exchanging links with this site would be like finding a rare gem)

7 and above – An excellent opportunity to get linked. Almost never comes for free.

While referring to the above rule of thumb, please keep in mind that you should not pass up an opportunity to get linked from a low-PR website if you think it could improve in PR or has good future prospects.


Posted in SEO, Tips | 4 Comments »

How do you create a Robots.txt file?

Posted by selfsmo on September 29, 2009

A Robots.txt file can be created in a simple text editor such as notepad. An example of a Robots.txt file is shown below:

User-agent: The name of the robot goes here, e.g. Googlebot

Disallow: The name of the file or directory goes here (this instruction excludes the listed files/directories from being indexed)

Here are some examples:

  • Below is an example of a Robots.txt file that disallows all webpages from being indexed:
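In standard robots exclusion syntax, where `*` matches every robot and `/` matches every path:

```
# Block all robots from the entire site
User-agent: *
Disallow: /
```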



  • Below is an example of a Robots.txt file that allows all webpages to be indexed:
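This uses an empty Disallow value, which excludes nothing:

```
# An empty Disallow allows all robots to index every page
User-agent: *
Disallow:
```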



  • Below is an example of a Robots.txt file that prevents the AltaVista robot, called “Scooter”, from accessing the Admin directory and the personal directory:
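A sketch of such a file; the exact directory paths (`/admin/` and `/personal/`) are assumptions based on the description above:

```
# Applies only to AltaVista's crawler, Scooter
User-agent: Scooter
Disallow: /admin/
Disallow: /personal/
```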




  • Below is an example of a Robots.txt file that instructs bots not to crawl any file ending in .PDF:
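One way to express this uses the `*` and `$` pattern-matching extensions, which Google and some other major crawlers support but which are not part of the original robots exclusion standard:

```
# Block any URL ending in .pdf (wildcard syntax is a crawler extension)
User-agent: *
Disallow: /*.pdf$
```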



The Robots.txt file can also have multiple sets of instructions for more than one bot. Each set of instructions should be separated by a blank line. There is only one Robots.txt file per website, placed in the site’s root directory.

  • Below is an example of a Robots.txt file that disallows Google from crawling any of the dynamically generated pages and allows the AltaVista Scooter bot to access every page:
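A sketch of such a file, assuming “dynamically generated pages” means URLs containing a query string (`?`); the `*` wildcard is a crawler extension, not part of the original standard:

```
# Googlebot: skip URLs containing a query string (dynamic pages)
User-agent: Googlebot
Disallow: /*?

# AltaVista's Scooter: an empty Disallow allows everything
User-agent: Scooter
Disallow:
```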





Posted in SMO | 6 Comments »

How to read search results on Google

Posted by selfsmo on June 11, 2009

The goal is to provide you with results that are clear and easy to read. The diagram below points out four features that are important to understanding the search results page:



  1. The title: The first line of any search result is the title of the webpage.
  2. The snippet: A description of or an excerpt from the webpage.
  3. The URL: The webpage’s address.
  4. Cached link: A link to an earlier, stored version of the page. Use it if the page you wanted isn’t available.

All these features are important in determining whether the page is what you need. The title is what the author of the page designated as the best short description of the page.

The snippet is Google’s algorithmic attempt to extract just the part of the page most relevant to your query. The URL tells you about the site in general.

Posted in SEO | Leave a Comment »