SEO| SMO| Instant Approval Articles | Tips

Helpful Information about Search Engine Optimization and Social Media Optimization

What is Google PageRank

Posted by selfsmo on December 4, 2009

Every SEO professional must understand the concept of Google PageRank (or Google PR). Google PageRank is a mathematical technique (based upon a formula) used by Google for determining the relevance or importance of any webpage. The system was developed by Google founders Larry Page and Sergey Brin at Stanford University in 1998. Google PageRank is the single most important of the more than 100 factors that affect your search engine rankings in Google. It is therefore essential to try to increase your Google PageRank as far as possible.

Google PageRank (also called Google PR; the name is a trademark of Google) is a Google technology that rates the significance of a webpage on the WWW. It is basically a numeric value from 0 to 10 that measures how important a webpage is. A webpage with a Google PR of 2 is more important than a webpage with a Google PR of just 1.

The foundation of the Google PR system of ranking webpages is the principle of voting. When a website links to another website, it passes a vote in favor of that website. As more and more websites link to it, the votes keep increasing, and the Google PR also goes up. Another important factor is who is linking. Is an important page linking, or just a fresh page that has not been indexed? The vote from an important page counts for more than the vote from an unimportant page. Therefore, Google calculates a page’s importance on the basis of the number of votes cast for the page and who is casting them.

The out-bound links also have a bearing on the Google PageRank. Every webpage has some out-bound links, whether they go off to other sites or to other pages of the same website. Each out-bound link reduces the share of Google PR the page passes through any one link: the more out-bound links a page has, the less Google PR it passes per link. It is quite possible that a link from a Google PR 6 page passes less Google PR than a link from a PR 4 page, if the PR6 page has far more out-bound links than the PR4 page. If a webpage has 8 out-bound links, each link will only pass one eighth of its available PageRank. If a page has only one out-bound link, that link will pass 100% of its available PageRank.

To summarize, Google PR is based on:

  • The quantity of in-bound links to a page
  • The quality of in-bound links to a page (what is the Google PR of the pages on which the links reside)
  • The Google PR only flows from the sender to the receiver; it is not possible to lose Google PR by linking to a low-PR site
  • The ability to pass PR also depends on the number of out-bound links on the page
  • The relevance of the content of the page to the subject matter

The Google PR is determined for every page of the website. It may be different for the homepage and different for the inner pages, depending upon the above factors. This is a very simplified explanation of Google PageRank. There are over 100 other factors that determine search engine rankings, but none is as important as PR.
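To make the voting principle above concrete, here is a minimal sketch of the iterative PageRank calculation from Page and Brin's original paper, run on a hypothetical three-page site. The page names and link graph are invented for the example; the damping factor of 0.85 is the value suggested in the paper.

```python
# Hypothetical three-page site: each page lists the pages it links to.
links = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about"],
}

d = 0.85                                # damping factor from the paper
pr = {page: 1.0 for page in links}      # start every page at PR 1.0

for _ in range(50):                     # iterate until the values settle
    pr = {
        page: (1 - d) + d * sum(pr[src] / len(links[src])
                                for src in links if page in links[src])
        for page in links
    }

print({page: round(value, 2) for page, value in pr.items()})
# → {'home': 1.3, 'about': 1.0, 'blog': 0.7}
```

Note how "home", which every other page votes for, ends up with the highest PR, while "blog", which receives only half of home's vote, ends up with the lowest.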

To check the Google PR of any webpage, download the Google Toolbar from http://toolbar.google.com/

The green bar in the toolbar indicates the value of the PR from 0 to 10. If you are searching for the Google PR of a new site, you may have to wait, as it can take up to 3 months for a site to get a PR after being indexed.

Below is a screen shot of the Google Toolbar. The green bar shows the PageRank as 10.

Google Toolbar

On the basis of what we have discussed above (which is in no way an exhaustive explanation of Google PR), we can take up a small and simplified scenario to understand Google PageRank.

If you have 2 options for getting inbound links to your site, one a PR7 page with 3 out-bound links and the other a PR5 page with 2 out-bound links, which would you choose? The answer should be the PR5 page, as it will pass 5/2 = 2.5 PR while the PR7 page will pass only 7/3 ≈ 2.33 PR. This scenario is depicted in the diagram below.

The PageRank of webpage 3 is 4.8 and can be calculated by adding the total PR flowing to it from webpage 1 and webpage 2, which is:

PR of webpage 3 = (PR of webpage 1)/2 + (PR of webpage 2)/3 = 2.5 + 2.33 ≈ 4.8
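This arithmetic can be checked in a few lines of Python. This is a sketch of the simplified model only; real PageRank also involves damping and many other factors.

```python
def passed_pr(page_rank, outbound_links):
    # In the simplified model, a page splits its PR evenly
    # across all of its out-bound links.
    return page_rank / outbound_links

# Webpage 1 is the PR5 page with 2 out-bound links;
# webpage 2 is the PR7 page with 3 out-bound links.
pr_webpage3 = passed_pr(5, 2) + passed_pr(7, 3)
print(round(pr_webpage3, 1))  # → 4.8
```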

You can use the following rule of thumb to rate webpages based on their Google PR:

0 – Blacklisted websites, or new websites or webpages

1 – Very low PR (not much use getting linked from such pages)

2 – Low PR (not much use getting linked from such pages)

3 – Low-average PR (not a very good linking opportunity, but still go ahead with a link exchange)

4 – Average PR (most established sites fall in this category – exchange links)

5 – Good PR (This could be a good links exchange)

6 – Very good PR (Exchanging links with this site would be like finding a rare gem)

7 and above – An excellent opportunity to get linked. Almost never comes for free.

While referring to the above rule of thumb, please keep in mind that you should not pass up an opportunity to get linked from a low-PR website if you think it could improve in PR or has good future prospects.

Posted in SEO, Tips | 4 Comments »

How do you create a Robots.txt file?

Posted by selfsmo on September 29, 2009

A Robots.txt file can be created in a simple text editor such as Notepad. The basic format of a Robots.txt file is shown below:

User-agent: [the name of the robot goes here, e.g. Googlebot]

Disallow: [the name of the file or directory goes here; this instruction disallows the file/directory from being indexed]

Here are some examples:

  • Below is an example of a Robots.txt file that disallows all webpages from being indexed:

User-agent: *

Disallow: /

  • Below is an example of a Robots.txt file that allows all webpages to be indexed:

User-agent: *

Disallow:

  • Below is an example of a Robots.txt file that disallows the AltaVista robot, called “Scooter”, from accessing the admin directory and the personal directory:

User-agent: scooter

Disallow: /admin/

Disallow: /personal/

  • Below is an example of a Robots.txt file that instructs bots not to crawl any file ending in .pdf:

User-agent: *

Disallow: /*.pdf

The Robots.txt file can also have multiple sets of instructions for more than one bot. Each set of instructions should be separated by a blank line. There is only one Robots.txt file for a website.

  • Below is an example of a Robots.txt file that disallows Googlebot from crawling any dynamically generated pages (URLs containing a “?”) and allows the AltaVista Scooter bot to access every page:

User-agent: Googlebot

Disallow: /*?

User-agent: Scooter

Disallow:
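A rules file can be sanity-checked before uploading it. Python's standard-library urllib.robotparser module follows the original robots.txt convention (note that it does not understand wildcard patterns such as /*.pdf, which are a later extension), so the Scooter example above can be tested like this; the URLs are hypothetical:

```python
from urllib import robotparser

rules = """\
User-agent: scooter
Disallow: /admin/
Disallow: /personal/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Scooter is blocked from the two listed directories...
print(rp.can_fetch("scooter", "http://www.example.com/admin/index.html"))    # → False
# ...but may crawl everything else, as may any other bot.
print(rp.can_fetch("scooter", "http://www.example.com/products.html"))       # → True
print(rp.can_fetch("Googlebot", "http://www.example.com/admin/index.html"))  # → True
```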

Posted in SMO | 6 Comments »

Good practices for description meta tags

Posted by selfsmo on June 12, 2009

• Accurately summarize the page’s content – Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result.

Avoid:

  • Writing a description meta tag that has no relation to the content on the page
  • Using generic descriptions like “This is a webpage” or “Page about baseball cards”
  • Filling the description with only keywords
  • Copying and pasting the entire content of the document into the description meta tag

• Use unique descriptions for each page – Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (e.g. searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn’t feasible. In this case, you could automatically generate description meta tags based on each page’s content.

Avoid:

  • Using a single description meta tag across all of your site’s pages or a large group of pages
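Auto-generating descriptions, as suggested above, can be as simple as trimming a page's own text to snippet length. Below is a hypothetical sketch; the 155-character cap is a common rule of thumb for snippet length, not an official Google figure.

```python
import html

def make_description(page_text, max_len=155):
    """Trim a page's own text into a description meta tag.

    Illustrative helper only; the 155-character limit is a common
    rule of thumb, not an official Google number.
    """
    text = " ".join(page_text.split())            # collapse whitespace
    if len(text) > max_len:
        # cut at a word boundary, then mark the truncation
        text = text[:max_len].rsplit(" ", 1)[0] + "..."
    return '<meta name="description" content="{}">'.format(
        html.escape(text, quote=True))

tag = make_description(
    "Baseball cards from the 1950s: prices, grading tips, and a "
    "checklist of the most sought-after rookie cards.")
print(tag)
```

Each page would feed its own body text in, giving every page a unique, content-based description rather than one shared tag.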

Posted in SEO | Leave a Comment »

Why is Social Media a more effective method for online promotion?

Posted by selfsmo on March 28, 2009

Social media optimization (SMO) is a simple and creative idea for increasing website traffic, and it is more natural than other online advertising methods. It is simply a set of methods for generating publicity through social media, online communities and community websites. It is an alternative way to increase traffic, drawing on your “social potential” as well as high search engine rankings. Its techniques include adding RSS feeds, social news buttons and blogging, and incorporating third-party community functionality such as images and videos.

Here are some of the top social network sites from a marketing perspective:

1> http://www.twitter.com

2> http://www.flickr.com

3> http://www.facebook.com

4> http://www.myspace.com

5> http://www.bebo.com

6> http://www.technorati.com

7> http://www.digg.com

8> http://del.icio.us

9> http://www.perfspot.com

These are the most popular sites for doing SMO easily and getting good results.

Posted in SMO | 1 Comment »