On-site optimization. Get more visitors from Google

Hey, dear visitor, you’re probably wondering how to get more search engine traffic for your website. On-site optimization is really not as hard as getting more links, but its effectiveness is simply incredible.

On-site optimization refers to the process of improving various elements of a website to make it more search engine friendly and to enhance the user experience. This includes optimizing the website’s content, structure, and code to make it easier for search engines to understand and crawl the site. On-site optimization also involves improving the website’s speed, mobile responsiveness, and accessibility. By focusing on on-site optimization, businesses can improve their search engine rankings, attract more organic traffic, and provide a better user experience, leading to increased conversions and sales.

Titles and Descriptions

Titles are vital. Make sure every page on your site/blog has a relevant, keyword-rich, and unique title.

So if the page is about, say, real estate in Arizona, make sure the title looks like “Real Estate in Arizona. Best Offers on Real Estate”. Notice that the “real estate” keyword appears twice in the title; this is a very nice technique for optimizing your titles.

Make descriptions informative and attention-catching; keywords don’t really matter here. Try to fit a short overview of the page’s content into the description.
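Putting the two together, the head of the Arizona real-estate page above might look something like this (the description text is made up for illustration):

```html
<head>
  <!-- Unique, keyword-rich title; the main keyword appears twice -->
  <title>Real Estate in Arizona. Best Offers on Real Estate</title>
  <!-- Informative, attention-catching summary of the page's content -->
  <meta name="description"
        content="Browse current Arizona property listings, compare prices, and contact local agents.">
</head>
```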


URLs

If your URL looks like “http://bestsiteintheworld/sdkakfsjfk213kfjaksjf”, it won’t do you any good.

  • Try to make URLs human-understandable and keyword-rich.
  • Also keep in mind that search engines ignore words like “to”, “from”, “and”, etc. in links.
  • Don’t make URLs too long; Google hates that.
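The points above can be sketched as a small slug generator. This is just an illustration, not any particular CMS’s implementation, and the stop-word list is a short, hypothetical one:

```python
import re

# A short, hypothetical list of stop words that search
# engines typically ignore in URLs.
STOP_WORDS = {"a", "an", "and", "the", "to", "from", "of", "in", "on"}

def slugify(title):
    """Turn a page title into a short, keyword-rich, human-readable URL slug."""
    # Lowercase and strip everything but letters, digits, spaces, and hyphens.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    # Drop the stop words; what remains are the keywords.
    keywords = [w for w in words if w not in STOP_WORDS]
    return "-".join(keywords)

print(slugify("Real Estate in Arizona. Best Offers on Real Estate"))
# real-estate-arizona-best-offers-real-estate
```

A URL built this way tells both the visitor and the search engine what the page is about before it even loads.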

Cross Linking

Make sure that the pages of your site are cross-linked wherever possible, with good keyword-rich anchor text.

Make search engine bots’ lives easier: cross-link your content so that when a bot hits the front page, it can index every page of the site. Say you have a news site; just add two links to the bottom of every article: “Previous: Name of article” and “Next: Name of article”. Breadcrumbs are a good thing to have too.
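For that news-site example, the bottom of each article template might contain something like this (the article names and URLs are placeholders):

```html
<!-- Breadcrumb trail back to the section and the front page -->
<p>
  <a href="/">Home</a> &gt; <a href="/news/">News</a> &gt; Name of article
</p>

<!-- Previous/next links so a crawling bot can walk the whole archive -->
<p>
  <a href="/news/previous-article">Previous: Name of article</a> |
  <a href="/news/next-article">Next: Name of article</a>
</p>
```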

Avoid duplicate content

There’s a real nasty thing at Google called the “supplemental index”, where they put web pages that aren’t going to rank for anything important. “I have unique content on my site, why would I worry about something like that?” you might say. Wrong: there are features that are convenient for users, such as tags and archives, which duplicate the content right on your site. Say you have an “I am Awesome” article on your blog. You tag it with the pure awesomeness tag, put it in 3 categories, and it goes to the archives. Oh my, that’s a hell of a lot of duplicates for Google’s bot. So the page goes to the supplemental index and isn’t ranked the way you wanted it to be.

Make sure you deny search engine bots access to pages like these and you’ll do fine. It can be done either by adding the “nofollow” attribute to the links pointing at them, or by disallowing those pages in the robots.txt file (this way is easier and better).
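If you go the link-attribute route, a nofollow link looks like this (the URL and anchor text are placeholders):

```html
<a href="/tag/pure-awesomeness" rel="nofollow">pure awesomeness</a>
```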

Create a robots.txt file and put it in the root folder of your site/blog. Read this robots.txt faq for more information on how to disallow search engine bots from indexing content you don’t want them to index.
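A minimal robots.txt along these lines might look like the following; the /tag/ and /archives/ paths are hypothetical, so adjust them to your own site’s structure:

```
User-agent: *
Disallow: /tag/
Disallow: /archives/
```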
