On-Page SEO

What is On-Page SEO?

On-page SEO, also known as on-site SEO, is the practice of optimizing individual web pages to improve a website's search engine rankings. It helps drive organic (natural) traffic and can also increase conversion rates.

Just publishing a high-quality post is not enough; other factors also affect page ranking. On-page SEO involves optimizing your title tag, meta tags, URL, internal linking, alt text, keywords and more.

Also, make sure that your website is fast, reliable and trustworthy.

On-page SEO must go hand in hand with off-page SEO to get the best results.

Why is On-Page SEO important?

Today, the internet has become a necessity for almost everyone, on both computers and mobile devices. We must design user-friendly websites: post good-quality content and structure our tags (headings, meta tags, URLs) properly. If we don't, we can lose our top spot on SERPs (Search Engine Results Pages).

A positive user experience matters a lot. High-quality content builds the foundation of a profitable business, and on-page SEO is a vital part of content marketing.

If we want to survive and maintain goodwill in the online environment, we simply can't afford to ignore on-page SEO.

Top On-Page SEO checklist:

SEO URL Optimization

We have to make URLs simple and clear, so that both users and search engines can easily understand them.

  1. Avoid URLs made up of codes and ID numbers; use words that people can easily understand.
  2. URLs should be descriptive and succinct, so that users can guess what a page contains just by looking at its URL.
  3. Use hyphens to separate the words in a URL. Avoid underscores, spaces or any other special characters as separators.
  4. Use lowercase letters, because capital letters can create problems with duplicate pages. For example, vivekkblogs.com/about-us and vivekkblogs.com/About-us look like two different URLs, which can create duplicate-content problems.
  5. Avoid URL parameters where possible; they can create problems with tracking.

SEO Title Optimization

  1. Each page on your website should have a distinct title.
  2. Put your focus keyphrase in the title of every page.
  3. If you want to include your company name, place it at the end of the title.
  4. Your title should not exceed 60 characters.
  5. Don’t overdo the keywords: never repeat a keyword more than twice in your title.
  6. Remember to put your title tag at the beginning of the head section. This also makes it easier for search engines to identify the page.
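Putting the checklist together, a title tag might look like this (the page title and site name below are only placeholders based on the domain used elsewhere in this post):

```html
<head>
  <!-- Focus keyphrase first, site name at the end, under 60 characters -->
  <title>On-Page SEO Checklist | Vivekk Blogs</title>
</head>
```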

Meta Description Tag

The meta description is an HTML attribute that provides a concise summary of a web page. Search engines display the meta description under the title tag in search results.

  1. There is no fixed length for a meta description, but search engines truncate them at around 150–160 characters. We recommend 50–160 characters, and remember that the ideal length can vary with the situation.
  2. Write your meta descriptions clearly. This makes it easier for searchers to decide whether the content is useful and contains the information they are looking for.
  3. Create a unique meta description tag for every page; avoid duplicates.
  4. Never use double quotation marks, as search engines may cut the description off at the quote.
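Following these guidelines, a meta description tag might look like this (the wording is just an illustration):

```html
<!-- Unique, clearly written, 50–160 characters, no double quotation marks -->
<meta name="description" content="A practical on-page SEO checklist covering URLs, title tags, meta descriptions, internal links and more.">
```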

Meta Author Tag

This tag identifies the author or owner of the website. It is the same on all the web pages.
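In HTML, the tag looks like this (the name is a placeholder):

```html
<meta name="author" content="Vivek Kumar">
```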

Robot meta tags

Robots meta tags are also known as robots meta directives. By adding them to our pages, we instruct search engines how to crawl or index those pages.

There is a list of different parameters we can use in robots meta tags, each of which functions differently.

  • All: The default value, equivalent to index, follow.

Code sample: <meta name="robots" content="all">

  • Noindex: Instructs the search engine not to index the page.

Code sample: <meta name="robots" content="noindex">

  • Index: Instructs the search engine to index the page.

Code sample: <meta name="robots" content="index">

  • Follow: Instructs the search engine to follow all links on the page.

Code sample: <meta name="robots" content="follow">

  • Nofollow: Instructs the search engine not to follow links on the page.

Code sample: <meta name="robots" content="nofollow">

  • None: Instructs the search engine to neither index the page nor follow its links.

Code sample: <meta name="robots" content="noindex, nofollow">

Alt text attribute

Alt text is also known as alternative text. The alt attribute describes an image's content and function on the page.

Features of this attribute:

  1. Always use this attribute in the image tag, so that visually impaired people using screen readers can understand the image.
  2. If the image does not display on the page, the alt text is shown instead.
  3. This attribute also helps search engine crawlers to understand and index the image.
  4. Do not use more than 125 characters.
  5. Never stuff keywords into the alt text.
  6. Never use words like “image of” or “picture of” in your alt text.
  7. If you are using an intricate image that requires a long description, use the longdesc="" attribute.
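Following these rules, an image tag with alt text might look like this (the file name and wording are placeholders):

```html
<!-- Descriptive, under 125 characters, no keyword stuffing, no "image of" -->
<img src="on-page-seo-checklist.png" alt="On-page SEO checklist covering titles, URLs and meta tags">
```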

Internal linking Tactics

Internal links are hyperlinks that take the user from one page to another on the same website.

Code sample of internal link:

<a href="https://www.vivekkblogs.net" title="Keyword text">Keyword text</a>

Importance of Internal Linking in SEO

Internal links help search engines discover, index and understand all the new content on your website, which plays an important role in SEO. Used properly, internal linking can increase page rank and page authority.

In short, internal linking is crucial for any site that wants to rank well in search engines.

Content Optimization

We need to write content in a way that reaches as much of the target audience as possible.

To optimize the content, make sure you use your focus keyphrase in the title tag, meta tags and heading tags. You can also optimize images and headlines for user engagement and higher CTRs (click-through rates).

Write good quality content

Always remember that you have to produce high-quality content. It must be unique and relevant to your users. And if you are running a business website, write content that describes your products and services.

Bring new Content Regularly

Another way to optimize your content is to publish new content regularly. It helps you reach the top positions in SERPs more quickly, because search engine crawlers, like your readers, love fresh content.

Proper usage of headings

Make sure you are using headings and subheadings in your content, because search engines like Google pay special attention to this large, bold text.

Boost your text, videos and images

You can easily optimize your text by adding meta tags, title tags and heading tags. This makes it easier for search engines to notice your content.

Like text searchers, many users search for images and spend a lot of time looking for photos. Make sure you add images within your content. They should be clean and clear, so that they load and display easily. Use alt text in your images so that visually impaired people and search engines can understand them too.

Try to include videos within your content. Like text and images, videos grab the user's attention. If you don't want to upload videos, you can simply provide a link that redirects users to the video. This is also a way to drive sales.

The canonical issue in SEO

A canonical issue arises when search engines can access our website from multiple URLs. This means a search engine can index our site under several distinct URLs, so from its point of view the site has duplicate content.

Canonical tags help search engines identify which URL is the main one and should be indexed.

For example, a website's homepage can be loaded from all of the URLs below:

http://vivekkblogs.com
http://www.vivekkblogs.com
https://vivekkblogs.com
https://www.vivekkblogs.com

To a user, all of the above links are the same and display the homepage. But because the URLs differ, search engines treat them as four different pages, and this can be a problem for SEO.

How can we fix the canonical issue?

We can fix the canonical issue by any of the following ways:

  1. By using 301 redirects.
  2. By using the rel="canonical" tag.
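For instance, a rel="canonical" tag placed in the head section of each duplicate URL tells search engines which version should be indexed (the URL below is a placeholder):

```html
<link rel="canonical" href="https://www.vivekkblogs.com/">
```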


Sitemaps

A sitemap is a collection of your site's web pages. It helps search engines discover, crawl and index your entire website's content. Even if our internal linking is weak, sitemaps allow search engines to find the significant web pages.

Types of Sitemaps

There are four types of sitemaps:

Image Sitemap

It helps search engines find all the images on your website.

News Sitemap

It helps search engines (such as Google) discover content that has been approved for Google News.

Video Sitemap

It helps search engines find all the video content on your website.

General XML Sitemap

These are ordinary XML sitemaps that contain information about all the URLs on your website.
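A minimal general XML sitemap might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.vivekkblogs.com/on-page-seo/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```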

Importance of Sitemaps

Firstly, all the major search engines, such as Google and Yahoo, use your sitemap to find the pages on your website.

Sitemaps are not strictly necessary. If you have done proper internal linking, Google's crawlers can still discover most of the content on your site.

Still wondering why we need sitemaps if Google's crawlers discover almost all pages anyway?

Suppose you are running a website with more than 8 million pages. Even with proper internal linking and millions of external links, Google would need a lot of time to find all those pages.

That is why sitemaps are significant for SEO.


Favicon

You can say a favicon is just a small image or logo. Open any website in a browser and look at the browser tab: the image that appears there is the favicon. It is typically a 16×16 pixel icon.

It benefits visitors by helping them easily locate your page when they are working in multiple tabs. Favicons are very small in size.
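To add a favicon, place a link tag in the head section (the file path is a placeholder):

```html
<link rel="icon" href="/favicon.ico" sizes="16x16" type="image/x-icon">
```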

Robots.txt file

Robots.txt is a text file that tells search engine crawlers which pages they may or may not crawl. The biggest search engines, such as Google, Yahoo and Bing, respect robots.txt directives.

Basic Structure of a Robots.txt file of a WordPress website:

User-agent: *

Disallow: /wp-admin/

User agents of Search engines:

Every search engine identifies itself with a distinct user agent. There are many user agents, but the most popular are:

  • Google uses Googlebot.
  • For Google Images, it uses Googlebot-Image.
  • Yahoo uses Slurp.
  • Bing uses Bingbot.
  • Baidu uses Baiduspider.
  • DuckDuckGo uses DuckDuckBot.

301 and 302 redirects

A 301 redirect informs search engine crawlers that a particular page has been moved permanently.

Here, permanent means a year or more, or possibly forever.

When do we have to use 301 redirects?

Let's take an example. Suppose you are rebuilding your website on a new CMS with different URLs. You can create 301 redirects from the old URLs to point to the new content; this informs search engines that the content has been moved from one place to another.

Search engines not only redirect users to the new location but also pass ranking signals and goodwill to the new URL, because they treat it as the new home of that content.
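On an Apache server, for example, a 301 redirect can be set up with a single line in the .htaccess file (the paths are placeholders):

```apache
# Permanently redirect the old URL to its new location
Redirect 301 /old-page/ https://www.vivekkblogs.com/new-page/
```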

302 redirects

A 302 redirect informs search engine crawlers that a page or website has been moved temporarily, for a particular period.

When do we have to use 302 redirects?

We can redirect users from one page to another for a short period. For example, if I am modifying a website page and need to redirect users to a different location for a few days, I can use a 302 redirect.
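On an Apache server, a temporary redirect looks almost identical to a 301; only the status code changes (the paths are placeholders):

```apache
# Temporarily send visitors to another page while this one is being modified
Redirect 302 /services/ https://www.vivekkblogs.com/services-maintenance/
```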