
Optimization Of Robots.txt File In Magento Marketplace.

Do you want to optimize your Magento or Magento2 robots.txt file?

Not sure how and why the robots.txt file is important for your SEO? We have the answer. In this article, we will show you how to optimize the robots.txt file of your Magento or Magento 2 store for SEO and help you understand its significance.

The robots exclusion standard, also known as the robots exclusion protocol or simply the robots.txt file, plays an important role in your site's overall SEO performance. The robots.txt file allows you to communicate with search engines and let them know which parts of your site they should index and which areas of the website should not be processed or scanned.

The robots.txt file is one of the primary ways of telling a search engine where it can and cannot go on your website. It is a text file placed in the root directory of a website. The absence of a robots.txt file will not stop search engines from crawling and indexing your website; however, it is highly recommended that you create one. It looks basic, but improper use of the robots.txt file can seriously hurt your website's ranking, while proper use protects the website from duplicate-content issues. Here are the main uses of a robots.txt file for your website:

-If you don’t want search engines to index your internal search results pages.
-If you want search engines to ignore any duplicate pages on your website.
-If you don’t want search engines to index certain files on your website (images, PDFs, etc.)
-The robots.txt file controls how search engine spiders see and interact with your webpages.
-This file, and the bots that interact with it, are fundamental parts of how search engines work.
-If you don’t want search engines to index certain areas of your website or a whole website.
-If you want to tell search engines where your sitemap is located.
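For instance, several of the cases above can be handled with a few lines in one file (the paths and sitemap URL below are illustrative examples, not required Magento paths; also note that wildcard patterns such as /*.pdf$ are supported by major crawlers like Googlebot but are not part of the original robots exclusion standard):

User-agent: *
Disallow: /catalogsearch/
Disallow: /*.pdf$
Sitemap: https://www.example.com/sitemap.xml

Here the first Disallow keeps crawlers out of internal search result pages, the second blocks PDF files, and the Sitemap line tells search engines where your sitemap is located.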

Creating a robots.txt file:
In order to create a robots.txt file, you need access to the root of your domain. The simplest robots.txt file uses two keywords, User-agent and Disallow. User-agents are search engine robots (or web crawler software); Disallow is a command that tells the user-agent not to access a particular URL. On the other hand, to give Google access to a particular URL that is a child directory inside a disallowed parent directory, you can use a third keyword, Allow.

Optimization Of Robots.txt File In Magento and Magento2

User-agent: [the name of the robot the following rule applies to]
Disallow: [the URL path you want to block]
Allow: [the URL path of a sub-directory, within a blocked parent directory, that you want to unblock]

The User-agent and Disallow lines together form a single entry in the file, where the Disallow rule only applies to the user-agent(s) specified above it. You can include as many entries as you want, and a single entry can contain multiple Disallow lines and apply to multiple user-agents. You can make a rule apply to all web crawlers by using an asterisk (*) as the User-agent, as in the example below:
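For example, one entry can apply a set of rules to a specific crawler and unblock a single sub-directory within a blocked one (the paths below are illustrative):

User-agent: Googlebot
Disallow: /private/
Allow: /private/downloads/

This entry tells Googlebot to stay out of /private/ except for the /private/downloads/ sub-directory, while leaving all other crawlers unaffected.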

User-agent: *
Disallow: /

The above entry prevents all search engines from indexing any pages or files on the website. Usually, however, you only want to keep search engines out of specific areas, such as the folder that contains your administrative control panel. Private pages such as the login page, checkout page, cart page, server settings, and pages containing private information (account numbers, PINs, photographs, etc.) should be disallowed.
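A sketch of such an entry for a Magento store might look like the following (the exact paths vary by installation, and Magento 2 typically uses a custom admin URL, so treat these as illustrative examples rather than required Magento paths):

User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /customer/account/
Disallow: /wishlist/

Each Disallow line blocks one private or duplicate-prone area while leaving the catalog and CMS pages open to crawlers.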

We highly recommend that if you do not have a robots.txt file on your site, you create one immediately!
