How to Create a Robots.txt File – Easy SEO Guide for Beginners

If you have a website or a blog and you want it to appear in Google search, you need to understand one small but powerful file: the robots.txt file.

This file tells search engines what they can and cannot see on your website. It sounds technical, but do not worry. In this guide you will learn how to create a robots.txt file in just 2 minutes.

You do not need coding skills or paid tools. Just follow these simple steps.


What is a Robots.txt File?

A robots.txt file is a plain text file that lives in the root folder of your website.

It gives search engines instructions about which pages or folders they should crawl and which ones they should not.

Search engines like Google, Bing, and Yahoo check this file before crawling your site.
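To give you a quick picture, here is a minimal sketch of the file (lines starting with # are comments, which crawlers ignore):

# Applies to every crawler
User-agent: *
# An empty Disallow value blocks nothing, so the whole site is crawlable
Disallow: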

Why is Robots.txt Important for SEO?

The robots.txt file helps improve your site's SEO by:

  • Controlling how search engines crawl your site

  • Blocking unnecessary or duplicate pages

  • Saving crawl budget

  • Protecting sensitive folders or pages

  • Helping search engines focus on your important content

If your site has no robots.txt file, search engines may crawl everything. That includes pages you do not want to show, like admin pages or private content.
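For example, a site that wants to save crawl budget might block its admin area and its internal search results. The paths below are placeholders; use the ones that actually exist on your site:

User-agent: *
# Private back-end pages
Disallow: /admin/
# Internal search result pages often create duplicate content
Disallow: /search/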

How to Create a Robots.txt File in 2 Minutes

Follow these steps to create a robots.txt file quickly:

Step 1: Open a Text Editor

You can use any plain text editor, like Notepad or TextEdit (in TextEdit, switch to plain text mode first). Do not use Word or other rich text editors, because they add hidden formatting that breaks the file.

Step 2: Add Basic Rules

Here is a simple example you can use:

User-agent: *
Disallow:
Sitemap: https://yourwebsite.com/sitemap.xml

This file tells all search engines that they can crawl every page. If you want to block something, you can add rules like this:

User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://yourwebsite.com/sitemap.xml

Step 3: Save the File

Save the file as robots.txt. Make sure the name is in all lowercase letters and ends with the .txt extension.

Step 4: Upload to Your Website

If you are using WordPress, you can upload it with a file manager plugin. If you use Blogger, you can copy the content and paste it into the custom robots.txt settings.

How to Create a Robots.txt File in Blogger

Blogger makes it easy to add a custom robots.txt file:

  1. Go to your Blogger Dashboard

  2. Click Settings

  3. Scroll to Crawlers and indexing

  4. Turn on Enable custom robots.txt

  5. Click Custom robots.txt

  6. Paste your robots.txt content (a sample you can adapt appears after these steps)

  7. Click Save
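As a starting point, here is the kind of custom robots.txt many Blogger guides recommend. Treat it as a sketch: the blogspot address is a placeholder for your own blog URL, and the Mediapartners-Google section only matters if you show AdSense ads:

# AdSense crawler (keep this section only if you use AdSense)
User-agent: Mediapartners-Google
Disallow:

# All other crawlers: skip search and label result pages
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml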

How to Create a Robots.txt File in WordPress

In WordPress, you can use a plugin or manage the file manually.

Method 1: Using a Plugin

  1. Install and activate the Rank Math or Yoast SEO plugin

  2. Open the plugin's Tools section

  3. Click File Editor

  4. You will see the robots.txt file

  5. Add your rules and save

Method 2: Manual Upload

  1. Use a File Manager plugin or cPanel

  2. Go to the public_html folder

  3. Upload your robots.txt file

  4. Make sure it is at the root level of your site
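When the upload is correct, the file sits directly inside public_html and resolves at the root of your domain (yourwebsite.com is a placeholder):

public_html/robots.txt  ->  https://yourwebsite.com/robots.txt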

How to Check if Your Robots.txt File Works

  1. Go to your property in Google Search Console

  2. Click Settings

  3. Scroll to the robots.txt report under Crawling

  4. Open the report to see the robots.txt file Google has found

  5. Check for errors or warnings

Or simply open this URL in your browser:

https://yourwebsite.com/robots.txt

Replace yourwebsite.com with your actual domain.
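If the file is in place, the browser simply displays its plain-text contents, for example the sample from Step 2:

User-agent: *
Disallow:
Sitemap: https://yourwebsite.com/sitemap.xml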

What is Disallow in Robots.txt?

Disallow tells search engines not to crawl the given path.

For example:

Disallow: /admin/

This tells Google not to crawl the admin folder.
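Major crawlers such as Googlebot and Bingbot also understand simple wildcard patterns, so you can block a URL pattern instead of a single folder. Treat this as a widely supported extension rather than something every crawler honors:

User-agent: *
# Block every URL ending in .pdf ($ anchors the match to the end of the URL)
Disallow: /*.pdf$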

What is Allow in Robots.txt?

Allow lets search engines crawl a path even if it is inside a disallowed folder.

Example:

Disallow: /private/
Allow: /private/page.html

This means the private folder is blocked, but that one page inside it stays crawlable. When rules conflict like this, Google follows the most specific (longest) matching rule, so the Allow wins here.

Sample Robots.txt File for SEO

Here is a ready-to-use file for SEO-friendly sites:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourwebsite.com/sitemap.xml

This works well for WordPress sites and blogs. The Allow line keeps admin-ajax.php reachable because many themes and plugins use it for front-end features. Change the paths to match your site structure.

Best Free Tools to Generate a Robots.txt File

If you want to save time, try these tools:

1. SEO Tools Robots.txt Generator

Free and fast, with custom settings. Perfect for bloggers and WordPress users.

2. SEO Site Checkup Robots.txt Tool

Use this tool to check whether your robots.txt file is working correctly.

3. Google Search Console Robots.txt Report

Best for checking whether Google can read and apply your rules.

Robots.txt vs Meta Robots Tag

Many people confuse these two. Here is the difference:

  • Robots.txt blocks crawling before search engines even visit the page

  • The meta robots tag is a snippet such as <meta name="robots" content="noindex"> placed inside a page's HTML to control indexing or link following

Use robots.txt to stop crawling. Use the meta robots tag to stop indexing. Keep in mind that a page blocked by robots.txt can still appear in search results if other sites link to it, because Google never sees a noindex tag on a page it cannot crawl.

Internal Linking Tips for SEO

You can use internal links to help search engines crawl your content better.
For example, if you have a post about SEO Meta Tags, add a link to it from your Sitemap Generator page. This shows how your tools work together.

External Linking for Extra Value

Add links to trusted sources, such as official search engine documentation on robots.txt, so readers can verify the details for themselves.

Final Thoughts

Creating a robots.txt file is very simple. It takes just 2 minutes and helps search engines understand your website better. Whether you use Blogger, WordPress, or any other platform:

  • Keep it simple
  • Test it with the tools above
  • Update it when your site structure changes

FAQs About Robots.txt

Q1. Can I block search engines completely using robots.txt?
Yes, you can use Disallow: / to block everything, but this is not recommended if you want your site indexed.
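For reference, this is the pattern that blocks all crawlers from the entire site:

User-agent: *
# A bare slash matches every URL on the site
Disallow: /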

Q2. What happens if I do not use a robots.txt file?
Search engines will crawl everything unless it is blocked by other methods, such as a meta robots tag.

Q3. Can I use a robots.txt file in Blogger?
Yes, you can add your own robots.txt rules in Blogger from the settings, as shown above.

Q4. Do I need a robots.txt file for a new website?
Yes. Even a basic file is better than nothing, because it helps search engines crawl your site properly.
