
06/11/2024
Here’s how to set up a robots.txt file for your website.
Key Directives in robots.txt
• 🤖 User-agent: Specifies which bot the rules apply to (e.g., User-agent: Googlebot or User-agent: * for all bots).
• 🚫 Disallow: Blocks bots from accessing certain files or directories (e.g., Disallow: /private/).
• ✅ Allow: Grants access to specific files or folders within an otherwise disallowed directory (e.g., Allow: /public/file.html). Supported by Google and most major crawlers.
• 🗺️ Sitemap: Points bots to your XML sitemap, helping them navigate your site’s structure (e.g., Sitemap: https://example.com/sitemap.xml).
Example of a Basic robots.txt
User-agent: *
Disallow: /private/
Allow: /public/
Sitemap: https://example.com/sitemap.xml
Save this as robots.txt in your site’s root directory (e.g., https://example.com/robots.txt), since crawlers only look for it there. It tells search engines what they may and may not crawl on your website! 📈
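You can also write rules for one specific bot, like the Googlebot example above. A quick sketch (the /drafts/ path is just an illustration):

User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow:

An empty Disallow line means that group of bots may crawl everything.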
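Before going live, you can sanity-check your rules. Here’s a minimal sketch using Python’s built-in urllib.robotparser (the example.com URLs are placeholders):

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder: your site's robots.txt URL
parser.read()  # fetches and parses the live file

# can_fetch(user_agent, url) tells you whether that bot may crawl the URL
print(parser.can_fetch("*", "https://example.com/public/file.html"))     # expected: True
print(parser.can_fetch("*", "https://example.com/private/secret.html"))  # expected: False

If the second check prints True, your Disallow rule probably isn’t being served where crawlers expect it.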