{"section":"tutorials","requestedLocale":"en","requestedSlug":"google-search-console-tracking-robots-txt","locale":"en","slug":"google-search-console-tracking-robots-txt","path":"docs/en/tutorials/projects-and-integrations/google-search-console-tracking-robots-txt.md","branch":"main","content":"robots.txt is a text file that tells search engine crawlers which parts of a site they may scan.\n\nVTEX has a native interface for editing and customizing the robots.txt file.\n\nGo to: Store Settings > Storefront > Settings > SEO > Robots.txt\n\nFor a better understanding of the content, see below a detailed description of the basic directives:\n\n- **Allow:** permits the search engine crawler to browse and index the given address.\n- **Disallow:** blocks the crawler from accessing the given address.\n\nTo validate the content, you must:\n\n- Check whether the URLs listed really need a rule in the robots.txt file;\n- Check whether the rules were correctly applied to the intended URLs;\n- Verify that the sitemap.xml file was listed correctly.\n\nYou can also edit your store's robots.txt file at: `[accountname].vtexcommercestable.com.br/admin/Site/ConfigSEOContents.aspx`.\n\nFor a proper Search Console setup, the next steps are to check the store's Sitemap; present your store's structure to facilitate crawler browsing; and accelerate the indexing of pages."}