## Enable robots.txt rules for all crawlers
User-agent: *

## Don't crawl development files and folders
Disallow: /CVS
Disallow: /*.svn$
Disallow: /*.idea$
Disallow: /*.sql$
Disallow: /*.tgz$

## Don't crawl the Magento admin page
Disallow: /admin/

## Don't crawl common Magento folders
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /js/
Disallow: /lib/
Disallow: /media/
Disallow: /shell/
Disallow: /skin/
Disallow: /var/

## Don't crawl common Magento files
Disallow: /api.php
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /get.php
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /README.txt
Disallow: /RELEASE_NOTES.txt
Disallow: /STATUS.txt

## Don't crawl sub-category pages that are sorted or filtered
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*

## Don't crawl links with session IDs
Disallow: /*?SID=

## Don't crawl the checkout and user account pages
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/

## Don't crawl search pages and catalog links
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/