This article is not applicable to Cloud versions.
If the application is installed outside a firewall, web crawlers can discover and index your end user website, exposing your published content to public search. To discourage web crawlers from indexing your content, the server installation includes a robots.txt file, stored in the [Install Location]\Collaboration\WWW folder.
Before indexing a site, a web crawler requests the robots.txt file. When the crawler finds this file, compliant crawlers will skip the content within your end user website, helping keep your published content out of public search results. Note that robots.txt is advisory, not a security control: well-behaved crawlers honor it, but it does not prevent access to your site. If you want your content to be searchable via an internet search engine, edit the robots.txt file in the [Install Location]\Collaboration\WWW folder. For more information on robots.txt, refer to http://www.robotstxt.org.
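The exact contents of the bundled file may differ by version, but a robots.txt that blocks all compliant crawlers from the entire site typically looks like this:

```
User-agent: *
Disallow: /
```

To allow search engines to index the site instead, change the rule to an empty value (`Disallow:`), which permits crawling of everything, or restrict only specific paths, for example `Disallow: /admin/`.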