Customise the Robots.txt File on a ServiceNow Instance

Summary
The OOB content of https://instancename.service-now.com/robots.txt on any ServiceNow instance is:

User-agent: *
Disallow: /

Some customers need to customise this so that indexing software which respects robots.txt (e.g. SortSite) can index public pages on the instance.

Release
Madrid and newer

Instructions
1. Request the 'Google custom search integration' plugin using the steps in the documentation: https://docs.servicenow.com/csh?topicname=t_ActivaGoogCustmSrchIntegr.html&version=latest
2. Once the plugin is installed, go to Custom Search Integration > Robots.txt Definitions. At this point there will be no 'Robots file' (Robots.txt Definitions) records, and robots.txt will be blank.
3. Add a 'Robots file' record with the required content and Active ticked, for example:

User-agent: SomeSiteIndexingSoftware
Allow: /

4. Verify the results at https://instancename.service-now.com/robots.txt

Related Links
Some customers do not want to use the 'Google custom search integration' plugin for search; they just need the ability to customise the robots.txt file. These customers can apply a special Update Set that removes all of the 'Google custom search integration' plugin except the part, 'Robots.txt Definitions', that allows you to customise robots.txt. This Update Set is available on KB0692532, but that article is not visible to the public because the Update Set has not been tested or verified by development and should be used with caution by customers. To obtain this Update Set, raise a Case in HI with ServiceNow Support requesting the Update Set in KB0692532. Always test carefully on a sub-production instance.

Custom URL
A different robots.txt can be created for each Custom URL that exists on the instance, provided that:
- An active Custom URL record exists for the particular hostname on the instance.
- The system property com.glide.generate.robots.based.onhost (type True/False) is created with value "true".
- An additional robots.txt record is created for the particular hostname at: https://instance-url/nav_to.do?uri=%2Frobots_txt_list.do

Format of a record in robots_txt_list.do:
- Active: indicates whether the record is active.
- Hostname: the hostname to which the robots.txt will apply.
- Text: the actual indexing directives served to search engines when they request https://<host-name>/robots.txt

Important: once com.glide.generate.robots.based.onhost is created with value "true", every record in robots_txt_list.do must have a hostname set. While the property is active, any record that does NOT have a hostname set will return an empty response when https://<host-name>/robots.txt is opened.
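The hostname-matching behaviour described above can be sketched as follows. This is an illustrative model only, not ServiceNow source code: the record fields (Active, Hostname, Text) come from robots_txt_list.do, but the function name and data structures are hypothetical.

```python
# Illustrative sketch (NOT ServiceNow source code) of how per-hostname
# robots.txt records behave once com.glide.generate.robots.based.onhost
# is set to "true", based on the behaviour described in this article.
from dataclasses import dataclass


@dataclass
class RobotsRecord:
    active: bool
    hostname: str  # empty string models a record with no hostname set
    text: str


def serve_robots_txt(host: str, records: list[RobotsRecord],
                     property_enabled: bool) -> str:
    """Return the robots.txt body served for a request to `host`."""
    if not property_enabled:
        # Property absent/false: the first active record applies to any host.
        for r in records:
            if r.active:
                return r.text
        return ""
    # Property true: only an active record whose hostname matches the
    # request host is served; records with no hostname yield an empty body.
    for r in records:
        if r.active and r.hostname == host:
            return r.text
    return ""


records = [
    RobotsRecord(True, "portal.example.com", "User-agent: *\nAllow: /"),
    RobotsRecord(True, "", "User-agent: *\nDisallow: /"),  # no hostname set
]

# Matching hostname: the record's text is served.
print(serve_robots_txt("portal.example.com", records, True))
# No matching hostname (and the hostname-less record is ignored): empty body.
print(repr(serve_robots_txt("other.example.com", records, True)))
```

The second call illustrates the "Important" note: with the property active, a record without a hostname is never served, so requests for an unmatched host receive an empty robots.txt.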