Summary
Release
Madrid and newer
Instructions
- Request the 'Google custom search integration' plugin using the steps in the documentation: https://docs.servicenow.com/csh?topicname=t_ActivaGoogCustmSrchIntegr.html&version=latest
- Once the plugin is installed, go to Custom Search Integration > Robots.txt Definitions. At this point there will be no 'Robots file' (Robots.txt Definitions) records, and the instance's robots.txt will be blank
- Add a 'Robots file' record with the required content and Active ticked, for example:
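A minimal sketch of what the 'Robots file' record content might contain (the paths below are illustrative only; substitute the rules appropriate for your instance):

```
User-agent: *
Disallow: /nav_to.do
Disallow: /login.do
Allow: /
```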
- Verify the results at https://instancename.service-now.com/robots.txt
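The verification step above can also be scripted. The sketch below simply fetches the file over HTTPS; the hostname shown is a placeholder, not a real instance:

```python
from urllib.request import urlopen


def robots_url(hostname: str) -> str:
    """Build the robots.txt URL for a ServiceNow instance hostname."""
    return f"https://{hostname}/robots.txt"


def fetch_robots(hostname: str) -> str:
    """Fetch and return the robots.txt body; raises on HTTP errors."""
    with urlopen(robots_url(hostname)) as resp:
        return resp.read().decode("utf-8")


if __name__ == "__main__":
    # Placeholder hostname -- substitute your own instance name.
    print(fetch_robots("instancename.service-now.com"))
```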
Related Links
Some customers do not want to use the 'Google custom search integration' plugin for search; they only need the ability to customise the robots.txt file. These customers can apply a special Update Set that removes all of the 'Google custom search integration' plugin components except 'Robots.txt Definitions', the part that allows you to customise robots.txt.
This Update Set is available on KB0692532, but that article is not visible to the public because the Update Set has not been tested or verified by development, so it should be used with caution. To obtain it, raise a Case in HI with ServiceNow Support requesting the Update Set in KB0692532. Always test carefully on a sub-production instance first.
Custom URL:
A different robots.txt can be served for each Custom URL configured on the instance, provided that:
- An active Custom URL record exists for the particular hostname on the instance.
- The system property com.glide.generate.robots.based.onhost (type true/false) is created in system properties with the value "true".
- An additional robots.txt record is created for the particular hostname in: https://instance-url/nav_to.do?uri=%2Frobots_txt_list.do
Format for a record in: robots_txt_list.do
- Active (indicates if the record is active)
- Hostname: the hostname to which the robot.txt will apply.
- Text: the actual indexing content served to search engines when they request https://<host-name>/robots.txt
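Putting the format above together, a hypothetical robots_txt_list record for a custom URL might look like the following (the hostname and rules are examples only):

```
Active:   true
Hostname: portal.example.com
Text:     User-agent: *
          Disallow: /
```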
Important: once com.glide.generate.robots.based.onhost (type true/false) is created with the value "true", every record in robots_txt_list.do must have a hostname set. While the property is active, any record that does NOT have a hostname will cause https://<host-name>/robots.txt to return empty.