Bots and Spiders Best Practices
Answer ID 8532   |   Last Review Date 04/14/2019

What are some best practices relating to bots and spider activity?

Environment:

Bots, Spiders  
Oracle Service Cloud

Resolution:

What does Oracle do to prevent bot activity, and what can and should you do?

  • Cloud Operations has installed a robots.txt file on each site to minimize the issues that spiders present.
  • When known spiders access your site, the activity is not counted against your billable sessions.
    • More information about this is available here: Large number of spiders in the Internet Spider Activity Report
      • Of note from this answer:
Because web spiders are continually proliferating, there may be times when a new, unknown spider indexes your site and causes a spike in your answer views and/or sessions. When you discover new spiders hitting your site, you can add them to the configuration setting:

        SEC_SPIDER_USER_AGENT
Defines custom User Agents that are known web spiders. Valid entries are a comma-separated list of User Agent sub-strings (for example, MyCustomSpider or InternalGoogle). Values for this setting are case sensitive. To remain in compliance with the major web search providers, browsers supplying one of these User Agents will not experience different behavior on end-user pages, but they will be counted as web spiders for statistics collection. Default is blank.

  • You can edit the SEC_INVALID_ENDUSER_HOSTS configuration setting to explicitly list hosts that are not allowed to access the end-user interface.
  • Monitor your site activity.
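
The matching rule described for SEC_SPIDER_USER_AGENT can be sketched as follows. This is an illustrative sketch only, not Oracle's implementation: the setting holds a comma-separated list of case-sensitive sub-strings, and a request counts as spider traffic when any sub-string appears in its User-Agent header. The example setting value and function name are assumptions for illustration.

```python
# Illustrative sketch (not Oracle's implementation) of the matching rule
# described for SEC_SPIDER_USER_AGENT: a comma-separated list of
# case-sensitive User-Agent sub-strings.

SEC_SPIDER_USER_AGENT = "MyCustomSpider,InternalGoogle"  # example value

def is_known_spider(user_agent: str, setting: str = SEC_SPIDER_USER_AGENT) -> bool:
    """Return True if any configured sub-string occurs in the User-Agent."""
    patterns = [p.strip() for p in setting.split(",") if p.strip()]
    # Plain `in` is a case-sensitive substring test, matching the
    # "Values for this setting are case sensitive" note above.
    return any(p in user_agent for p in patterns)

print(is_known_spider("Mozilla/5.0 (compatible; MyCustomSpider/1.0)"))  # True
print(is_known_spider("Mozilla/5.0 (compatible; mycustomspider/1.0)"))  # False: case differs
```

Requests flagged this way would be attributed to spider statistics rather than billable sessions, while the pages themselves behave normally.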


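The exact contents of the robots.txt file that Cloud Operations installs are not shown in this answer. Purely as a generic illustration of how such a file limits spider activity (the paths and delay below are hypothetical, not Oracle's values):

```
# Hypothetical example only -- not the file Cloud Operations installs.
User-agent: *          # applies to all spiders
Disallow: /private/    # hypothetical path: ask spiders not to index it
Crawl-delay: 10        # ask compliant spiders to wait between requests
```

Note that robots.txt is advisory: well-behaved spiders honor it, which is why the additional measures above (known-spider filtering, SEC_SPIDER_USER_AGENT, SEC_INVALID_ENDUSER_HOSTS, and monitoring) still matter.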
For additional information, please see Session usage information (includes Demystifying Session Usage.pdf).
