Why was there a spike in the Service Summary report over the course of a few days?
Analytics, Web Indexing
Oracle Service Cloud
Generally, dramatic spikes in traffic data are caused by your site being accessed by a spider. Spiders, also known as robots, are automated crawling utilities that come in many varieties, from search engine indexers to email address harvester scripts.
For details on how to observe site activity, see Answer ID 9693: How to understand and investigate site activity.
To minimize the issues that spiders present, a robots.txt file is installed on each interface.
For more information on robots, refer to Answer ID 1669: Allowing other search engines to index the Oracle Service Cloud application.
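As a point of reference, a robots.txt file follows the standard Robots Exclusion Protocol. The directives below are purely illustrative; the actual file installed on your interface may differ, and the paths shown are hypothetical:

```
# Illustrative robots.txt only -- not the exact rules Oracle installs.
# Applies the rules to all spiders that honor the protocol.
User-agent: *
# Hypothetical path a site might exclude from crawling.
Disallow: /example-admin/
```

Well-behaved spiders request this file before crawling and skip any paths it disallows, which is what keeps their activity out of your end-user traffic statistics.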
Utilizing the robots.txt file is a good solution: along with keeping your traffic statistics accurate, it also optimizes the end-user pages for spiders. Unfortunately, not all web spiders obey robots.txt rules; some may bypass the file and still appear in your stats. In these cases, you will want to include the user agent of the spider in the SEC_SPIDER_USER_AGENT configuration setting.
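To find out which user agents to add to that setting, you can scan your web server access logs for known spider signatures. The sketch below is a minimal example, assuming logs in the common "combined" format where the user agent is the last quoted field; the spider name patterns listed are illustrative, not an official list:

```python
import re

# Hypothetical substrings that identify spiders; populate this from your
# own logs or the spider vendor's documentation.
SPIDER_PATTERNS = ["Googlebot", "AhrefsBot", "SemrushBot"]

def spider_user_agents(log_lines):
    """Return the set of user-agent strings in combined-format log lines
    that match a known spider pattern."""
    found = set()
    for line in log_lines:
        # In combined log format the user agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        if not quoted:
            continue
        user_agent = quoted[-1]
        if any(pattern in user_agent for pattern in SPIDER_PATTERNS):
            found.add(user_agent)
    return found
```

Any user agents this surfaces that are bypassing robots.txt are candidates for the SEC_SPIDER_USER_AGENT configuration setting.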
For more information on this setting, please see the following answer: Answer ID 4280: Spider Bots.
If you have questions about what generates a session and how you can prevent inaccurate session billing on your site, please review Demystifying Session Usage (PDF). Some simple missteps in customization and configuration can increase billable sessions. For more information, see Session usage information.