...
For example, consider a URL of the form:
- http://www.blahblah.com/search?q=search_terms&page=1
Then the following parameters would be used: "pageChangeRegex": "(page=\d+)", "pageChangeReplace": "page=$1", "numResultsPerPage": 1.
And for a URL of the form:
- http://www.blahblahblah.com/search?q=search_terms&pagesize=20&start_result=0
Then the following parameters would be used: "pageChangeRegex": "(start_result=\d+)", "pageChangeReplace": "start_result=$1", "numResultsPerPage": 20.
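The interaction between these three parameters can be sketched as follows. This is a hypothetical helper (`next_page_url` is not part of the actual platform), written under the assumption that the offset substituted for "$1" is computed as the page index multiplied by "numResultsPerPage", and that "pageChangeRegex" locates the query parameter to rewrite:

```python
import re

def next_page_url(url, page_index, cfg):
    # Assumption: the value substituted for "$1" is the result offset
    # for the requested page, i.e. page_index * numResultsPerPage.
    # With numResultsPerPage=1 this yields simple page numbers (1, 2, 3...);
    # with numResultsPerPage=20 it yields result offsets (0, 20, 40...).
    offset = page_index * cfg["numResultsPerPage"]
    # Build the literal replacement string, then rewrite the matched
    # query parameter in place.
    replacement = cfg["pageChangeReplace"].replace("$1", str(offset))
    return re.sub(cfg["pageChangeRegex"], replacement, url)

# First example: page-numbered pagination.
cfg_pages = {"pageChangeRegex": r"(page=\d+)",
             "pageChangeReplace": "page=$1",
             "numResultsPerPage": 1}
print(next_page_url("http://www.blahblah.com/search?q=search_terms&page=1",
                    2, cfg_pages))

# Second example: offset-based pagination, 20 results per page.
cfg_offsets = {"pageChangeRegex": r"(start_result=\d+)",
               "pageChangeReplace": "start_result=$1",
               "numResultsPerPage": 20}
print(next_page_url("http://www.blahblahblah.com/search?q=search_terms&pagesize=20&start_result=0",
                    1, cfg_offsets))
```

Under these assumptions, the first call rewrites the URL to "page=2" and the second to "start_result=20".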
Finally, standard web-crawling measures such as custom user agents and per-page wait times are often needed. Because these may well differ between the search engine and the pages themselves, "searchConfig" has its own "waitTimeBetweenPages_ms" and "userAgent" fields (if not specified, these are inherited from the parent "rss" object).
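For illustration, a configuration fragment might look like the following (the field names are from this document; the surrounding structure and values are assumptions for the sake of the example):

```json
{
    "rss": {
        "waitTimeBetweenPages_ms": 10000,
        "userAgent": "Mozilla/5.0 (compatible; example-crawler)",
        "searchConfig": {
            "pageChangeRegex": "(page=\\d+)",
            "pageChangeReplace": "page=$1",
            "numResultsPerPage": 1,
            "waitTimeBetweenPages_ms": 2000,
            "userAgent": "example-search-client"
        }
    }
}
```

Here the search engine is queried with its own user agent and a 2-second delay, while the pages themselves are fetched with the parent "rss" settings.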