Hi,
I got a complaint from my internet provider that one site had received 150,000 requests. This had filled up some memory on their server, with the result that the site was automatically shut down. I have read on the forum that one can slow down the spidering, but will that help? The hits per second don't seem to have been the problem, rather the total number of hits. I run from one PC over a fairly slow broadband connection, and I used two threads, the minimum delay, and "Reload all files" (I realize that this was not a good choice).
What can be done to avoid this problem? Is it enough to slow down the spidering?
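Just to make the distinction clear, here is a rough, purely illustrative Python sketch of the two levers being discussed: the request rate (a delay between requests) and the total volume (a hard cap on pages fetched from one site). This is not how the program itself works, and the names and limits below are made up for the example; it only shows that slowing down and capping the total are two different settings.

```python
import time
import urllib.robotparser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from html.parser import HTMLParser

# Hypothetical limits -- both matter: the rate (delay between requests)
# and the total volume (cap on pages fetched from one site).
REQUEST_DELAY_SECONDS = 2.0   # pause between requests to one host
MAX_PAGES_PER_SITE = 5000     # hard cap on total requests to one site

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def polite_crawl(start_url):
    """Breadth-first crawl of a single site, throttled and capped."""
    parts = urlparse(start_url)
    host = parts.netloc

    # Respect robots.txt if the site publishes one.
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(f"{parts.scheme}://{host}/robots.txt")
    try:
        robots.read()
    except OSError:
        pass

    queue, seen, fetched = [start_url], {start_url}, 0
    while queue and fetched < MAX_PAGES_PER_SITE:
        url = queue.pop(0)
        if not robots.can_fetch("*", url):
            continue
        try:
            with urlopen(url) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue
        fetched += 1

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same host and never re-request a known page.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

        time.sleep(REQUEST_DELAY_SECONDS)  # rate limit: one request every N seconds

    return fetched
```

In other words, slowing down (the delay) eases the hits per second, while the page cap and the "never re-request a known page" check are what keep the total number of requests down, which seems to be what caused the complaint here.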