Hi
I ran the v5 indexer last week on a large website, and since then we have been suffering a major problem with connections to the 'backend' MySQL database server. The website apparently started requesting over 50 connections at any one time, and the server failed - and is still failing.
The server people, not unreasonably, point at the Zoom indexing as the activity that coincides with the database problems.
My understanding of the Zoom software is that it makes a single sweep of the accessible website files (this was done online) and stores all the information in a few large index files on the webserver. CGI code is used to interrogate those files whenever a search is done (I used the CGI option for the search page). No ongoing database connections are needed. Is that a correct assumption?
Is there any way Zoom could increase the number of database connections needed by a website?
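In case it helps anyone diagnose this, here are the standard MySQL commands one could run on the database server while the problem is occurring, to see how many connections are actually open and where they come from (these are ordinary MySQL diagnostics, not anything specific to Zoom):

```sql
-- Number of currently open client connections
SHOW STATUS LIKE 'Threads_connected';

-- The server's configured connection limit
SHOW VARIABLES LIKE 'max_connections';

-- List every open connection with its user, host, and current query,
-- which should show whether the connections come from the search
-- scripts or from some other part of the site
SHOW FULL PROCESSLIST;
```

If the process list shows the connections originating from the site's own dynamic pages rather than the search script, that would suggest the spider's crawl of those pages (each page view opening a DB connection) rather than the search itself is what drove the connection count up.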