I have a large website that I want to create separate data indexes for (because of size and performance issues). When using PHP and text files this is not a problem, because I can store the data indexes in separate "_search" folders. With CGI it appears that the data index must live in the "cgi-bin" folder, which precludes creating separate data indexes: you cannot independently control the "zoom_xxxxxx.zdat" naming (which would allow you to generate different sets) and point search.cgi at one or the other.
For example, I want to index the site exclusive of the "e-library" (a collection of electronic books), but also provide a separate search form to search the e-library only. The text-file searches are taking between five and eight seconds (it feels even longer than that) just for the site exclusive of the e-library, which I had to drop from the index after running up against an ISP-set 500MB memory limit.
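To make the goal concrete, here is roughly the layout I can achieve with the PHP version today, and what I would need from the CGI version (all folder names below are just illustrative, and the form markup is from memory):

    /site_search/          index files for the main site (e-library excluded)
        search.php
    /elibrary_search/      separate index files for the e-library only
        search.php

    <!-- two forms, each pointed at its own index -->
    <form method="get" action="/site_search/search.php">
        <input type="text" name="zoom_query">
        <input type="submit" value="Search site">
    </form>
    <form method="get" action="/elibrary_search/search.php">
        <input type="text" name="zoom_query">
        <input type="submit" value="Search e-library">
    </form>

The CGI equivalent would be something like /cgi-bin/site_search/search.cgi and /cgi-bin/elibrary_search/search.cgi, each with its own set of zoom_xxxxxx.zdat files, but I can't see how to generate and name two independent index sets for that.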
Obviously, the CGI version is the way to go performance-wise, but I don't see a way around the problem described above.
Does anyone have any ideas on how to work around this issue?