Yes, with 300 concurrent users doing searches, this could put a lot of load on the system.
I would:
1) Put the search function on a dedicated server so it isn't competing with other tasks. For 300 users, you should be able to justify the cost of a dedicated server.
2) If you can, get a dual-CPU or dual-core machine.
3) Your index will be small with only 1000 files, so create a RAM disk, copy all the index files to it, and run the search from there. Create a script to do this at boot-up (a fuller boot-script sketch follows this list). Here is the core of such a script; the index path is a placeholder to adjust for your install:
mkdir -p /tmp/ramdisk0                # -p: no error if it already exists
mke2fs -q /dev/ram0                   # format the RAM device as ext2, quietly
mount /dev/ram0 /tmp/ramdisk0
cp /path/to/index/* /tmp/ramdisk0/    # placeholder: copy your index files here
This will reduce search times.
4) The next version of the CGI will be faster, between 2x and 10x depending on the search. So if none of the above works for you, e-mail us for an early beta build of this new version.
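For the boot-script sketch referenced in item 3: this is a minimal, self-contained version, assuming a SysV-style init where /etc/rc.local runs at boot. The script name (ramdisk-index.sh), install path (/usr/local/bin), and index location are all placeholders, not part of the CGI itself:

#!/bin/sh
# ramdisk-index.sh (placeholder name) -- rebuild the search-index RAM disk.
# Install it, e.g., at /usr/local/bin/ramdisk-index.sh, make it executable,
# and call it from /etc/rc.local (or your distribution's equivalent).

# Create and mount the RAM disk only if it isn't mounted already, so the
# script can also be re-run by hand without errors.
if ! grep -q ' /tmp/ramdisk0 ' /proc/mounts; then
    mkdir -p /tmp/ramdisk0
    mke2fs -q /dev/ram0
    mount /dev/ram0 /tmp/ramdisk0
fi

# Refresh the RAM disk copy of the index (placeholder source path).
cp /path/to/index/* /tmp/ramdisk0/

Re-running the same script by hand after a reindex refreshes the RAM disk copy without a reboot.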