Incremental Backup is a great feature that is useful for many sites. However, Off-Line Backup is still much faster and has many other advantages, including being less fragile. The advantages of both could be achieved in a future release with an option to have the crawler save a copy of the files it wants to index to a local folder.
This would allow an initial full crawl of a site that would make the relevant files local for local indexing. Thereafter, incremental crawls would maintain the off-line files as a copy of what was on the site. The actual indexing could then run against the off-line copy, making it robust.
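As a minimal sketch of how such a local mirror could work (all names here are hypothetical — this is not part of Zoom's actual API): map each crawled URL to a path under a mirror folder, and on incremental crawls re-save a page only when its content has changed, using a content-hash manifest.

```python
import hashlib
from pathlib import Path
from urllib.parse import urlparse

def url_to_local_path(url: str, root: str) -> Path:
    """Map a crawled URL to a file path under the local mirror folder."""
    parts = urlparse(url)
    path = parts.path.lstrip("/") or "index.html"
    if path.endswith("/"):
        path += "index.html"  # directory URLs become index.html files
    return Path(root) / parts.netloc / path

def needs_resave(url: str, body: bytes, manifest: dict) -> bool:
    """Incremental check: re-save a page only when its content changed.

    `manifest` maps URL -> sha256 of the last saved copy; it would be
    persisted between crawls (e.g. as a JSON file in the mirror folder).
    """
    digest = hashlib.sha256(body).hexdigest()
    if manifest.get(url) == digest:
        return False  # unchanged since last crawl; keep the existing copy
    manifest[url] = digest
    return True
```

Since the crawler already downloads each file to index it, the extra work is just the save and the manifest bookkeeping; the indexer then only ever reads from the stable local copy.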
It would also solve the problem of indexing multiple sites into the same index when some are local and some remote.
Of course, this would not work for all sites, but then nothing does. It would be most useful for sites that rarely remove pages, and there are plenty of those. Since Zoom already has the file locally in order to index it, this seems like a minor enhancement to implement and a major benefit to many users.