zoom_wordmap.zdat is 0 KB after latest crawl


  • zoom_wordmap.zdat is 0 KB after latest crawl

    The zoom_wordmap.zdat file from my latest crawl is showing as 0 KB.

    Normally our 'wordmap' file is at least 450 MB.

    The only thing I changed was unchecking 'Context description' and 'Image' in the 'Search results layout' section.

    Is this to be expected after unchecking those options? Or has something gone wrong?
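    In case it's useful, this is the quick sanity check I run on the output directory after each crawl. It's only a sketch; the file names are taken from our own indexer output and the list may not be complete for every configuration:

    ```python
    import os

    # Index files Zoom writes at the end of a crawl (names taken from our
    # indexer log; adjust the list for your own configuration).
    INDEX_FILES = [
        "zoom_pagedata.zdat",
        "zoom_pageinfo.zdat",
        "zoom_pagetext.zdat",
        "zoom_wordmap.zdat",
        "zoom_dictionary.zdat",
    ]

    def empty_index_files(output_dir):
        """Return the index files that are missing or zero bytes long."""
        bad = []
        for name in INDEX_FILES:
            path = os.path.join(output_dir, name)
            if not os.path.exists(path) or os.path.getsize(path) == 0:
                bad.append(name)
        return bad
    ```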

  • #2
    Something's gone wrong and there should be error messages in the Log window. Make sure you have Errors and Warnings checked to be displayed. Let us know what it's saying in your log.
    --Ray
    Wrensoft Web Software
    Sydney, Australia
    Zoom Search Engine



    • #3
      Hi,

      I've copied the end of the log file below. It looks like there were some issues closing the pagetext data file; perhaps that caused the problem with the wordmap file.

      Code:
      03|04/04/16 08:20:04|Writing index data for CGI/Linux search... (Please wait)
      03|04/04/16 08:20:04|Created pagedata data file (zoom_pagedata.zdat)
      08|04/04/16 08:20:04|Failed to close pagetext data file (zoom_pagetext.tmp)
      03|04/04/16 08:20:04|Created pageinfo data file (zoom_pageinfo.zdat)
      08|04/04/16 08:20:04|Failed to close pagetext data file (zoom_pagetext.zdat)
      03|04/04/16 08:20:04|Flushing index data to disk ...
      03|04/04/16 08:20:05|Merging all flushed wordmap data files ...
      03|04/04/16 08:20:18|Created wordmap data file (zoom_wordmap.zdat)
      03|04/04/16 08:20:18|Created dictionary data file (zoom_dictionary.zdat)
      03|04/04/16 08:20:18|Created script settings file (settings.zdat)
      10|04/04/16 08:20:18|Indexing completed at Mon Apr 04 08:20:18 2016
      12|04/04/16 08:20:18|INDEX SUMMARY
      12|04/04/16 08:20:18|Files indexed: 175695
      12|04/04/16 08:20:18|Files skipped: 1663452
      12|04/04/16 08:20:18|Files filtered: 5812
      12|04/04/16 08:20:18|Files downloaded: 181520
      12|04/04/16 08:20:18|Unique words found: 360132
      12|04/04/16 08:20:18|Variant words found: 254499
      12|04/04/16 08:20:18|Total words found: 142996078
      12|04/04/16 08:20:18|Avg. unique words per page: 2.05
      12|04/04/16 08:20:18|Avg. words per page: 813
      12|04/04/16 08:20:18|Start index time: 05:00:02 (2016/04/03)
      12|04/04/16 08:20:18|Elapsed index time: 27:20:16
      12|04/04/16 08:20:18|Peak physical memory used: 387 MB
      12|04/04/16 08:20:18|Peak virtual memory used: 940 MB
      12|04/04/16 08:20:18|Errors: 9
      12|04/04/16 08:20:18|URLs visited by spider: 186338
      12|04/04/16 08:20:18|URLs in spider queue: 0
      12|04/04/16 08:20:18|Total bytes scanned/downloaded: 1629911972
      12|04/04/16 08:20:18|File extensions: 
      12|04/04/16 08:20:18|    .htm indexed: 0
      12|04/04/16 08:20:18|    .html indexed: 0
      12|04/04/16 08:20:18|    .php indexed: 171756
      12|04/04/16 08:20:18|    .asp indexed: 0
      12|04/04/16 08:20:18|    .aspx indexed: 0
      12|04/04/16 08:20:18|    .txt indexed: 0
      12|04/04/16 08:20:18|    .cgi indexed: 0
      12|04/04/16 08:20:18|    .shtml indexed: 0
      12|04/04/16 08:20:18|    .pl indexed: 0
      12|04/04/16 08:20:18|    .php3 indexed: 0
      12|04/04/16 08:20:18|    No extensions indexed: 3939
      02|04/04/16 08:20:18|Cleaning up memory used for index data... please wait.
      08|04/04/16 08:20:18|Failed to close pagetext data file (zoom_pagetext.zdat)
      02|04/04/16 08:20:18|Finished cleaning up memory.
      03|04/04/16 08:20:19|Copied search script to: C:\Users\Administrator\Documents\Wrensoft\Zoom Search Engine Indexer\search.cgi
      03|04/04/16 08:20:31|Created XML sitemap index file (sitemap_index.xml)
      03|04/04/16 08:20:31|Created XML sitemap files (sitemap.xml to sitemap4.xml)
      Can you see any particular reason for the issue?

      The spider will crawl again on Sunday, so perhaps it was just a one-off.
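      For what it's worth, the failed-close lines can be picked out of the log programmatically. Judging by the sample above, each line is 'NN|date time|message' and the failures carry the prefix 08; that mapping is a guess from this one log rather than documented behaviour:

      ```python
      def error_lines(log_text, error_code="08"):
          """Return the messages of log lines whose leading field matches
          error_code. Assumes a 'NN|date time|message' log format, where
          the two-digit prefix appears to be a severity/category code
          (the 'Failed to close' lines in our log carry 08)."""
          hits = []
          for line in log_text.splitlines():
              parts = line.split("|", 2)
              if len(parts) == 3 and parts[0].strip() == error_code:
                  hits.append(parts[2])
          return hits
      ```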



      • #4
        Our latest crawl has built successfully.

        I believe the issue may have been caused by a lack of space on the compiling server.

        The only thing I changed was clearing space on the server, so that was most likely the cause.
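        To avoid a repeat, we now check free space on the compiling server before each crawl. A minimal sketch using Python's standard library; the 2 GB default is an arbitrary figure we chose, not a Zoom requirement (our wordmap alone runs around 450 MB):

        ```python
        import shutil

        def enough_free_space(path, required_bytes=2 * 1024**3):
            """Return True if the drive holding `path` has at least
            `required_bytes` free. The threshold is arbitrary; size it
            to the expected index output for your own site."""
            usage = shutil.disk_usage(path)
            return usage.free >= required_bytes
        ```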
