No Index without Enabling robots.txt support

  • No Index without Enabling robots.txt support


    We use the robots 'noindex' tag on certain pages to tell the big search engines not to index that page - but we still want our internal Zoom search engine to include those pages.

    However, there are pages that we want the Zoom Search Engine to skip completely - effectively the same as the robots 'noindex' tag.

    Is there a tag, like the 'ZOOMSTOP' tag, that will do this? Something like 'ZOOMNOINDEX'?

    Or is there a way we can emulate this behavior by using Filtering?

    We can't add the URLs to the Skip List, as only certain pages under the same dynamic URL need to not be indexed.

    I hope that makes sense.

    Any guidance you can give will be appreciated.


  • #2
    Just a quick UPDATE

    I missed a section in the Manual where it says you can essentially use filtering to do what I'm after.

    But instead of filtering for -<meta name="robots" content="noindex">

    I will filter for -ZOOMNOINDEX

    My new question is.....

    Will the Zoom Search Engine recognise the filter word if it is commented out?

    <!-- -ZOOMNOINDEX -->



    • #3
      HTML comments will be ignored by the Content Filter feature.

      But you could use a unique meta tag instead, e.g.

      -<meta name="ZOOMROBOTS" content="noindex">

      Meanwhile, note that to allow Zoom to index pages you've instructed other search engines to noindex, make sure "robots.txt support" is unchecked under "Configure"->"Spider options".
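
      As an illustration, a page excluded this way might carry both tags (a minimal sketch; "ZOOMROBOTS" is just the custom name suggested above, paired with the Content Filter word -<meta name="ZOOMROBOTS" content="noindex">):

      ```html
      <head>
        <!-- standard robots tag: hides the page from external search engines -->
        <meta name="robots" content="noindex">
        <!-- custom tag matched by Zoom's Content Filter (hypothetical name from this thread) -->
        <meta name="ZOOMROBOTS" content="noindex">
        <title>Example page</title>
      </head>
      ```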

      Originally posted by bhtech View Post
      We can't add the URLs to the Skip List, as only certain pages under the same dynamic URL need to not be indexed.

      Not sure if I follow this. If it has the same dynamic URL, then wouldn't it return as the same page?

      If you mean that it's the same script (e.g. "mydynamicpage.php") but with different HTTP GET parameters, i.e. URLs that differ only in their query string:

      Then you can still use the Skip List and specify those parameters. It is not limited to just the filename. It applies to the entire URL.
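
      For instance, Skip List entries can include the query string (hypothetical URLs; the parameter name is assumed for illustration):

      ```text
      mydynamicpage.php?thread=10
      mydynamicpage.php?thread=20
      ```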
      Wrensoft Web Software
      Sydney, Australia
      Zoom Search Engine


      • #4
        Hi Ray,

        Perfect - exactly what I was after.

        I will add the custom meta tag and filter that way.

        I can confirm that we have the "Enable robots.txt support" checkbox unchecked.

        Apologies for my poor explanation of the dynamic URL - I did mean the same script with different parameters.


        Our logic in the script is: if the thread id (10 and 20 in the example) is lower than 15, then make the page noindex.
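
        That logic can be sketched as follows (a minimal Python sketch of the idea, not our actual script; the ZOOMROBOTS tag name comes from the suggestion in #3, and the threshold of 15 from the description above):

        ```python
        NOINDEX_THRESHOLD = 15  # thread ids below this should be excluded from Zoom

        def zoom_head_tag(thread_id: int) -> str:
            """Return the extra <head> markup for a thread page.

            Pages with a thread id below the threshold get the custom
            ZOOMROBOTS tag so Zoom's Content Filter can exclude them;
            other pages get no extra tag.
            """
            if thread_id < NOINDEX_THRESHOLD:
                return '<meta name="ZOOMROBOTS" content="noindex">'
            return ""
        ```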

        I'm not sure if I would be able to achieve my goal through the Skip List; however, I'm happy to do it through the Filtering option.

        Let me know if there is a way though.



        • #5
          Hi Ray,

          We have been successfully using the settings above to filter out the required pages.

          However, I am now curious to know if there is a way to do a "noindex but follow links" rule?

          Similar to setting <meta name="robots" content="noindex,follow" /> for search engine bots.

          Is that something I can achieve?



          • #6
            You don't need to use "Follow".
            Specifying "index" or "follow" values in the robots meta tag will have no effect as this is thedefault behaviour for all pages scanned.


            • #7

              I mean that the links on the page will be followed, but the page itself won't be in the index.

              Is that possible?



              • #8
                Yes, that is what you get with the noindex flag.