  • Simon Dingley
    Jul 21, 2009 @ 16:37

    Taking it forward

    Great work, Lee. I was considering something similar, but it never really got started. I was looking at making use of wwwRobotRules on CodePlex (http://robotrules.codeplex.com/) - it may be of interest or use to you if you decide to take this package any further.

  • Lee Kelleher
    Jul 22, 2009 @ 23:23

    (I'd completely forgotten that I'd set up this forum **yikes**)

    Hi Simon,

    I looked at WWWRobotRules to do the validation... but it was slightly overkill for our needs. That library is more targeted towards crawler apps: downloading a remote robots.txt, parsing it, and checking/verifying links/URLs against the allow/disallow rules.
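    Roughly, that kind of library is geared towards a check like the sketch below - a hand-rolled illustration only, not the WWWRobotRules API (the URL, user-agent and method names are made up):

        using System;
        using System.Net;

        class RobotsCheck
        {
            // Returns false when the path matches a Disallow prefix in the
            // group that applies to the given user-agent (or to "*").
            static bool IsAllowed(string robotsTxt, string userAgent, string path)
            {
                bool inGroup = false;
                foreach (string rawLine in robotsTxt.Split('\n'))
                {
                    string line = rawLine.Split('#')[0].Trim();  // strip comments
                    if (line.StartsWith("User-agent:", StringComparison.OrdinalIgnoreCase))
                    {
                        string agent = line.Substring("User-agent:".Length).Trim();
                        inGroup = agent == "*" || agent.Equals(userAgent, StringComparison.OrdinalIgnoreCase);
                    }
                    else if (inGroup && line.StartsWith("Disallow:", StringComparison.OrdinalIgnoreCase))
                    {
                        string rule = line.Substring("Disallow:".Length).Trim();
                        if (rule.Length > 0 && path.StartsWith(rule))
                            return false;
                    }
                }
                return true;
            }

            static void Main()
            {
                using (var client = new WebClient())
                {
                    // Download and parse a remote robots.txt, then test a URL path.
                    string robots = client.DownloadString("http://example.com/robots.txt");
                    Console.WriteLine(IsAllowed(robots, "MyCrawler", "/admin/"));
                }
            }
        }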

    If you take a look at the source code for our Robots.txt Editor, you'll see that I do a simple validation, looping through each line and checking for valid keywords (e.g. "User-Agent", "Disallow", etc.).
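    Something along these lines - a minimal sketch, where the keyword list and names are illustrative rather than the package's exact code:

        using System;
        using System.Linq;

        static class RobotsTxtValidator
        {
            // Directives we accept (illustrative list).
            static readonly string[] Keywords =
                { "User-Agent", "Disallow", "Allow", "Sitemap", "Crawl-delay" };

            // A line is valid if it is blank, a comment, or "<keyword>: <value>".
            public static bool IsValidLine(string line)
            {
                string trimmed = line.Trim();
                if (trimmed.Length == 0 || trimmed.StartsWith("#"))
                    return true;

                int colon = trimmed.IndexOf(':');
                if (colon < 0)
                    return false;

                string keyword = trimmed.Substring(0, colon).Trim();
                return Keywords.Contains(keyword, StringComparer.OrdinalIgnoreCase);
            }

            // Validate the whole file by looping through each line.
            public static bool Validate(string content)
            {
                return content.Split('\n').All(IsValidLine);
            }
        }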

    - Lee
