[Sussex] Apache throttling/fair use

Steve Dobson steve at dobson.org
Sun Jan 23 22:28:15 UTC 2005


Alan, Mark, Thomas

Thanks for the advice; looks like I have some RTFMing to do :-)

Can these help with a user making a copy of each page by hand?  (The
sort of per-client counting I have in mind is sketched after the steps.)

  i.e.
     1: Loads a page
     2: Selects save
     3: Clicks on link to the next page
     4: Goto 2 until all pages saved
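
As I understand it, the throttling mods basically count requests per
client over a short window and block or slow anyone over a threshold.
A rough Python sketch of that idea (the window, limit and names are
just made up for illustration):

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60        # look at the last minute of traffic
    MAX_REQUESTS = 30          # allow at most this many pages per window

    history = defaultdict(deque)   # client IP -> timestamps of recent hits

    def over_limit(client_ip):
        """True if this client has fetched too many pages recently."""
        now = time.time()
        hits = history[client_ip]
        hits.append(now)
        # Drop timestamps that have fallen out of the window.
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()
        return len(hits) > MAX_REQUESTS

My worry is that someone saving pages by hand only makes a request every
few seconds, so the threshold would have to be set very low to catch them.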

The site has a cookie-based login, so most robots that aren't
hand-driven (e.g. wget --load-cookies <file>) will not get past the
login page, which I guess is okay for my client.  He wants to be very
protective of his copyright, which I understand, as it is the
cornerstone of his business.
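
Just to spell out why the cookie check keeps most robots out: the
protected pages are only served when a recognised session cookie is
presented, so anything without one gets bounced to the login page.  A
minimal sketch of that gate (the cookie name "session_id" and the
session store are invented for the example):

    from http.cookies import SimpleCookie

    VALID_SESSIONS = {"abc123"}      # stand-in for a real session store

    def is_logged_in(cookie_header):
        """True if the request carries a recognised session cookie."""
        if not cookie_header:
            return False
        cookies = SimpleCookie()
        cookies.load(cookie_header)
        session = cookies.get("session_id")
        return session is not None and session.value in VALID_SESSIONS

Of course anything that presents the same cookie passes, which is why a
browser-exported cookies file fed to wget --load-cookies gets straight in.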

He also wants to allow prospective new clients limited access to the
site.  They would be allowed to view a few pages per day.  Can any of
this stuff help here, or do I need a new mod (maybe one I write myself)?
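
If I do end up writing something myself, what I have in mind is roughly
a per-user counter that resets each day.  A rough Python sketch (the
limit of 5 and the in-memory store are just placeholders; a real mod
would persist the counts somewhere):

    import datetime

    DAILY_LIMIT = 5      # pages a prospective client may view per day
    views = {}           # username -> (date, pages viewed today)

    def allow_page_view(username):
        """Count the view and return True if the user is under today's limit."""
        today = datetime.date.today()
        day, count = views.get(username, (today, 0))
        if day != today:             # new day: reset the counter
            day, count = today, 0
        if count >= DAILY_LIMIT:
            return False
        views[username] = (day, count + 1)
        return True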

Steve



