Filters?
Posted: Sat Mar 27, 2004 6:51 am
In general, what's the best method or strategy for setting up the Page & Picture filters in PicaLoader so that downloading images from a website is fast and efficient? What's the best way to analyze a website to accomplish this? Right now I rely on a lot of trial and error.
I know every site will be different; I'm more or less looking for a general "How to" for dummies! Are there any tricks that you can share? A good site to use as an example would be WonderfulGirls.
When I try to do this, especially with websites built on JavaScript, PHP, etc., I usually start by trying to grab a very small sample of the site, but I get the filters wrong and download nothing. Sometimes I can get the filters to work pretty well. The alternative is no filters at all, which downloads everything. My main concern is that it analyzes a bunch of links that have nothing to do with images, which wastes a lot of time. If I catch them on the Monitor tab I usually add them to an Exclude filter. Is there some way to log all the URLs into a file that can be analyzed?
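As far as I can tell PicaLoader has no built-in URL export, but as a do-it-yourself workaround, a short Python script (just a sketch using the standard library, nothing to do with PicaLoader itself) can dump every link a page references into a text file, so you can eyeball the patterns before writing Include/Exclude filters:

```python
# Sketch: collect every href/src URL from one page and write them to a
# file for filter analysis. Uses only Python's standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Gathers href/src attribute values, resolved against the page URL."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.urls.append(urljoin(self.base, value))

def dump_urls(page_url, out_path="urls.txt"):
    """Fetch page_url, extract all linked URLs, write them sorted to a file."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    parser = LinkCollector(page_url)
    parser.feed(html)
    with open(out_path, "w") as f:
        f.write("\n".join(sorted(set(parser.urls))))
    return parser.urls
```

Sorting the deduplicated list groups URLs from the same directory together, which makes it easier to spot whole branches (ad servers, forum links, etc.) worth a single Exclude filter.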
I'm also having trouble with a site that uses a lot of Flash. I can get some of it by creating separate Tasks for specific web pages, but I can't get PicaLoader to download anything from the main page. It gets to some URL ending in "function" and stops right after that; I guess it can't analyze it. The site has some adult content, or I would post its URL here for you to take a look at.
Well, it's getting late. TIA!