
Cache header backlog using filters - all files fail

Posted: Tue Jun 06, 2017 8:34 am
by Bexley
Hi, I am using positive filters for a group (accept certain users only). I continually end up with a blocked cache. For example, right now I have 21 files waiting on Import for a group. Following your procedure to clear this, I remove the locked file by copying the Import files elsewhere to see which one is locked. The problem is that Newsbin fails to process every single file, so I end up doing stop-and-restart 21 times, or I can just delete all the Import files and not get an update. Since these are filtered updates - no spam, much lower volume - why is Newsbin failing consistently here? Any thoughts on what to do? Regards

Re: Cache header backlog using filters - all files fail

Posted: Tue Jun 06, 2017 10:11 am
by Quade
I'm skeptical the filters have anything to do with the fact your GZ files are stalling. It's more likely they're corrupt on disk and that's what's causing them to stall.

You could try to un-GZ them with WinRAR. That might tell you whether they're damaged or not. Running out of disk space during header downloads could also corrupt them.
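For anyone without WinRAR handy, another way to check whether a .gz file is intact is to stream it through a decompressor and see if it errors out. A small sketch of the idea (my own, not anything Newsbin does; Python just as an example):

```python
import gzip
import zlib

def gz_is_intact(path):
    """Read the whole file through gzip; a bad CRC, truncation,
    or corrupt deflate stream raises an error."""
    try:
        with gzip.open(path, "rb") as f:
            while f.read(1 << 20):  # read in 1 MB chunks
                pass
        return True
    except (OSError, EOFError, zlib.error):
        return False
```

The `gzip` module verifies the stored CRC32 when it reaches the end of the stream, so this catches the same damage WinRAR's "Test" would.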

Re: Cache header backlog using filters - all files fail

Posted: Wed Jun 07, 2017 9:03 am
by Bexley
Thanks, I will check. I only mentioned the filters to indicate the lower volumes; I don't think they're related. Regards

Re: Cache header backlog using filters - all files fail

Posted: Sun Jun 11, 2017 1:20 pm
by Bexley
Hi, there appeared to be nothing wrong with the gz files; I was able to open them with 7z no problem. The disk is a fast SSD, with download and decoding sent to other disks, and over 100GB free. Regards

Re: Cache header backlog using filters - all files fail

Posted: Mon Jun 12, 2017 12:15 am
by Quade
Maybe we should revisit your filtering idea then.

The question is whether the GZ files are really getting stuck, or whether the filters are slowing things down enough to make them appear stuck.

How complicated are the filters? Filtering the header imports is much more expensive than filtering later.

Re: Cache header backlog using filters - all files fail

Posted: Mon Jun 12, 2017 9:11 am
by Bexley
Hi, thanks as always for the prompt reply. First, I opened and tested the gz: it opened, but the CRC check failed. This is a 100MB download, but the SSD is fast, etc. (gaming hardware). The filter is accept-if-string-match (no regex). I can try a poster match? So today there were 6 chunks totaling 40MB, and on test all failed the CRC check. I will try a poster-match filter.

Re: Cache header backlog using filters - all files fail

Posted: Mon Jun 12, 2017 9:17 am
by Bexley
Negatory on the poster filter: 4 small chunks (1.4MB, 10.5KB, 1.56MB, 141KB) all fail the CRC check. It really seems to fail with even the smallest update; mystified here.

Re: Cache header backlog using filters - all files fail

Posted: Mon Jun 12, 2017 5:32 pm
by Quade
Bexley wrote:
Hi, thanks as always for the prompt reply. First, I opened and tested the gz: it opened, but the CRC check failed. This is a 100MB download, but the SSD is fast, etc. (gaming hardware). The filter is accept-if-string-match (no regex). I can try a poster match? So today there were 6 chunks totaling 40MB, and on test all failed the CRC check. I will try a poster-match filter.


If you have bad GZ files, that probably explains the stalling. The main question then is why your GZ files get corrupted while mine don't. GZ files aren't that complicated: Newsbin reads data from the internet and writes it out compressed as GZ files, so there isn't much room for corruption to creep in. Typically I'd suspect either something like bad RAM or your security software damaging the files.
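One way to tell write-time corruption apart from something modifying the files afterwards (security software, for example) is to hash each file right after it's written and compare again just before it's imported. Again just a sketch of the idea in Python, not anything Newsbin actually does:

```python
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large GZ files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Record sha256_of(path) right after the file finishes downloading; if the
# value differs when the file is checked again at import time, something on
# the system changed it on disk in between.
```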