
Ex newsleecher

PostPosted: Sat Sep 27, 2014 3:59 am
by Downloadski
I am glad I was told about Newsbin.
I used NewsLeecher for years, but stayed with 3.9 as I did not like the caching on the download directory.
With 3.9 I had the temp directory mapped to a 4 GB ramdrive, which worked fine for caching the parts until a file was complete and ready to write.

I have upgraded to a 500 Mbps line, and I could not manage speeds over 350 Mbps with NewsLeecher.
With some settings changes I now see peaks of 530 Mbps with Newsbin:

ChunkCacheSize=1000
MemCacheLimit=2000

I do see the chunk buffer drop back to 200 when I download large things (50 GB discussions).
The rar files in these discussions are up to 1 GB, sometimes with only 1 or 2 blocks.


Can someone explain whether I am assuming correctly that:
MemCacheLimit is in MB, so 2000 is 2 GByte?

ChunkCacheSize = the number of ~384 (?) KByte Usenet packets to store?
So 1000 would be 384 MByte (assuming that 384 is the correct value).

Then I can increase the buffers to make better use of my available RAM (12 GB under Windows 7).

Re: Ex newsleecher

PostPosted: Sat Sep 27, 2014 7:05 am
by Quade
MemCacheLimit isn't used anymore.

For ChunkCacheSize, each block represents one post for a file. Most posts are under 1 MB, typically 250K-600K on average. The sizes are variable depending on what you download; each block can grow to about a meg depending on the size of the posts.

The reason to run a large chunk cache size is to buffer downloads while Newsbin is doing something heavy-duty on disk.
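
Back of the envelope with those numbers (just illustrative Python, not official sizing guidance):

    # Rough chunk-cache memory estimate from the 250K-600K average above
    chunk_cache_size = 1000          # the ChunkCacheSize setting
    avg_chunk = 600 * 1024           # ~600 KB post, upper end of the average
    max_chunk = 1024 * 1024          # each block can grow to about 1 MB
    print(chunk_cache_size * avg_chunk / 2**20)   # ~586 MB typical
    print(chunk_cache_size * max_chunk / 2**20)   # ~1000 MB worst case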

Re: Ex newsleecher

PostPosted: Sat Sep 27, 2014 8:03 am
by Downloadski
So a chunk is then basically the number of lines a poster used per post, like 3000-5000?

I will remove that MemCacheLimit parameter then, and order a license.

Re: Ex newsleecher

PostPosted: Sat Sep 27, 2014 10:00 am
by Quade
Chunk = a decoded post (yEnc/MIME/UU decoded), in Newsbin nomenclature.
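
For illustration, a minimal Python sketch of just the yEnc decode step (this skips the =ybegin/=ypart/=yend headers and CRC checks a real decoder also handles):

    def yenc_decode(body: bytes) -> bytes:
        # yEnc shifts every payload byte by +42 (mod 256); critical bytes
        # are escaped as '=' followed by the byte shifted a further +64.
        out = bytearray()
        i = 0
        while i < len(body):
            b = body[i]
            if b in (0x0D, 0x0A):        # CR/LF are line wrapping, not payload
                i += 1
                continue
            if b == 0x3D:                # '=' escape character
                i += 1
                b = (body[i] - 64) & 0xFF
            out.append((b - 42) & 0xFF)
            i += 1
        return bytes(out)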

You size ChunkCacheSize to try to fit all of a file into memory, so Newsbin only has to write the file once. The rest of the chunks buffer the download, so downloading can continue even if the part of Newsbin that writes files to disk gets busy.
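
For the 1 GB rars you mentioned, rough arithmetic (again just illustrative, assuming ~600 KB chunks):

    file_size = 1 * 2**30          # one 1 GB rar part
    avg_chunk = 600 * 1024         # ~600 KB per decoded post
    print(file_size // avg_chunk)  # ~1747, so ChunkCacheSize=1000 falls short
                                   # of a whole 1 GB file; ~2000 would cover it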

If you want to PM me an example of what you're downloading, I'll check it out. Don't post it here out in the open.

Re: Ex newsleecher

PostPosted: Sat Feb 28, 2015 10:09 am
by Downloadski
After some months I can say I really like this product.
Steady downloads at my line rate of 536 Mbps most of the time.

Re: Ex newsleecher

PostPosted: Sat Feb 28, 2015 10:34 am
by Quade
Wow, that's pretty impressive.

Re: Ex newsleecher

PostPosted: Sat Feb 28, 2015 10:57 am
by Downloadski
Yes, I had to learn to use the options a bit, but it does a great job. At the peak, 67 MByte/sec comes down.
It is then unrarring one file, par-checking another, and downloading a third file all at the same time.
Really nicely made product.

Re: Ex newsleecher

PostPosted: Sat Feb 28, 2015 11:07 am
by Quade
I was just talking on the IRC channel about how to make the unrar process work better. My plan is incremental unrar, where it unrars after each file downloads, so there's no big unrar at the end. I think that would make it work even faster.
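
Roughly the idea, as a sketch (not Newsbin's actual code; extract_volume() here is a made-up stub): hand each completed volume to an unrar worker as it finishes, instead of doing one big unrar at the end.

    import queue, threading

    completed = queue.Queue()

    def extract_volume(path):
        # Stub for illustration; a real implementation would unpack whatever
        # the newly arrived volume completes in the multi-volume set.
        print("would extract from", path)

    def unrar_worker():
        while True:
            path = completed.get()
            if path is None:          # sentinel: download set is finished
                break
            extract_volume(path)

    threading.Thread(target=unrar_worker, daemon=True).start()

    def on_file_downloaded(path):
        completed.put(path)           # downloader calls this per finished .rar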

Re: Ex newsleecher

PostPosted: Sat Feb 28, 2015 11:32 am
by Downloadski
I download all parts to an SSD and unrar to a RAID-5 volume, so I/O is not an issue at 500 Mbps yet.