Ex newsleecher


Ex newsleecher

Postby Downloadski » Sat Sep 27, 2014 3:59 am

I am glad I was told about Newsbin.
I used NewsLeecher for years but stayed with 3.9, because I did not like the caching in the download directory.
With 3.9 I had the temp directory mapped to a 4 GB RAM drive, which worked fine to cache the parts until the file was complete and ready to write.

I have upgraded to a 500 Mbps line and never managed speeds over 350 Mbps with NewsLeecher.
With some settings changes I now see peaks of 530 Mbps with Newsbin:

ChunkCacheSize=1000
MemCacheLimit=2000

I do see the chunk buffer drop back to 200 when I download large items (50 GB discussions).
The RAR files in these discussions are up to 1 GB, sometimes with only 1 or 2 blocks.


Can someone tell me if I am assuming correctly that:

MemCacheLimit is in MB, so 2000 is 2 GB?

ChunkCacheSize = the number of 384 KB (?) Usenet packets to store?
So 1000 would be 384 MB (assuming that 384 KB is the correct value).

I can then increase the buffers to make better use of my available RAM (12 GB under Windows 7).

Re: Ex newsleecher

Postby Quade » Sat Sep 27, 2014 7:05 am

MemCacheLimit isn't used anymore.

For ChunkCacheSize, each block represents one post for a file. Most posts are under 1 MB, typically 250-600 KB on average. The sizes are variable depending on what you download; each chunk can grow to about a megabyte, depending on the size of the posts.

The reason to run a large ChunkCacheSize is to buffer downloads while Newsbin is doing something heavy-duty on disk.
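
A quick back-of-the-envelope check of what ChunkCacheSize=1000 costs in RAM at those post sizes (a rough sketch in Python; actual usage varies with what you download):

# Rough memory footprint of the chunk cache, using the post sizes
# quoted above (250-600 KB typical, up to ~1 MB per chunk).
chunk_cache_size = 1000                # the ChunkCacheSize setting

for avg_kb in (250, 600, 1024):        # typical low, typical high, worst case
    total_mb = chunk_cache_size * avg_kb / 1024
    print(f"{avg_kb:>4} KB/chunk -> ~{total_mb:.0f} MB cache")

# prints roughly 244 MB, 586 MB and 1000 MB respectively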

Re: Ex newsleecher

Postby Downloadski » Sat Sep 27, 2014 8:03 am

So a chunk is then basically the article size the poster used when posting, like 3000-5000 lines?

I will remove that MemCacheLimit parameter then, and order a license.

Re: Ex newsleecher

Postby Quade » Sat Sep 27, 2014 10:00 am

Chunk = decoded post (yEnc/MIME/UU decoded) in Newsbin nomenclature.

You size ChunkCacheSize to try to fit all of a file into memory, so Newsbin only has to write the file once. The rest of the chunks buffer the download, so downloading can continue even if the thread that writes files to disk gets busy.
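
The buffering described here is essentially a bounded producer/consumer queue. A minimal sketch of the pattern in Python (an illustration of the idea only, not Newsbin's actual code):

# Downloads keep filling a bounded chunk cache while a single writer
# drains it, so a slow disk doesn't stall the network until the cache
# itself fills up.
import queue, threading

chunk_cache = queue.Queue(maxsize=1000)   # plays the role of ChunkCacheSize

def downloader():
    for _ in range(5000):
        chunk = b"x" * 400_000            # pretend ~400 KB decoded post
        chunk_cache.put(chunk)            # blocks only when the cache is full
    chunk_cache.put(None)                 # signal end of download

def writer():
    while (chunk := chunk_cache.get()) is not None:
        pass                              # a real writer would flush to disk here

threading.Thread(target=downloader).start()
writer()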

If you want to PM an example of what you're downloading, I'll check it out. Don't post it here out in the open.

Re: Ex newsleecher

Postby Downloadski » Sat Feb 28, 2015 10:09 am

After some months I can say I really like this product.
Steady downloads at the line rate of 536 Mbps most of the time.

Re: Ex newsleecher

Postby Quade » Sat Feb 28, 2015 10:34 am

Wow, that's pretty impressive.

Re: Ex newsleecher

Postby Downloadski » Sat Feb 28, 2015 10:57 am

Yes, I had to learn to use the options a bit, but it does a great job. At the peak, 67 MB/s comes down.
It is then unrarring one file, PAR-checking another, and downloading a third file, all at the same time.
Really nicely made product.

Re: Ex newsleecher

Postby Quade » Sat Feb 28, 2015 11:07 am

I was just talking on the IRC channel about how to make the unrar process work better. My plan is incremental unrar, where it unrars after each file downloads, so there's no big unrar at the end. I think that would make it even faster.
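
As a toy sketch of the scheduling difference (plain Python with stand-in stubs, not Newsbin internals): extraction of each finished file overlaps the download of the next, instead of one long unrar pass at the end.

# Toy model of the two schedules; download() and extract() are
# stand-in stubs, not Newsbin code.
import threading, time

def download(volume):
    time.sleep(0.1)          # pretend network transfer

def extract(volume):
    time.sleep(0.1)          # pretend unrar work

def unrar_at_end(volumes):
    for v in volumes:
        download(v)
    for v in volumes:        # the whole unrar cost lands here, at the end
        extract(v)

def incremental_unrar(volumes):
    threads = []
    for v in volumes:
        download(v)
        t = threading.Thread(target=extract, args=(v,))
        t.start()            # extraction overlaps the next download
        threads.append(t)
    for t in threads:
        t.join()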

Re: Ex newsleecher

Postby Downloadski » Sat Feb 28, 2015 11:32 am

I download all parts to SSD and unrar to a RAID-5 volume, so I/O is not an issue at 500 Mbps yet.

