NBP was fast at the beginning, but now it has gotten really slow.


Postby RankoSaotome » Sun Aug 20, 2017 9:26 am

Hello again.

At first, NBP did what I wanted.
Fast downloads, collecting the downloaded files into one subfolder, and it was so fast that I could hardly keep up with sorting the files away (download speed up to around 130 MBit/s). Because of that I moved the download folder from a 1 TB SSD to a 10 TB HDD. The chunks/spool folders always remained on a local SSD.
Everything stayed stable and it was great for me.

Now, after almost a month, it keeps getting slower. The download speed in the Windows Task Manager now looks like a sawtooth: for 30 seconds it does not download anything, then for 5-10 seconds it downloads something at 10-30 MBit/s, and then nothing again.
The chunks folder gets filled with hundreds of thousands of files. I can't find a way to assemble the chunks into usable parts that MultiPar can work with.
And I can't figure out what slows NBP down.

What I tried:
- Rebooting the system almost every 1 or 2 days. Sometimes it helps for a few minutes/hours, but then it gets just as slow again.
- Increasing the cache to 2000, but it never reached 0, so I don't think the cache is the problem.
- Stopping NBP, cleaning out the chunks folder (deleting everything), rebooting the PC and then starting NBP again.
Sometimes it worked, sometimes NBP seemed to download nothing for a very long time and then continued slowly.
- Changing the locations of the folders from the local SSDs to local HDDs, and back again when that did not help.
No change.

Somehow I am clueless now about how and where to look:
- What is NBP doing every minute? Is there a way to watch closely what NBP is doing?
- Why does it not assemble the chunks? Or how can I assemble the chunks myself, so that I get broken parts that PAR can try to recover?
- What do the status letters in the download window mean? Example: [5 files, 3 pars] D:0 DL:0 Retry:0 N3

With the program I used earlier, I also had the problem that the chunks (there they were called parts) did not get assembled into usable files automatically because some postings were missing, but I could start the assembly manually and then throw all the broken *.rar and *.par2 files into one folder and let MultiPar do its job.
Somehow I don't see how I can do that with NBP; the sketch below is roughly what I have in mind.
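
Just to make it concrete, something along these lines is what I mean (a rough Python sketch; I am only assuming that the chunk files for one target file carry the target filename plus a numeric part index in their names, because I do not know Newsbin's real naming scheme, so the pattern and the sort key are placeholders):

```python
# Rough sketch only: stitch chunk files back together so MultiPar can repair
# the result. ASSUMPTION: chunks are named "<target name>.<part number>";
# this is a guess, not Newsbin's documented naming scheme.
import glob
import re

def assemble(target_name, chunk_dir, out_path):
    # placeholder pattern for "all chunks belonging to this target file"
    chunks = glob.glob(f"{chunk_dir}/{target_name}.*")
    # sort by the numeric suffix so the parts land in the right order
    chunks.sort(key=lambda p: int(re.search(r"\.(\d+)$", p).group(1)))
    with open(out_path, "wb") as out:
        for chunk in chunks:
            with open(chunk, "rb") as part:
                out.write(part.read())

# The (possibly incomplete) result would then go into one folder together
# with the matching *.par2 files, and MultiPar could try the repair.
assemble("example.part01.rar", r"F:\chunks", r"H:\broken\example.part01.rar")
```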

Any ideas?

Thank you in advance!

My configuration now:
8-core AMD CPU up to 4 GHz, 32 GB DDR3 RAM, several SSDs and HDDs, Win 7 64-bit (CPU load around 25%, memory usage around 10 GB). For this long-term test of NBP, this hardware has almost nothing else to do, so the problem is not another program hogging resources or disrupting something.
The chunks folder is on a local SATA SSD (1 TB, F:, used only for this purpose; now almost 400 GB of chunk files, more than 600,000 of them).
The download folder resides on a local SATA HDD (10 TB, H:, almost empty).
NBP gets fed with NZB files and older postings from the groups (currently around 3000 files in the download queue).
NBP PAR is enabled, NBP UnRAR is deactivated because of the nature of the files. Sometimes the content is a single file (*.mkv), sometimes it is a subfolder in the archive, sometimes there are multiple subfolders nested in the archive, and if I let it all unrar into one folder I get a bit of a mishmash from all the hundreds/thousands of downloads.
Please excuse my typos; I am not a native English speaker.
= = = =
System: AMD FX8350 (8-core @ 4 GHz), 32 GB DDR3, some SSDs and some HDDs, Win7 64-bit

Re: NBP was fast at the beginning, but now it has gotten really slow.

Postby Quade » Sun Aug 20, 2017 3:07 pm

Now almost 400 GB of chunk files (more than 600,000)


Might want to wipe these files, then restart Newsbin and see what happens. I'd reduce the cache back down to 200-400 too. Cache usage today is completely different from when the large-cache recommendations were made, and the new beta puts even less stress on the cache. You really shouldn't be collecting chunk files like this; it suggests the download has gotten way ahead of assembly. Again, the new beta has changes that should reduce the collection of chunks.

A couple of things might cause issues:

1 - Make sure any NZB autoload folders DO NOT scan the download folder.

2 - The various cache files "Signature.db3" and "DownloadMarker.db3" grow forever, so as they grow, performance will go down. It might be worth moving them out of the data folder and seeing what happens to performance.

3 - It might not be related to Newsbin at all. Maybe your virus scanner has updated and is scanning the downloaded files more heavily.

Now, after almost a month, it keeps getting slower. The download speed in the Windows Task Manager now looks like a sawtooth: for 30 seconds it does not download anything, then for 5-10 seconds it downloads something at 10-30 MBit/s, and then nothing again.
The chunks folder gets filled with hundreds of thousands of files. I can't find a way to assemble the chunks into usable parts that MultiPar can work with.
And I can't figure out what slows NBP down.


Suggests you have some major disk bottleneck. As an experiment, I might remove all non-MS supplied security software, reboot and try some test downloads to see if performance improves.

You might want to give the latest beta a try. The way it handles chunks is quite different now.

Re: NBP was fast at the beginning, but now it has gotten really slow.

Postby RankoSaotome » Mon Aug 21, 2017 10:44 pm

Hello.

This morning I tried this:
I deleted all chunks.
I removed Signature.db3 and DownloadMarker.db3.
I have never used autoload NZB.
Then I booted the PC and started NBP (6.73 #4821).

Yes, as I experienced in the past, it helped a bit at first.
NBP was fast again, then after some hours it began developing the sawtooth pattern in the Win7 Task Manager network display.
And now (almost a day later) I have around 500 GB in over 800,000 chunk files in the chunks folder (on an SSD), and the network speed currently jumps between 0, 34 and 60 MBit/s in the mentioned sawtooth pattern.

No, there is no disk bottleneck.
Local copying works at more than 100-300 MByte/s (depending on how long and big the transfer is, whether SSDs or HDDs are involved, and whether the Windows cache is filled or not).
Transfers from this PC over the LAN run at over 400 MByte/s writing to RAID arrays on the target PCs (yes, this is a 10G network based on an Intel X540-T1 interface, with matching 10G switches and several other 10G-attached PCs).
Yes, I have AV software running and I will test the system without it sometime later, but since only NBP has this problem, and no other software with major disk transfers does, I really doubt it is the cause.
In the past, when I helped others and their AV software was slowing things down, most of the time the CPU load was very high. That is not the problem with this machine.

My wild guess: the massive number of chunks, plus the databases needed to keep track of the chunks, the checksums of downloaded parts and so on, are a bit much (even Windows alone needs several seconds just to list over 800,000 files).
So I come back to my question:
The chunks folder gets filled with hundreds of thousands of files. I can't find a way to assemble the chunks into usable parts that MultiPar can work with.

You suggested the latest beta. Is there a way to assemble the chunks that belong together under their correct filename, so I can throw them into one folder and let MultiPar or the like take over?
Please excuse my typos; I am not a native English speaker.
= = = =
System: AMD FX8350 (8-core @ 4 GHz), 32 GB DDR3, some SSDs and some HDDs, Win7 64-bit

Re: NBP was fast at the beginning, but now it has gotten really slow.

Postby Quade » Wed Aug 23, 2017 1:21 pm

I'd try the current beta then. It works differently and you should have fewer chunks. There is no DB tracking chunks. The "DB" is the folder itself.

In your case I might tell Newsbin to stop downloading while unraring and repairing. It's in the performance options.

The fact that your chunk folder fills up indicates that you're downloading much faster than PAR/unrar can keep up. This is relatively common. B8 changes how files are assembled, which should reduce chunks. In this case, the backlog will be in the PAR processing instead of in the chunk processing.
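
Conceptually it's just a producer/consumer backlog. This toy Python sketch (the numbers are made up, only the shape of the effect matters) shows why the chunk folder keeps growing whenever the download side outruns the assembly/repair side:

```python
# Toy illustration: chunks arrive faster than they can be assembled/repaired,
# so the number of chunks waiting on disk grows minute after minute.
download_rate = 60   # chunks arriving per minute (made-up number)
process_rate = 40    # chunks assembled/repaired per minute (made-up number)

backlog = 0
for minute in range(1, 11):
    backlog += download_rate - process_rate
    print(f"after {minute:2d} min: {backlog} chunks waiting on disk")
# The backlog only shrinks once downloading pauses or processing catches up.
```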

No, there is no disk bottleneck.


There's always a disk bottleneck. Sustained copying isn't the same as the back and forth needed to download, repair and unrar. Try copying two large files at the same time to the same drive and you'll see performance fall on its face.
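
If you want to see it in numbers, a quick sketch like this (the file paths are just placeholders, point them at real multi-GB files) copies two large files to the same drive in parallel and prints the per-file throughput; compare that against copying each file on its own:

```python
# Copy two large files to the same target drive at the same time and measure
# the per-file throughput. Paths are placeholders for this example.
import os
import shutil
import threading
import time

def timed_copy(src, dst, results):
    start = time.time()
    shutil.copyfile(src, dst)
    results.append(os.path.getsize(src) / (time.time() - start) / 1e6)  # MB/s

results = []
threads = [
    threading.Thread(target=timed_copy, args=(r"D:\big1.bin", r"H:\copy1.bin", results)),
    threading.Thread(target=timed_copy, args=(r"D:\big2.bin", r"H:\copy2.bin", results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("per-file throughput (MB/s):", [round(r, 1) for r in results])
```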

Yes, I have AV software running and I will test the system without it sometime later, but since only NBP has this problem, and no other software with major disk transfers does, I really doubt it is the cause.


Depending on what you're running, this is in fact normally the primary cause of slow downloads. I can't count the number of people who denied it but then tried my suggestion and found out it was true.

Re: NBP was fast at the beginning, but now it has gotten really slow.

Postby RankoSaotome » Fri Aug 25, 2017 12:16 am

Quade wrote:I'd try the current beta then. It works differently and you should have fewer chunks.

First I reduced the cache to 400.
Then, 2 days ago, I downloaded and installed the available beta 6.80b7 build 4965.
Up to now it works better.
From what I have seen since then, the chunks never went over 100 GB.
So it really looks like NBP assembles them in a better way now.

Quade wrote:There is no DB tracking chunks. The "DB" is the folder itself.

But I guess there is a DB to track already downloaded files (CRC-check?)

Quade wrote:In your case I might tell Newsbin to stop downloading while unraring and repairing. It's in the performance options.

I don't need to tell NBP to stop downloading while unraring, because (as I posted) I do not let NBP unrar.
NBP PAR enabled, NBP UnRAR deactivated.
In the performance options I have unchecked:
Pause download during UnRAR/Repair
Fail chunks with bad YENC formats
In the performance options I have checked:
x Reduce the amount of CPU that PAR repair and UnRAR can use
x Limit repair so it leaves CPU available
x Pre-allocate files

Quade wrote:This is relatively common. B8 changes how files are assembled which should reduce chunks. In this case, the backlog will be in the PAR processing instead of in the chunk processing.

When I downloaded, I got B7. I will look for B8 maybe tomorrow; today I do not have the time for it.

Quade wrote:There's always a disk bottleneck. Sustained copying isn't the same as the back and forth needed to download, repair and unrar. Try copying two large files at the same time to the same drive and you'll see performance fall on its face.

Yes, I understand that there is a difference between copying and repair/unrar.
That is also a reason why I do not use NBP to unrar. And the SSD with the chunks is not the same SSD as the one with the download folder.
Like I said, I use several SSDs and HDDs.
Yes, copying 2 large files at the same time reduces the speed of each individual file.
But I would guess that each file transferring at over 70 MByte/s should still be far more than what is needed.
And then there are dozens of GB of DDR3 RAM available for caching.
Since I experienced the download speed problem I switched the download folder back to an SSD.
But as I wrote above, the beta installation was the first big step in getting performance back to what I had at the beginning with NBP.

AV-Software:
Quade wrote:Depending on what you're running, this is in fact normally the primary cause of slow downloads.

Like I said, I will try this too in the near future, but even if it may be partly responsible, it can't be the sole reason, or the better performance of the beta would not show.


Thank you for your patience!
Please excuse my typos; I am not a native English speaker.
= = = =
System: AMD FX8350 (8-core @ 4 GHz), 32 GB DDR3, some SSDs and some HDDs, Win7 64-bit

Re: NBP was fast at the beginning, but now it has gotten really slow.

Postby Quade » Fri Aug 25, 2017 9:19 am

I don't need to tell NBP to stop downloading while unraring, because (as I posted) I do not let NBP unrar.
NBP PAR enabled, NBP UnRAR deactivated.


Repair and unrar are equally intensive. If it's repairing files, the chunks will back up, or at least they would before; now file assembly happens independently of the repair process. I expect this might impact maximum download speed.

But I guess there is a DB to track already downloaded files (CRC-check?)


The CRC calculation happens during assembly so it's essentially free. Same for the 16K hash used by autopar.
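
Roughly the idea (a simplified sketch of the principle, not Newsbin's actual code): the checksum is updated on the same data that is being written out during assembly, so no extra read pass over the finished file is needed:

```python
# Simplified sketch: update the CRC on data that is written out anyway,
# so the checksum costs no extra pass over the assembled file.
import zlib

def assemble_with_crc(chunk_paths, out_path):
    crc = 0
    with open(out_path, "wb") as out:
        for chunk in chunk_paths:
            with open(chunk, "rb") as part:
                data = part.read()
            out.write(data)
            crc = zlib.crc32(data, crc)  # incremental CRC during assembly
    return crc & 0xFFFFFFFF
```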

Like I said, I will try this too in the near future, but even if it may be partly responsible, it can't be the sole reason, or the better performance of the beta would not show.


As I said, many people say this and are wrong. If you want maximum sustained speed, you don't want anything getting between Newsbin and the network or between Newsbin and the disks. AV software specifically intercepts and gets in the way of both network and disk IO; it's how they work. Some even do MITM attacks on SSL. The only AV package that doesn't seem to hurt performance is the built-in Microsoft AV stuff.

