Newsbin v3 [nzbget, nzbgetweb, nzbgetter & unpak]

Today I am releasing a new, complete package for the Conceptronic CH3SNAS & CH3MNAS (two different binaries).

Before you use my package, make sure you have

fun_plug, PHP5 with cURL support, unrar, and the lighttpd module installed.
You can download the newsbin package here.

It contains:

  • nzbget version 0.7.0-testing-r357
  • nzbgetweb version 1.4 (testing-5)
  • nzbgetter version 1.0
  • modified script

In short, how to install:

Put the correct file into the root of the hard disk of the NAS; this is /mnt/HD_a2.

Untar the tgz file with tar -xvzf newsbinch3mnas-v3.tgz or tar -xvzf newsbinch3snas-v3.tgz; a folder named newsbin will be created.
Then there are a few files you have to copy manually into the correct folders.
Go into the newsbin folder with 'cd newsbin',
then copy the two shell script files (extension .sh) to the startup folder of fun_plug:

'cp *.sh /ffp/start'
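The install steps above can be sketched as a small, self-contained shell session. A scratch directory and the two script names (nzbget.sh, lighttpd.sh) are assumptions for illustration only; on the NAS the real paths are /mnt/HD_a2 and /ffp/start, and the script names are whatever ships in the package.

```shell
# Sketch of the install steps, run against a scratch directory instead of the
# real NAS paths (/mnt/HD_a2 and /ffp/start). The two script names inside the
# fake package are assumptions for illustration.
ROOT=$(mktemp -d)            # stands in for /mnt/HD_a2
FFP_START="$ROOT/ffp/start"  # stands in for /ffp/start
mkdir -p "$FFP_START"

# Build a fake package so the sketch is self-contained.
mkdir -p "$ROOT/pkg/newsbin"
printf '#!/bin/sh\necho starting nzbget\n'   > "$ROOT/pkg/newsbin/nzbget.sh"
printf '#!/bin/sh\necho starting lighttpd\n' > "$ROOT/pkg/newsbin/lighttpd.sh"
tar -czf "$ROOT/newsbinch3snas-v3.tgz" -C "$ROOT/pkg" newsbin

# The actual steps: untar in the disk root, cd into newsbin, copy the .sh files.
cd "$ROOT"
tar -xvzf newsbinch3snas-v3.tgz
cd newsbin
cp *.sh "$FFP_START"
ls "$FFP_START"
```

After this, the scripts sit in the fun_plug startup folder and run automatically on the next boot.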

On the next reboot, nzbget will be started automatically.
Optionally, you can also copy the cron file from /conf/ into /ffp/start to have nzbgetter auto-download your nzb files. If you already installed it, do not copy it again, because your existing crontab file would be overwritten.

If you do not reboot, you have to start the two shell scripts manually:

cd /ffp/start
sh <name-of-nzbget-script>.sh
sh <name-of-lighttpd-script>.sh

The first script starts nzbget for downloading from Usenet; the second runs lighttpd with the correct configuration.

Upload your nzb files manually into /mnt/HD_a2/newsbin/nzb, or use nzbgetter.

To access the nzbgetweb interface: http://[ip-of-nas]:8000
To access the nzbgetter web interface: http://[ip-of-nas]:8000/nzbgetter, or access it through the nzbgetweb interface.

Please make sure you configure nzbget with your Usenet account and password.
The script will make sure that downloaded files are put into the correct folder after the download.
This short how-to is far from perfect; treat it as a quick reference next to the normal manuals of the various tools.

If you want to upgrade:

copy the binary from the /bin/ folder
copy the /web/ folder over your /web folder
copy the /nzbgetter folder into a newly created /web folder
optional: copy the cron file from the /conf/ folder to /ffp/start


51 thoughts on “Newsbin v3 [nzbget, nzbgetweb, nzbgetter & unpak]”

  1. Erik says:

    “to access nzbgetter web interface: http://[ip-of-nas]:8000/nzbgetter or access it through the nzbgetweb interface”

    Thanks for the update but how do I do the last one “access it through the nzbgetweb interface” ?

    I did an update and the rest works.

    I have NZBGet Web Interface v 1.4 (testing-5)
    and NZBGet version 0.7.0-testing-r357

    but I don't see a link to nzbgetter


    in there is a # in front of /bin/crontab -l > $CRONTXT
    and for an update you also need to copy

  2. My big mistake .. damn .. I forgot to include the usermenu.php in the nzbgetweb interface. I will update my package.

    update: package updated, and for some users I created a separate usermenu.php to put into the nzbgetweb folder.

  3. AJ says:

    edit: Ah, I did find a problem. But maybe it's just me.
    Every time I change a POSTPROCESSING-SCRIPT OPTION it won't unrar…
    the nzb file gets a .queued,
    and it doesn't matter what I change.
    For example, unpak_recent_age from 6 to 1.
    If I change the option back, it still doesn't work; I have to use unpak.cfg.example to get it to unrar again.

    When I edit with Notepad++, it does unrar.

  4. My script does not use any of the post-processing options within the nzbgetweb configuration.

    As the script I use is very specific to the CH3SNAS / CH3MNAS, using the post-processing options might cause an issue,
    because I never tried the unpak function within the nzbgetweb part.

  5. Vincent says:

    First of all: thanks for the tools & guide.

    I have a small problem, however. When I try the “Fetch Newsbin Report” function I get a blank page with “Fatal error: Call to undefined function gzinflate() in /mnt/HD_a2/newsbin/web/functions.php on line 683”.

  6. dude says:

    Still using nzbget 0.6.0 with no probs, maybe I’ll upgrade, thanks for the tut.
    nzbgetter, no clue what it does, I’ll Google first before I ask what it adds to nzbget. :-)

    Edit: NZBGetter is a PHP Script for linux based systems to spider NZB index sites for NZB files matching your predefined search patterns. The script downloads matching NZB files and passes them to your Usenet Reader.

  7. Vincent

    you need to edit your php.ini

    ; # for newzbin (I believe):
    and then also change
    extension_dir = "/mnt/HD_a2/ffp/lib/"

    to where it resides (for ffp a very long path); I will check tonight

  8. Vincent says:

    Thanks Dennis,

    I set my extension_dir to "/ffp/lib/php/extensions/no-debug-non-zts-20060613" and it works!

  9. dude says:

    Great fun, this nzbgetter. I think it is more useful for leeching your favorite TV show than for other stuff on Usenet. (It doesn't show you results to choose from, but puts every match directly into your nzb folder.)

    Dennis, did you try to get “PHP cURL” working on the CH3SNAS, or is it not needed?
    (wget seems to work fine: /ffp/bin/wget)

  10. PHP cURL works out of the box if you download

    PHP version 5.2.9 from the inreto repository (Fonz's fun_plug website).
    5.2.9 has the cURL module and yes, that one works fine ;-)

  11. Nice release Dennis!

    Nzbgetter is a very good script for a 1.0 release!

    This is funny…, but who is the girl that sometimes appears in the banner title of this blog, and also on your Twitter?

    I am in love :)

  12. Dennis, one question

    In recent months I have searched for an invite for Newzbin, but no luck. Don't you have a spare invite for me?

  13. WeirJack says:

    First of all, many thanks for your efforts and updates.
    When I was using your previous version I found out, to my surprise, that a TV series with an nzb like xxx.tvhd.s05e03 nicely gets unrarred into a matching series/episode folder. But that doesn't seem to work by default with this version anymore.
    After some searching I found out (at least I think) that it has to do with the unpak script and the related unpak.cfg. Now when I put an nzb in a “Series” folder, meaning it should be unpacked into a Series/TV/series5/episode2/ folder, I get the error: Post-Process: /mnt/HD_a2/newsbin/conf/unpak.sh: /mnt/HD_a2/newsbin/conf/unpak.cfg: line 18: : not found.
    I tried everything, but it doesn't work; files do not get unrarred and the error stays :(
    Do you know what the problem could be, or at least whether I have to look at line 18 of nzbget.conf or of unpak.cfg?
    With kind regards

  14. To be honest, I have no clue whatsoever about that unpak script.

    First of all, there are various ways to do the categorisation: automatic and manual.
    When defined manually, it is done in the settings.php configuration within nzbgetweb.

    When done from the newsgroups, it is defined in the unpak.cfg of the script.

    I have zero shell-scripting skills, so I have no clue where to start checking.

    I am still using another 'old' script; maybe I could copy and use that one, but I am thinking more of having someone write a new script with all the functions in it .. but then I need help from someone able to do it.

    In the script there is a category tv-series, and when I select this, the movie is unpacked.

  15. WeirJack says:


    Thanks for your fast and honest reply.
    I will download your previous nzbgetweb and use the working scripts from that install. Of course I deleted the old install after a successful installation of the latest version :(

  16. ghai says:

    Hello, I have nzbgetter 1.0 installed, but I saw there is a newer nzbgetter 1.1. How do I install that over the other one, and with which commands?

  17. nzbgetter has one file you need to back up

    1. Backup nzbg_download.xml

    This file can be found in the /nzbgetter/conf/ directory.
    Copy it to save location. It contains your download list.

    2. Remove or rename old nzbgetter directory

    then copy the new nzbgetter over the old one
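The upgrade steps above can be sketched roughly as follows. The /nzbgetter location is an assumption (adjust to your install), and a scratch directory stands in for it so the sketch is self-contained.

```shell
# Sketch of the nzbgetter upgrade: back up the download list, move the old
# directory aside, unpack the new version, restore the backup. The paths are
# stand-ins; on the NAS this would live under /mnt/HD_a2/newsbin (an assumption).
BASE=$(mktemp -d)
mkdir -p "$BASE/nzbgetter/conf"
echo '<downloads/>' > "$BASE/nzbgetter/conf/nzbg_download.xml"

# 1. Back up nzbg_download.xml (it contains your download list).
cp "$BASE/nzbgetter/conf/nzbg_download.xml" "$BASE/nzbg_download.xml.bak"

# 2. Rename the old nzbgetter directory out of the way.
mv "$BASE/nzbgetter" "$BASE/nzbgetter.old"

# 3. Unpack the new nzbgetter (simulated here), then restore the backup.
mkdir -p "$BASE/nzbgetter/conf"
cp "$BASE/nzbg_download.xml.bak" "$BASE/nzbgetter/conf/nzbg_download.xml"
```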

  18. WeirJack says:


    Just one more question, since I see you are also playing with the new CH3MNAS: any luck yet getting usb-storage.ko working to mount the USB drive? All the usb-storage.ko modules I can find are compiled for the CH3SNAS or DNS-323, and these do not seem to work on the new MNAS versions.

    P.S. I already found out that as soon as I change one single line in unpak.cfg I get the error :(
    So the error has to do with the unpak.cfg file and not with nzbget.conf.

  19. I can't get rid of the message: “Error: Could not upload file! Error code: 7.”
    I completely reinstalled fun_plug + newsbin etc.
    I changed php.ini, and test.php gives: post_max_size 20M, upload_max_filesize 256M 256M, upload_tmp_dir /mnt/HD_a2/tmp /mnt/HD_a2/tmp.
    I changed httpd.conf from "bin-path" => "/ffp/bin/php-cgi" to "/ffp/bin/php-cgi -c /ffp/etc/php.ini";
    I changed the web interface config: UploadMaxFileSize 21446430 bytes.
    I restarted the NAS and the

    Still I can't upload large nzb files (like 2.9 MB). I can upload 2.1 MB nzb files, but nothing larger. How can I solve this?

  20. Eric says:

    Maybe a strange question, but can this setup (newsbin) cause my hard disk not to spin down anymore?

    I have my NZB upload and check folder on a USB drive, but I also use the script as supplied with the package. Is this what keeps the drive running? Any way to avoid this?

    Besides this, everything works and looks great. I did a test run with some TV series and presto, there they are. Thanks.

  21. Mitch
    check in php.ini

    file_uploads = On

    ; Temporary directory for HTTP uploaded files (will use system default if not
    ; specified).
    upload_tmp_dir = /mnt/HD_a2/tmp

    Also, the upload_tmp_dir must be CORRECT.

    Otherwise, check the logfiles of nzbgetweb / lighttpd to see which error you get.
    And of course do not forget to restart lighttpd after applying new settings to lighttpd/php.

  22. @Eric
    Yes; I do not care about the disks.
    If you want the disks to spin down, you have to alter the script so that EVERYTHING RUNS from the USB DISK, and even then I do not know whether the HDDs will spin down or not.

    The Usenet setup has been running here for over a year without issues.

  23. file_uploads = On, upload_tmp_dir = /mnt/HD_a2/tmp (but this folder doesn't exist).
    The only logfiles I can find are in /mnt/HD_a2/newsbin/logs, and they do not show anything about the nzb upload (only the 2.1 MB nzb file, which uploads successfully).

    lighttpd restart gives:
    root@nas:/mnt/HD_a2/ffp/start# sh restart
    Stopping lighttpd
    /ffp/etc/lighttpd.conf: Required file not found or not readable

    but, restart gives:
    root@nas:/mnt/HD_a2/ffp/start# sh restart
    Stopping lighttpd
    Starting /ffp/sbin/lighttpd -f /mnt/HD_a2/newsbin/conf/lighttpd.conf

    Which seems fine to me.
    The only problem is that I can't upload big nzb files; small nzb files (around 2 MB) upload with no problem.

  24. Vink says:


    First of all, thanks for all the stuff you share. It works well.

    Everything that gets unpacked also goes into the New folder… so if I download Smallville, it goes both into the TV folder and into the New folder. How do I get rid of this?

  25. @dennis,

    The problem is indeed solved now. Strange, though, that it did not create the folder itself or give a proper error. And odd that smaller files did work.

    In any case, thanks a lot!

  26. @mitch
    The folder not being created is because that is not in the script but in the how-to .. (I believe) ;-)

    Small files go through memory; when that is not possible, it has to swap to a tmp folder .. .. hence.

  27. Ludger says:

    Hello Dennis,

    I get error code 7; in other words, it does not accept large nzb files. Is that fixed with the update, and if so, I don't understand the update: which files do I have to copy where?
    Or should I just overwrite everything?
    I'd like to hear from you.

  28. Vink says:


    First of all, thanks for all the stuff you share. It works well.

    Everything that gets unpacked also goes into the New folder… so if I download Smallville, it goes both into the TV folder and into the New folder. How do I get rid of this?

    Or how do I get everything without categories into the New folder? I still haven't found it :S


    The issue is in the , I just don't know how to get it out.

  29. Ludger says:

    Hello Dennis,
    You must be busy, but can you help me with the too-large nzb files?
    I followed your previous instructions completely and everything works great. Now I have run into a few large nzbs and I am stuck. I adjusted php.ini, but it doesn't help; I tried all the tips above and nothing works.
    Can you help me with this?

  30. Ludger,

    have a look at what Mitch and I figured out ..
    he had the same problem:
    no proper temp folder had been created, so a large nzb file could not be uploaded via nzbget.
    Tip: it does work if you just put the nzb file into the nzb folder manually.
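A minimal sketch of that fix, assuming the missing folder is the upload_tmp_dir from php.ini (/mnt/HD_a2/tmp in this thread); a scratch path stands in for the real one so the sketch is self-contained.

```shell
# Make sure the upload_tmp_dir from php.ini actually exists and is writable.
# On the NAS this would be /mnt/HD_a2/tmp; here a scratch path stands in.
UPLOAD_TMP="$(mktemp -d)/upload-tmp"   # stand-in for /mnt/HD_a2/tmp
mkdir -p "$UPLOAD_TMP"
chmod 1777 "$UPLOAD_TMP"               # world-writable with sticky bit, like /tmp
```

After creating the folder, restart lighttpd so the php-cgi processes pick it up.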

  31. Ludger says:

    I will go through it once more, but I am already happy with option 2; I didn't know about that one.
    Thanks for all the stuff you share.

  32. Gene says:

    Hi Dennis!

    Quick question – I have a DNS-323 running NZBGet version 0.7.0-testing-r317 and NZBGet Web Interface v 1.4 (testing-2) – so far it has been running great.

    I would like to upgrade and use NZBGetter but was wondering which specific binary should I use and what would be the easiest way to upgrade…

    Thanks for the help in advance!

  33. The CH3SNAS binary is suitable for the DNS-323.

    Updating depends on what you have been using before.

    To be honest, there is no real update path available; a plain reconfiguration is the best way to go.

  34. imdos says:

    Hi Dennis,

    I have changed the upload size like this:
    ; Maximum allowed size for uploaded files.
    upload_max_filesize = 90M

    But if I upload a file larger than 10 MB I get the following:
    The file Star.Wars.Episode.I.The.Phantom.Menace.Bluray.Edition.1080p.nzb is bigger than 10000000 bytes!

    I already did a grep through the php scripts for the string but could not find it. Is this a known limitation of nzbget?

  35. Check
    post_max_size = 128M
    and check that your
    upload_tmp_dir = /mnt/HD_a2/tmp actually exists.
    DO NOT revert to /tmp, because then it can consume your memory!!
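A quick way to review the settings that matter for large nzb uploads is a grep over php.ini. A fake php.ini is written first so the sketch is self-contained; on the NAS you would grep the real file (e.g. /ffp/etc/php.ini, an assumption).

```shell
# List the three php.ini settings that matter for large nzb uploads.
PHP_INI="$(mktemp -d)/php.ini"
cat > "$PHP_INI" <<'EOF'
post_max_size = 128M
upload_max_filesize = 90M
upload_tmp_dir = /mnt/HD_a2/tmp
EOF

# Show the effective lines at a glance.
grep -E '^(post_max_size|upload_max_filesize|upload_tmp_dir)' "$PHP_INI"
```

Remember that post_max_size must be at least as large as upload_max_filesize, and that upload_tmp_dir must exist and be writable.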

  36. imdos says:

    I have post_max_size around 80M, if I remember correctly.

    The upload_tmp_dir = /mnt/HD_a2/usbstorage/newsbin/tmp, if I remember correctly, which is a 1 GB USB stick with roughly 500 MB of available space. I changed a few lines in your configuration to let the HDDs spin down when there is nothing to download.

    (free disk space HD2: 129.7 GB
    free disk space USB: 503.42 MB)

    Temporary files downloaded are put onto /mnt/HD_b4/newsbin

    So that should not pose a problem.

    Could there be something else we are not seeing at the moment?

  37. I see /mnt/HD_b4/newsbin, which is different from my installation, so ..
    I have no clue

    I wrote somewhere on this site:

    if you want to upload nzb

  38. imdos says:

    Dennis, thanks for your quick reply.

    I inserted the line “Temporary files downloaded are put onto /mnt/HD_b4/newsbin” to let you know that the downloading of nntp files, which need to be joined afterwards, takes place on a separate partition. So this should not affect the upload_tmp_dir = /mnt/HD_a2/usbstorage/newsbin/tmp, which is on my USB stick.

    I have restarted lighttpd multiple times, but the problem still arises.

    I will modify my settings tonight to 256M and check if that resolves the problem and will do a diff on the file you supplied within your package.

  39. imdos says:

    Still having trouble with uploads bigger than 10 MB.

    server.upload-dirs = ( “/mnt/HD_a2/usbstorage/tmp” )

    post_max_size = 120M
    ; Maximum allowed size for uploaded files.
    upload_max_filesize = 228M
    upload_tmp_dir = /mnt/HD_a2/usbstorage/tmp

    I also did some modification of the time parameters!

    So I have no clue as to what I should be changing now :S

  40. imdos says:

    Hi Dennis,

    Thanks indeed, problem solved: there was another reference.

    I forgot to do a grep -i; it contained both upper- and lowercase, I presume.
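The case-insensitive search imdos describes could look like this. The directory and file contents are fabricated stand-ins so the sketch is self-contained; on the NAS you would point grep at the nzbgetweb folder (e.g. /mnt/HD_a2/newsbin/web, an assumption).

```shell
# Recursive, case-insensitive search for the hard-coded upload limit.
WEB=$(mktemp -d)   # stand-in for the nzbgetweb folder
printf 'if ($size > 10000000) die("Too big, Bytes!");\n' > "$WEB/index.php"

# -r: recurse, -i: ignore case, -n: show line numbers
grep -rin 'bytes' "$WEB"
```

A plain grep without -i would miss references that use a different case, which is exactly the problem described above.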

  41. jeroen says:

    Hmmz, can't get it to work; I can't log in using the IP of the NAS and port 8000. Chrome says the link is not working.

    Please help……..


