@node Integration
@unnumbered Integration with existing software

Here are some examples of how you can solve popular tasks with NNCP,
making them store-and-forward friendly.

@menu
* Postfix::
* Web feeds: Feeds.
* Web pages: WARCs.
* BitTorrent and huge files: BitTorrent.
* Downloading service: DownloadService.
* Git::
* Multimedia streaming: Multimedia.
@end menu

@node Postfix
@section Integration with Postfix

This section is taken from the
@url{http://www.postfix.org/UUCP_README.html, Postfix and UUCP} manual
and simply replaces the UUCP-related calls with NNCP ones.

@strong{Setting up a Postfix Internet to NNCP gateway}

Here is how to set up a machine that sits on the Internet and that
forwards mail to a LAN that is connected via NNCP.

@itemize

@item You need an @ref{nncp-exec} program that extracts the sender
address from mail that arrives via NNCP, and that feeds the mail into
the Postfix @command{sendmail} command.

@item Define a @command{pipe(8)} based mail delivery transport for
delivery via NNCP:

@verbatim
/usr/local/etc/postfix/master.cf:
nncp      unix  -       n       n       -       -       pipe
  flags=F user=nncp argv=nncp-exec -quiet $nexthop sendmail $recipient
@end verbatim

This runs the @command{nncp-exec} command to place outgoing mail into
the NNCP queue after replacing @var{$nexthop} with the receiving NNCP
node and @var{$recipient} with the recipients. The @command{pipe(8)}
delivery agent executes the @command{nncp-exec} command without
assistance from the shell, so there are no problems with shell meta
characters in command-line parameters.

@item Specify that mail for @emph{example.com} should be delivered via
NNCP, to a host named @emph{nncp-host}:

@verbatim
/usr/local/etc/postfix/transport:
    example.com     nncp:nncp-host
    .example.com    nncp:nncp-host
@end verbatim

See the @command{transport(5)} manual page for more details.

@item Execute the command @command{postmap
/usr/local/etc/postfix/transport} whenever you change the
@file{transport} file.

@item Enable @file{transport} table lookups:

@verbatim
/usr/local/etc/postfix/main.cf:
    transport_maps = hash:$config_directory/transport
@end verbatim

@item Add @emph{example.com} to the list of domains that your site is
willing to relay mail for:

@verbatim
/usr/local/etc/postfix/main.cf:
    relay_domains = example.com ...other relay domains...
@end verbatim

See the @option{relay_domains} configuration parameter description for
details.

@item Execute the command @command{postfix reload} to make the changes
effective.

@end itemize

@strong{Setting up a Postfix LAN to NNCP gateway}

Here is how to relay mail from a LAN via NNCP to the Internet.

@itemize

@item You need an @ref{nncp-exec} program that extracts the sender
address from mail that arrives via NNCP, and that feeds the mail into
the Postfix @command{sendmail} command.

@item Specify that all remote mail must be sent via the @command{nncp}
mail transport to your NNCP gateway host, say, @emph{nncp-gateway}:

@verbatim
/usr/local/etc/postfix/main.cf:
    relayhost = nncp-gateway
    default_transport = nncp
@end verbatim

Postfix 2.0 and later also allow the following more succinct form:

@verbatim
/usr/local/etc/postfix/main.cf:
    default_transport = nncp:nncp-gateway
@end verbatim

@item Define a @command{pipe(8)} based message delivery transport for
mail delivery via NNCP:

@verbatim
/usr/local/etc/postfix/master.cf:
nncp      unix  -       n       n       -       -       pipe
  flags=F user=nncp argv=nncp-exec -quiet $nexthop sendmail $recipient
@end verbatim

This runs the @command{nncp-exec} command to place outgoing mail into
the NNCP queue. It substitutes the hostname (@emph{nncp-gateway}, or
whatever you specified) and the recipients before execution of the
command. The @command{nncp-exec} command is executed without assistance
from the shell, so there are no problems with shell meta characters.

@item Execute the command @command{postfix reload} to make the changes
effective.

@end itemize
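Both setups assume that the receiving node defines a
@command{sendmail} @ref{CfgExec, exec} handle for the
@command{nncp-exec} calls shown above. Here is a minimal sketch: the
@command{rmail} program comes from the original UUCP setup (its path
here is an assumption, use whatever your system provides); it extracts
the envelope sender from the @code{From_} line that
@command{pipe(8)}'s @option{flags=F} prepends, and feeds the message
to Postfix's @command{sendmail}:

@verbatim
exec: {
  sendmail: ["/usr/bin/rmail"]
}
@end verbatim

Tossing such an incoming exec packet then runs @command{rmail} with
the recipients as its arguments and the mail itself on stdin.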
@node Feeds
@section Integration with Web feeds

RSS and Atom feeds can be collected using the
@url{https://github.com/wking/rss2email, rss2email} program. It
converts all incoming feed entries to email messages. See the
@ref{Postfix} section for how to integrate email with NNCP.
@command{rss2email} can be run from cron to collect feeds without any
user interaction. It also supports ETags and won't pollute the channel
if the remote server supports them too.

After installing @command{rss2email}, create a configuration file:

@verbatim
$ r2e new rss-robot@address.com
@end verbatim

add the feeds you want to retrieve:

@verbatim
$ r2e add https://git.cypherpunks.ru/cgit.cgi/nncp.git/atom/?h=master
@end verbatim

and run the process:

@verbatim
$ r2e run
@end verbatim

@node WARCs
@section Integration with Web pages

A simple HTML web page can be downloaded very easily for sending and
viewing it offline afterwards:

@verbatim
$ wget http://www.example.com/page.html
@end verbatim

But most web pages contain links to images, CSS and JavaScript files
that are required for complete rendering.
@url{https://www.gnu.org/software/wget/, GNU Wget} supports parsing
those documents and understanding page dependencies. You can download
the whole page with its dependencies the following way:

@verbatim
$ wget \
    --page-requisites \
    --convert-links \
    --adjust-extension \
    --restrict-file-names=ascii \
    --span-hosts \
    --random-wait \
    --execute robots=off \
    http://www.example.com/page.html
@end verbatim

That will create a @file{www.example.com} directory with all the files
necessary to view the @file{page.html} web page. You can create a
single-file compressed tarball with that directory and send it to the
remote node:

@verbatim
$ tar cf - www.example.com | xz -9 |
    nncp-file - remote.node:www.example.com-page.tar.xz
@end verbatim

But there are multi-page articles, and there are whole sites you may
want to get in a single package. You can mirror a whole web site by
utilizing @command{wget}'s recursive feature:

@verbatim
$ wget \
    --recursive \
    --timestamping \
    -l inf \
    --no-remove-listing \
    --no-parent \
    [...] \
    http://www.example.com/
@end verbatim

There is a standard for creating
@url{https://en.wikipedia.org/wiki/Web_ARChive, Web ARChives}:
@strong{WARC}. Fortunately, @command{wget} supports it as an output
format:

@verbatim
$ wget \
    --warc-file www.example_com-$(date '+%Y%m%d%H%M%S') \
    --no-warc-compression \
    --no-warc-keep-log \
    [...] \
    http://www.example.com/
@end verbatim

That command will create an uncompressed
@file{www.example_com-XXX.warc} web archive. By default, WARCs are
compressed using @url{https://en.wikipedia.org/wiki/Gzip, gzip}, but in
the example above we have disabled it in order to compress with the
stronger @command{xz} before sending via @command{nncp-file}.

There is plenty of software acting as an HTTP proxy for your browser,
allowing you to view WARC files. You can also extract files from the
archive using the @url{https://pypi.python.org/pypi/Warcat, warcat}
utility, producing a usual directory hierarchy:

@verbatim
$ python3 -m warcat extract \
    www.example_com-XXX.warc \
    --output-dir www.example.com-XXX \
    --progress
@end verbatim
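Putting the pieces together, the whole fetch-compress-queue sequence
could look like this (a sketch reusing the names from the examples
above):

@verbatim
$ name=www.example_com-$(date '+%Y%m%d%H%M%S')
$ wget \
    --warc-file $name \
    --no-warc-compression \
    --no-warc-keep-log \
    [...] \
    http://www.example.com/
$ xz -9 $name.warc
$ nncp-file $name.warc.xz remote.node:
@end verbatim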
@node BitTorrent
@section BitTorrent and huge files

While dealing with @ref{Git}, @ref{Feeds, web feeds} and
@ref{Multimedia, multimedia} goes relatively fast, BitTorrent and huge
files consume much time. You cannot sit waiting for a download to
finish, but you want it queued for sending afterwards.

The @url{http://aria2.github.io/, aria2} multi-protocol download
utility conveniently solves that issue. It supports the HTTP, HTTPS,
FTP, SFTP and BitTorrent protocols, together with the
@url{http://tools.ietf.org/html/rfc5854, Metalink} format. BitTorrent
support is fully featured: UDP trackers, DHT, PEX, encryption, magnet
URIs, Web-seeding, selective downloads, LPD. @command{aria2} can
accelerate HTTP/FTP downloads by using multiple parallel segmented
connections.

You can queue your files for sending after they are completely
downloaded. @file{aria2-downloaded.sh} contents:

@verbatiminclude aria2-downloaded.sh

You can also prepare an
@url{http://aria2.github.io/manual/en/html/aria2c.html#files, input
file} with the jobs you want to download:

@verbatim
$ cat jobs
http://www.nncpgo.org/download/nncp-0.11.tar.xz
    out=nncp.txz
http://www.nncpgo.org/download/nncp-0.11.tar.xz.sig
    out=nncp.txz.sig
$ aria2c \
    --on-download-complete aria2-downloaded.sh \
    --input-file jobs
@end verbatim

and all the downloaded files (@file{nncp.txz}, @file{nncp.txz.sig})
will be sent to @file{remote.node} when finished.

@node DownloadService
@section Downloading service

The previous sections tell about manual downloading and sending the
results to a remote node. But one may wish to initiate downloading
remotely. That can easily be solved with @ref{CfgExec, exec} handles:

@verbatim
exec: {
  warcer: ["/bin/sh", "/path/to/warcer.sh"]
  wgeter: ["/bin/sh", "/path/to/wgeter.sh"]
  aria2c: [
    "/usr/local/bin/aria2c",
    "--on-download-complete", "aria2-downloaded.sh",
    "--on-bt-download-complete", "aria2-downloaded.sh"
  ]
}
@end verbatim

@file{warcer.sh} contents:

@verbatiminclude warcer.sh

@file{wgeter.sh} contents:

@verbatiminclude wgeter.sh

Now you can ask that node to send you some website's page, a file, or
BitTorrent downloads:

@verbatim
$ echo http://www.nncpgo.org/Postfix.html |
    nncp-exec remote.node warcer postfix-whole-page
$ echo http://www.nncpgo.org/Postfix.html |
    nncp-exec remote.node wgeter postfix-html-page
$ echo \
    http://www.nncpgo.org/download/nncp-0.11.tar.xz \
    http://www.nncpgo.org/download/nncp-0.11.tar.xz.sig |
    nncp-exec remote.node aria2c
@end verbatim

@node Git
@section Integration with Git

The @url{https://git-scm.com/, Git} version control system already has
all the necessary tools for store-and-forward networking. The
@url{https://git-scm.com/docs/git-bundle, git-bundle} command is
everything you need. Use it to create bundles containing all the
required blobs/trees/commits and tags:

@verbatim
$ git bundle create repo-initial.bundle master --tags --branches
$ git tag -f last-bundle
$ nncp-file repo-initial.bundle remote.node:repo-$(date '+%Y%m%d%H%M%S').bundle
@end verbatim

Then work with Git as usual: commit, add, branch, checkout, etc. When
you decide to queue your changes for sending, create a bundle holding
just the difference and transfer it:

@verbatim
$ git bundle create repo-$(date '+%Y%m%d%H%M%S').bundle last-bundle..master
@end verbatim

or maybe:

@verbatim
$ git bundle create repo-$(date '+%Y%m%d').bundle --since=10.days master
@end verbatim
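As with the initial bundle, move the @file{last-bundle} tag and queue
the file for transfer. A sketch (the @file{repo.bundle} destination
name matches the pulling examples below):

@verbatim
$ bundle=repo-$(date '+%Y%m%d%H%M%S').bundle
$ git bundle create $bundle last-bundle..master
$ git tag -f last-bundle
$ nncp-file $bundle remote.node:repo.bundle
@end verbatim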
The received bundle acts like an ordinary remote on the receiving
machine:

@verbatim
$ git clone -b master repo-XXX.bundle
@end verbatim

Overwrite the @file{repo.bundle} file with the newer bundles you
retrieve, and fetch all the required branches and commits:

@verbatim
$ git pull # assuming that the origin remote points to repo.bundle
$ git fetch repo.bundle master:localRef
$ git ls-remote repo.bundle
@end verbatim

Bundles are also useful when cloning huge repositories (like the Linux
kernel's). Git's native protocol does not support any kind of
interrupted download resuming, so you will start from the beginning if
the connection is lost. Bundles, being ordinary files, can be
downloaded with native HTTP/FTP/NNCP resuming capabilities. After you
fetch the repository via the bundle, you can add an ordinary
@file{git://} remote and fetch the difference.

You may also find the following exec handler useful:

@verbatiminclude git-bundler.sh

It allows you to request bundles like this:
@code{echo some-old-commit..master | nncp-exec REMOTE bundler REPONAME}.

@node Multimedia
@section Integration with multimedia streaming

Many video and audio streams can be downloaded using the
@url{http://yt-dl.org/, youtube-dl} program.
@url{https://rg3.github.io/youtube-dl/supportedsites.html, Look} how
many of them are supported, including @emph{Dailymotion}, @emph{Vimeo}
and @emph{YouTube}.

Once your multimedia becomes an ordinary file, you can transfer it
easily:

@verbatim
$ youtube-dl \
    --exec 'nncp-file {} remote.node:' \
    'https://www.youtube.com/watch?list=PLd2Cw8x5CytxPAEBwzilrhQUHt_UN10FJ'
@end verbatim
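As with the @ref{DownloadService, downloading service}, such downloads
can also be initiated remotely through an exec handle. The following
@file{ytdler.sh} is a hypothetical sketch: the handle name is an
assumption, and it relies on the @env{NNCP_SENDER} environment
variable being set for exec handlers so the result can be queued back
to the requester:

@verbatim
#!/bin/sh -ex
# Hypothetical ytdler.sh: read an URL from stdin, download it with
# youtube-dl and queue the resulting file back to the requesting
# node (assuming the NNCP_SENDER environment variable is set).
read url
cd /tmp
youtube-dl --exec "nncp-file {} $NNCP_SENDER:" "$url"
@end verbatim

Register it next to the other handles:

@verbatim
exec: {
  ytdler: ["/bin/sh", "/path/to/ytdler.sh"]
}
@end verbatim

and request a download:

@verbatim
$ echo 'https://www.youtube.com/watch?list=PLd2Cw8x5CytxPAEBwzilrhQUHt_UN10FJ' |
    nncp-exec remote.node ytdler
@end verbatim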