@unnumbered Integration with existing software
Here are some examples of how to solve popular tasks with NNCP,
making them store-and-forward friendly.
* BitTorrent and huge files: BitTorrent.
* Downloading service: DownloadService.
* Multimedia streaming: Multimedia.
@section Integration with Postfix
This section is adapted from the @url{http://www.postfix.org/UUCP_README.html,
Postfix and UUCP} manual, with UUCP-related calls replaced by their NNCP
counterparts.
@strong{Setting up a Postfix Internet to NNCP gateway}
Here is how to set up a machine that sits on the Internet and that forwards
mail to a LAN that is connected via NNCP.
@item You need an @ref{nncp-exec} program that extracts the sender
address from mail that arrives via NNCP, and that feeds the mail into
the Postfix @command{sendmail} command.
@item Define a @command{pipe(8)} based mail delivery transport for
mail delivery via NNCP:

/usr/local/etc/postfix/master.cf:
nncp      unix  -       n       n       -       -       pipe
  flags=F user=nncp argv=nncp-exec -quiet $nexthop sendmail $recipient
This runs the @command{nncp-exec} command to place outgoing mail into
the NNCP queue, after replacing @var{$nexthop} with the receiving NNCP
node and @var{$recipient} with the recipients. The
@command{pipe(8)} delivery agent executes the @command{nncp-exec}
command without assistance from the shell, so there are no problems with
shell meta characters in command-line parameters.
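On the NNCP side, the @command{sendmail} handle used above must be
defined as an exec handle on the node that receives the mail. A minimal
configuration sketch (the node name and the @command{sendmail} path are
assumptions, not taken from this manual):

```hjson
neigh: {
  lan-mailserver: {
    # ...id, exchpub and the node's other keys go here...
    exec: {
      # mail queued as "sendmail" exec packets is piped to this command
      sendmail: ["/usr/sbin/sendmail"]
    }
  }
}
```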
@item Specify that mail for @emph{example.com} should be delivered via
NNCP, to a host named @emph{nncp-host}:
/usr/local/etc/postfix/transport:
example.com      nncp:nncp-host
.example.com     nncp:nncp-host
See the @command{transport(5)} manual page for more details.
@item Execute the command @command{postmap /usr/local/etc/postfix/transport}
whenever you change the @file{transport} file.
@item Enable @file{transport} table lookups:
/usr/local/etc/postfix/main.cf:
transport_maps = hash:$config_directory/transport
@item Add @emph{example.com} to the list of domains that your site is
willing to relay mail for.
/usr/local/etc/postfix/main.cf:
relay_domains = example.com ...other relay domains...
See the @option{relay_domains} configuration parameter description for
more details.
@item Execute the command @command{postfix reload} to make the changes
effective.
@strong{Setting up a Postfix LAN to NNCP gateway}
Here is how to relay mail from a LAN via NNCP to the Internet.
@item You need an @ref{nncp-exec} program that extracts the sender
address from mail that arrives via NNCP, and that feeds the mail into
the Postfix @command{sendmail} command.
@item Specify that all remote mail must be sent via the @command{nncp}
mail transport to your NNCP gateway host, say, @emph{nncp-gateway}:
/usr/local/etc/postfix/main.cf:
relayhost = nncp-gateway
default_transport = nncp
Postfix 2.0 and later also support the following more succinct form:
/usr/local/etc/postfix/main.cf:
default_transport = nncp:nncp-gateway
@item Define a @command{pipe(8)} based message delivery transport for
mail delivery via NNCP:
/usr/local/etc/postfix/master.cf:
nncp      unix  -       n       n       -       -       pipe
  flags=F user=nncp argv=nncp-exec -quiet $nexthop sendmail $recipient
This runs the @command{nncp-exec} command to place outgoing mail into
the NNCP queue. It substitutes the hostname (@emph{nncp-gateway}, or
whatever you specified) and the recipients before execution of the
command. The @command{nncp-exec} command is executed without assistance
from the shell, so there are no problems with shell meta characters.
@item Execute the command @command{postfix reload} to make the changes
effective.
@section Integration with Web feeds
RSS and Atom feeds can be collected using the
@url{https://github.com/wking/rss2email, rss2email} program. It
converts all incoming feed entries to email messages. Read about how to
integrate @ref{Postfix} with email. @command{rss2email} can be run
from cron to collect feeds without any user interaction. This program
also supports ETags and won't pollute the channel if the remote server
has no updates.
After installing @command{rss2email}, create a configuration file:
$ r2e new rss-robot@address.com
and add the feeds you want to retrieve:
$ r2e add https://git.cypherpunks.ru/cgit.cgi/nncp.git/atom/?h=master
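To collect feeds unattended, run @command{rss2email} from cron; for
example an hourly crontab entry (the schedule is only an illustration):

```
# fetch all configured feeds once an hour
0 * * * *  r2e run
```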
@section Integration with Web pages
A simple HTML web page can be downloaded very easily for sending and
viewing it offline afterwards:
$ wget http://www.example.com/page.html
But most web pages contain links to images, CSS and JavaScript files
required for complete rendering.
@url{https://www.gnu.org/software/wget/, GNU Wget} supports parsing
those documents and understanding page dependencies. You can download
the whole page with its dependencies the following way:
    --restrict-file-names=ascii \
    --execute robots=off \
    http://www.example.com/page.html
that will create a @file{www.example.com} directory with all the files
necessary to view the @file{page.html} web page. You can create a single
compressed tarball of that directory and send it to the remote node:
$ tar cf - www.example.com | zstd |
    nncp-file - remote.node:www.example.com-page.tar.zst
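The whole round trip can be sketched as follows; on the receiving side
only the last pipeline matters (the toy page content is just an
illustration):

```shell
# pack: archive the page directory and compress it with zstd
mkdir -p www.example.com
echo '<html>example</html>' > www.example.com/page.html
tar cf - www.example.com | zstd > www.example.com-page.tar.zst

# unpack on the receiving side: decompress and extract
rm -r www.example.com
zstd -d < www.example.com-page.tar.zst | tar xf -
```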
But there are multi-page articles, and there are whole interesting
sites you may want to get in a single package. You can mirror the whole
web site by utilizing @command{wget}'s recursive feature:
    --no-remove-listing \
    http://www.example.com/
There is a standard for creating
@url{https://en.wikipedia.org/wiki/Web_ARChive, Web ARChives}:
@strong{WARC}. Fortunately again, @command{wget} supports it as an
output format.
    --warc-file www.example_com-$(date '+%Y%m%d%H%M%S') \
    --no-warc-compression \
    http://www.example.com/
That command will create an uncompressed @file{www.example_com-XXX.warc}
web archive. By default, WARCs are compressed using
@url{https://en.wikipedia.org/wiki/Gzip, gzip}, but in the example above
we have disabled it in order to compress with the stronger and faster
@url{https://en.wikipedia.org/wiki/Zstd, zstd} before sending via NNCP.
There is plenty of software acting as an HTTP proxy for your browser,
allowing you to view those WARC files. You can also extract files from
the archive using the @url{https://pypi.python.org/pypi/Warcat, warcat}
utility, producing the usual directory hierarchy:
$ python3 -m warcat extract \
    www.example_com-XXX.warc \
    --output-dir www.example.com-XXX \
@section BitTorrent and huge files
While dealing with @ref{Git}, @ref{Feeds, web feeds} and @ref{Multimedia,
multimedia} goes relatively fast, BitTorrent and huge files consume
much more time. You cannot wait for downloads to finish, but want to
queue them for sending once they are done.
The @url{http://aria2.github.io/, aria2} multi-protocol download utility
can be used to solve that issue conveniently. It supports the HTTP,
HTTPS, FTP, SFTP and BitTorrent protocols, together with the
@url{http://tools.ietf.org/html/rfc5854, Metalink} format. BitTorrent
support is fully featured: UDP trackers, DHT, PEX, encryption, magnet
URIs, Web seeding, selective downloads, LPD. @command{aria2} can
accelerate HTTP/FTP downloads with segmented multiple parallel
connections.
You can queue your files after they are completely downloaded.
@file{aria2-downloaded.sh} contents:
@verbatiminclude aria2-downloaded.sh
Then create an
@url{http://aria2.github.io/manual/en/html/aria2c.html#files, input file}
with the jobs you want to download:
http://www.nncpgo.org/download/nncp-0.11.tar.xz
    out=nncp.txz
http://www.nncpgo.org/download/nncp-0.11.tar.xz.sig
    out=nncp.txz.sig
    --on-download-complete aria2-downloaded.sh \
and all the downloaded files (@file{nncp.txz}, @file{nncp.txz.sig})
will be sent to @file{remote.node} when finished.
@node DownloadService
@section Downloading service
Previous sections tell about manual downloading and sending results to
the remote node. But one may wish to initiate downloading remotely. That
can be easily solved with @ref{CfgExec, exec} handles.
warcer: ["/bin/sh", "/path/to/warcer.sh"]
wgeter: ["/bin/sh", "/path/to/wgeter.sh"]
aria2c: [
    "/usr/local/bin/aria2c",
    "--on-download-complete", "aria2-downloaded.sh",
    "--on-bt-download-complete", "aria2-downloaded.sh"
]
@file{warcer.sh} contents:
@verbatiminclude warcer.sh

@file{wgeter.sh} contents:
@verbatiminclude wgeter.sh
Now you can queue that node to send you some website's page, file or
BitTorrent downloads:
$ echo http://www.nncpgo.org/Postfix.html |
    nncp-exec remote.node warcer postfix-whole-page
$ echo http://www.nncpgo.org/Postfix.html |
    nncp-exec remote.node wgeter postfix-html-page
$ printf '%s\n' \
    http://www.nncpgo.org/download/nncp-0.11.tar.xz \
    http://www.nncpgo.org/download/nncp-0.11.tar.xz.sig |
    nncp-exec remote.node aria2c
@section Integration with Git
The @url{https://git-scm.com/, Git} version control system already has
all the necessary tools for store-and-forward networking. The
@url{https://git-scm.com/docs/git-bundle, git-bundle} command is all
that is needed.

Use it to create bundles containing all required blobs/trees/commits and tags:
$ git bundle create repo-initial.bundle master --tags --branches
$ git tag -f last-bundle
$ nncp-file repo-initial.bundle remote.node:repo-$(date '+%Y%m%d%H%M%S').bundle
Work with Git as usual: commit, add, branch, checkout, etc. When you
decide to queue your changes for sending, create an incremental bundle
and transfer it:
$ git bundle create repo-$(date '+%Y%m%d%H%M%S').bundle last-bundle..master
$ git bundle create repo-$(date '+%Y%m%d').bundle --since=10.days master
The received bundle on the remote machine acts like an ordinary remote:
$ git clone -b master repo-XXX.bundle
Later on you can overwrite the @file{repo.bundle} file with newer
bundles you retrieve and fetch all required branches and commits:
$ git pull # assuming that the origin remote points to repo.bundle
$ git fetch repo.bundle master:localRef
$ git ls-remote repo.bundle
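Before relying on a received bundle, its integrity can be checked with
@command{git bundle verify}. A self-contained sketch (the toy repository
exists only to demonstrate the commands):

```shell
# create a toy repository with a single commit
git init -q toy
cd toy
git -c user.name=toy -c user.email=toy@example.com \
    commit -q --allow-empty -m 'initial commit'
# bundle it, then verify the bundle before using it as a remote
git bundle create ../repo-XXX.bundle HEAD
git bundle verify ../repo-XXX.bundle
```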
Bundles are also useful when cloning huge repositories (like Linux's).
Git's native protocol does not support any kind of resuming of
interrupted downloads, so you will start from the beginning if the
connection is lost. Bundles, being ordinary files, can be downloaded
with native HTTP/FTP/NNCP resuming capabilities. After you fetch the
repository via the bundle, you can add an ordinary @file{git://} remote
and fetch the difference.
You may also find the following exec handler useful:
@verbatiminclude git-bundler.sh
It allows you to request bundles like this:
@code{echo some-old-commit..master | nncp-exec REMOTE bundler REPONAME}.
@section Integration with multimedia streaming
Many video and audio streams can be downloaded using the
@url{http://yt-dl.org/, youtube-dl} program.
@url{https://rg3.github.io/youtube-dl/supportedsites.html, Look} how
many of them are supported, including @emph{Dailymotion}, @emph{Vimeo}
and @emph{YouTube}.
When your multimedia becomes an ordinary file, you can transfer it easily.
$ youtube-dl \
    --exec 'nncp-file {} remote.node:' \
    'https://www.youtube.com/watch?list=PLd2Cw8x5CytxPAEBwzilrhQUHt_UN10FJ'
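As with the @ref{DownloadService, downloading service}, fetching can be
initiated remotely by exposing @command{youtube-dl} through an exec
handle. A sketch with a hypothetical @file{ytdler.sh} wrapper (the
handle name and script path are assumptions):

```hjson
exec: {
  # reads URLs on stdin, downloads them and queues results with nncp-file
  ytdler: ["/bin/sh", "/path/to/ytdler.sh"]
}
```

A remote node could then queue a download with
@code{echo URL | nncp-exec remote.node ytdler}.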