X-Git-Url: http://www.git.cypherpunks.ru/?p=nncp.git;a=blobdiff_plain;f=doc%2Fchunked.texi;h=b73eb9223937f9b5f6caac7e85ba3cfecb074f72;hp=5f78c2311bda9d716028d4de1b0de13d7422729f;hb=203dfe36da7adf2b3089e4fa4017a67409cbad70;hpb=0fad171c0d79ad583c0faf5427e22d1d62a0a52d

diff --git a/doc/chunked.texi b/doc/chunked.texi
index 5f78c23..b73eb92 100644
--- a/doc/chunked.texi
+++ b/doc/chunked.texi
@@ -1,4 +1,5 @@
 @node Chunked
+@cindex chunked
 @unnumbered Chunked files
 
 There is ability to transfer huge files with dividing them into smaller
@@ -13,8 +14,10 @@ on the destination node.
 
 Splitting is done with @ref{nncp-file, nncp-file
 -chunked} command and reassembling with @ref{nncp-reass} command.
 
+@vindex .nncp.meta
+@vindex .nncp.chunk
 Chunked @file{FILE} produces @file{FILE.nncp.meta},
-@file{FILE.nncp.chunk0}, @file{FILE.nncp.chunk1}, ... files. All
+@file{FILE.nncp.chunk0}, @file{FILE.nncp.chunk1}, @dots{} files. All
 @file{.nncp.chunkXXX} can be concatenated together to produce original
 @file{FILE}.
@@ -44,6 +47,7 @@ size and their hash checksums. This is @ref{MTH} checksum of each chunk
 
 @end multitable
 
+@cindex ZFS recordsize
 @anchor{ChunkedZFS}
 It is strongly advisable to reassemble incoming chunked files on
 @url{https://en.wikipedia.org/wiki/ZFS, ZFS} dataset with deduplication
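The concatenation property the diff documents (all `FILE.nncp.chunkXXX` files concatenated in order reproduce the original `FILE`) can be sketched in shell. This is only a hedged illustration, not how `nncp-reass` actually works: the real command also verifies each chunk against the MTH checksums recorded in `FILE.nncp.meta`, which this sketch skips, and the chunk contents below are made up for the demo. Note that a bare `cat FILE.nncp.chunk*` would glob-sort `chunk10` before `chunk2`, so the sketch iterates by numeric index instead.

```shell
# Create hypothetical chunk files standing in for `nncp-file -chunked` output.
printf 'AAAA' > FILE.nncp.chunk0
printf 'BBBB' > FILE.nncp.chunk1
printf 'CC'   > FILE.nncp.chunk2

# Naive reassembly: concatenate chunks in ascending numeric order,
# assuming contiguous, un-padded numbering starting at 0.
i=0
: > FILE
while [ -f "FILE.nncp.chunk$i" ]; do
    cat "FILE.nncp.chunk$i" >> FILE
    i=$((i + 1))
done
```

In practice `nncp-reass` should always be preferred, since it checks the per-chunk and whole-file checksums from the `.nncp.meta` file before declaring the result valid.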
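The hunk anchored at `ChunkedZFS` advises reassembling incoming chunked files on a ZFS dataset with deduplication. A minimal setup might look like the sketch below; the pool name `tank` and dataset name are hypothetical, and the rationale for the `recordsize` choice is an assumption based on ZFS deduplication being block-wise: duplicate chunks can only dedup cleanly when the chunk size aligns with the dataset's record boundaries.

```shell
# Hedged sketch: a dedup-enabled dataset for reassembling chunked files.
# "tank/nncp-reass" is a made-up pool/dataset name for illustration.
zfs create -o dedup=on -o recordsize=128k tank/nncp-reass
```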