Compare commits

...

162 Commits

Author SHA1 Message Date
ed
f0abc0ef59 v1.15.4 2024-10-04 23:19:28 +00:00
ed
a99fa3375d the impresources.files traversable is not threadsafe 2024-10-04 22:37:29 +00:00
ed
22c7e09b3f small fixes;
* make-sfx: delete failed deps downloads
* tlcheck: detect untranslated strings
2024-10-04 20:56:16 +00:00
ed
0dfe1d5b35 toast countdown bar 2024-10-04 19:29:54 +00:00
ed
a99a3bc6d7 audio-player: fix compact-mode rendering glitch on narrow screens 2024-10-04 18:15:18 +00:00
ed
9804f25de3 add option for natural sorting; thx @oshiteku 2024-10-04 00:30:04 +00:00
ed
ae98200660 og: support filekeys 2024-10-03 23:52:11 +00:00
ed
e45420646f share folders as qr-codes 2024-10-03 23:14:06 +00:00
ed
21be82ef8b fix #101 (show logues even if dotfiles are hidden) 2024-10-03 22:19:32 +02:00
ed
001afe00cb i18n: time plurals 2024-10-03 07:38:33 +00:00
ed
19a5985f29 allow uploading logues; closes #100 2024-10-02 23:16:59 +00:00
ed
2715ee6c61 fix confusing toast on F2 with nothing selected (#100) 2024-10-02 23:11:29 +00:00
ed
dc157fa28f webdav: support explicit <allprop/> (WinSCP) 2024-10-02 22:28:23 +00:00
ed
1ff14b4e05 optimizations, failsafes, formatting 2024-10-02 21:59:53 +00:00
ed
480ac254ab webdav: show toplevel volumes when root is unmapped
previously, only real folders could be listed by a webdav client;
a server which does not have any filesystem paths mapped to `/`
would cause clients to panic when trying to list the server root

now, assuming volumes `/foo` and `/bar/qux` exist, when accessing `/`
the user will see `/foo` but not `/bar` due to limitations in `walk`,
and `qux` will only appear when viewing `/bar`

a future rework of the recursion logic should further improve this
2024-10-02 21:12:58 +00:00
ed
4b95db81aa thx keth 2024-10-01 22:37:50 +00:00
ed
c81e898435 partyfuse: also support mounting nginx, iis
these additional parsers are not included in the sfx-embedded
copy of partyfuse.py; grab it from github when necessary
2024-10-01 22:37:07 +00:00
ed
f1646b96ca dist: strip some pointless code 2024-10-01 18:35:36 +00:00
ed
44f2b63e43 partyfuse: embed fuse.py into sfx 2024-10-01 18:27:42 +00:00
ed
847a2bdc85 partyfuse: bump datacache chunksize
previous approach:
* cache 64K on first read
* cache 1M on subsequent intersecting reads

new approach:
* cache 64K on first read
* cache 1M on the next intersecting read
* cache 8M on subsequent intersecting reads
* cache 4M on standalone reads at offsets >1M

improves performance by 50% on windows
and should help on high-latency connections
2024-10-01 17:15:35 +00:00
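A minimal sketch of the escalating read-ahead policy described above; the thresholds come straight from the commit message, but the function and its demo are illustrative, not the actual partyfuse.py code:

```python
K = 1024
M = 1024 * K

def readahead_size(n_hits: int, offset: int) -> int:
    """how much to cache for a read at `offset`, given `n_hits`
    previous reads that intersected the existing cache"""
    if n_hits == 0:
        return 4 * M if offset > M else 64 * K  # standalone read far into the file vs. first read
    if n_hits == 1:
        return 1 * M   # the next intersecting read
    return 8 * M       # subsequent intersecting reads

if __name__ == "__main__":
    for hits, ofs in [(0, 0), (1, 64 * K), (2, 2 * M), (0, 5 * M)]:
        print(hits, ofs, readahead_size(hits, ofs))
```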
ed
03f0f99469 partyfuse: fix extremely slow dircache lookups
the cache was a list of files instead of a dict... dude

also adds a max-num dircache limit
in addition to the expiration time
2024-10-01 17:07:28 +00:00
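A hedged sketch of a dict-based dircache with both an expiration time and a max-entries cap; class and attribute names are illustrative, not partyfuse's own:

```python
import time
from collections import OrderedDict

class DirCache:
    def __init__(self, max_entries=64, ttl=120):
        self.max_entries = max_entries
        self.ttl = ttl
        self.cache = OrderedDict()  # vpath -> (timestamp, listing)

    def get(self, vpath):
        hit = self.cache.get(vpath)  # O(1) lookup instead of scanning a list
        if not hit:
            return None
        ts, listing = hit
        if time.time() - ts > self.ttl:
            del self.cache[vpath]    # expired
            return None
        return listing

    def put(self, vpath, listing):
        self.cache[vpath] = (time.time(), listing)
        self.cache.move_to_end(vpath)
        while len(self.cache) > self.max_entries:
            self.cache.popitem(last=False)  # evict the oldest entry

dc = DirCache()
dc.put("/music", ["01.flac", "02.flac"])
print(dc.get("/music"))
```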
ed
3900e66158 partyfuse: modernize html parser (just in case) 2024-10-01 17:00:17 +00:00
ed
3dff6cda40 partyfuse: normalize naming in parsers 2024-10-01 16:55:00 +00:00
ed
73d05095b5 partyfuse: misc correctness;
* support more unix envs with granular fuse config
* generated URLs were OK but technically incorrect
2024-10-01 16:49:39 +00:00
ed
fcdc1728eb #102: make UI translation easier in docker 2024-10-01 00:04:07 +00:00
ed
8b942ea237 partyfuse: cleanup logging and exceptions
windows runs 50% faster with recentlog on infos too...
2024-09-29 23:19:33 +00:00
ed
88a1c5ca5d optimize non-e2d ram usage down to 10% or so
drop chunk-hashes in the up2k snap, plus other insignificant attribs
to reduce both the snapfile size and the ram usage by about 90%

reduces startup/shutdown time by a lot since there's less to serdes
(does not affect -e2d which was already optimal)

other changes:

* improve incoming-eta accuracy when the initial handshake
   was made a long time before the upload actually started

* move the list of incoming files in the controlpanel to the top
2024-09-27 21:11:10 +00:00
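An illustrative sketch (not the actual up2k code) of the snapfile-shrinking idea: completed uploads keep their bookkeeping, but the per-chunk hash lists and other bulky attributes are dropped before serializing, since they are only needed while an upload is still incomplete. The key names below are assumptions:

```python
import json

def shrink_snap(registry: dict) -> dict:
    """return a slimmed copy of the upload registry for the snapfile"""
    slim = {}
    for wark, job in registry.items():
        job = dict(job)
        if not job.get("need"):    # assumed: empty "need" means the upload finished
            job.pop("hash", None)  # the per-chunk hashes dominate the size
            job.pop("busy", None)
        slim[wark] = job
    return slim

if __name__ == "__main__":
    reg = {"w1": {"name": "a.bin", "need": [], "hash": ["x" * 44] * 4096}}
    print(len(json.dumps(reg)), "->", len(json.dumps(shrink_snap(reg))))
```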
ed
047176b297 py2 fix 2024-09-27 21:06:01 +00:00
ed
dc4d0d8e71 smb: up to 2x faster; but still very buggy:
* do not absreal paths unless necessary
* do not determine username if no users configured
* impacket 0.12 fixed the foldersize limit, but now
   you get extremely poor performance in large folders
   so the previous workaround is still default-enabled
2024-09-27 17:09:48 +00:00
ed
b9c5c7bbde u2c: early exclude + fix py2 perf
* don't traverse into excluded folders
* 3.5x faster on python2.7
2024-09-23 17:20:04 +00:00
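The early-exclude idea, roughly: prune excluded directories before descending into them instead of filtering individual files afterwards. A minimal stdlib sketch, not u2c.py itself; the exclusion pattern is just an example:

```python
import os
import re

def walk_files(root, exclude=r"/(node_modules|\.git)(/|$)"):
    ptn = re.compile(exclude)
    for dirpath, dirnames, filenames in os.walk(root):
        # prune in-place so os.walk never enters excluded subtrees
        dirnames[:] = [
            d for d in dirnames
            if not ptn.search(os.path.join(dirpath, d).replace(os.sep, "/"))
        ]
        for fn in filenames:
            yield os.path.join(dirpath, fn)

if __name__ == "__main__":
    print(sum(1 for _ in walk_files(".")), "files")
```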
ed
9daeed923f u2c: remove all deps to become 3x faster on small files
reduces performance on python 2.7, but that is ok

also fixes `unknown encoding: idna` due to cpython race
2024-09-22 18:07:36 +00:00
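The `unknown encoding: idna` crash is the classic lazy-codec-import race; the usual workaround (assumed to be roughly what the fix does) is to force the import on the main thread before any worker threads can trigger it concurrently:

```python
import threading

"x".encode("idna")  # or: import encodings.idna -- loads the codec up front

def worker(host):
    host.encode("idna")  # safe now, even when many threads hit this at once

threads = [threading.Thread(target=worker, args=("example.com",)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```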
ed
66b260cea9 pkgres: fix tiny leak in template loader 2024-09-20 22:25:36 +00:00
ed
58cf01c2ad fix linter warnings 2024-09-20 22:24:39 +00:00
ed
d866841c19 pkgres:
* pyz: yeet the resource tar which is now pointless thanks to pkgres
* cache impresource stuff because pyz lookups are Extremely slow
* prefer tx_file when possible for slightly better performance
* use hardcoded list of expected resources instead of dynamic
   discovery at runtime; much simpler and probably safer
* fix some forgotten resources (copying.txt, insecure.pem)
* fix loading jinja templates on windows
2024-09-19 22:04:49 +00:00
Shiz
a462a644fb Python 3.7 package resources support (#98)
add support for reading webdeps and jinja-templates using either
importlib_resources or pkg_resources, which removes the need for
extracting these to a temporary folder on the filesystem

* util: add helper functions to abstract embedded resource access
* http*: serve embedded resources through resource abstraction
* main: check webdeps through resource abstraction
* httpconn: remove unused method `respath(name)`
* use __package__ to find package resources
* util: use importlib_resources backport if available
* pass E.pkg as module object for importlib_resources compatibility
* util: add pkg_resources compatibility to resource abstraction
2024-09-19 09:00:34 +00:00
ed
678675a9a6 fix prometheus metrics; broke in 609c5921 2024-09-16 21:04:58 +00:00
ed
de9069ef1d update pkgs to 1.15.3 2024-09-16 01:20:16 +00:00
ed
c0c0a1a83a v1.15.3 2024-09-16 01:07:50 +00:00
ed
1d004b6dbd update pkgs to 1.15.2 2024-09-16 00:44:48 +00:00
ed
b90e1200d7 v1.15.2 2024-09-16 00:20:20 +00:00
ed
4493a0a804 misc mojibake filename support 2024-09-16 00:12:49 +00:00
ed
58835b2b42 ux bugfixes:
* show media tags in shares
* html hydrator assumed a folder named `foo.txt` was a doc
* due to sessions, use `pwd` as password placeholder on services
2024-09-15 23:37:24 +00:00
ed
427597b603 show total directory size in listings
sizes are computed during `-e2ds` indexing, and new uploads
are counted, but a rescan is necessary after a move or delete
2024-09-15 23:01:18 +00:00
ed
7d64879ba8 more optimizations,
* 5% less cpu load from clients fetching thumbnails
* and slight improvement to up2k stuff
2024-09-15 17:46:43 +00:00
ed
bb715704b7 ren_open was too fancy 2024-09-15 14:39:35 +00:00
ed
d67e9cc507 sqlite and misc optimizations:
* exponentially slow upload handshakes caused by lack of rd+fn
   sqlite index; became apparent after a volume hit 200k files
* listing big folders 5% faster due to `_quotep3b`
* optimize `unquote`, 20% faster but only used rarely
* reindex on startup 150x faster in some rare cases
   (same filename in MANY folders)

the database is now around 10% larger (likely worst-case)
2024-09-15 13:18:43 +00:00
ed
2927bbb2d6 strip dev-only asserts at build stage 2024-09-14 22:17:35 +00:00
ed
0527b59180 cosmetic: only print hostname warning once 2024-09-14 20:37:56 +00:00
ed
a5ce1032d3 tlnote + nginx unix-socket example 2024-09-12 21:42:33 +00:00
ed
1c2acdc985 add flameshot client example 2024-09-11 20:56:38 +00:00
ed
4e75534ef8 optimize BrokerThr, 7x faster:
reduce the overhead of function-calls from the client thread
to the svchub singletons (up2k, thumbs, metrics) down to 14%

and optimize up2k chunk-receiver to spend 5x less time bookkeeping
which restores up2k performance to before introducing incoming-ETA
2024-09-11 20:37:10 +00:00
ultwcz
7a573cafd1 fix: translation: Check the newly added Chinese translation (#97) 2024-09-11 19:03:53 +00:00
ed
844194ee29 incoming-ETA: improve accuracy 2024-09-11 06:56:12 +00:00
ed
609c5921d4 list incoming files + ETA in controlpanel 2024-09-10 21:24:05 +00:00
ed
c79eaa089a update pkgs to 1.15.1 2024-09-09 23:55:37 +00:00
ed
e9d962f273 v1.15.1 2024-09-09 23:43:43 +00:00
ed
b5405174ec add login sessions 2024-09-09 23:39:20 +00:00
ed
6eee601521 fix u2c --ow (overwrite/replace)
the u2c flag to overwrite files on the server became a no-op in v1.13.8
2024-09-09 19:40:38 +00:00
ed
2fac2bee7c update pkgs to 1.15.0 2024-09-08 20:02:25 +00:00
ed
c140eeee6b v1.15.0 2024-09-08 19:25:46 +00:00
ed
c5988a04f9 up2k.js: bump handshake timeout for safededup 2024-09-08 18:06:37 +00:00
ed
a2e0f98693 disable upload deduplication by default;
dedup is still encouraged and fully supported, but
being default-enabled has caused too many surprises

enabling `--dedup` restores the previous default behavior

also renames `--never-symlink` to `--hardlink-only`
2024-09-08 17:09:14 +00:00
ed
1111153f06 test dedup relinking 2024-09-08 12:55:27 +00:00
ed
e5a836cb7d og: fix links to textfiles 2024-09-08 12:12:34 +00:00
ed
b0de84cbc5 db-verify: support newlines in filenames + flag 404s 2024-09-08 00:44:22 +00:00
ed
cbb718e10d css fixes:
* improve hotdog-stand theme
* fix up2k tabs glow (went poof in a syntax error)
2024-09-07 19:29:40 +00:00
ed
b5ad9369fe confine xlink behavior behind its volflag
symlinks between volumes will only be created if xlink is
enabled, so such symlinks should be ignored if xlink is
disabled, as they might originate from other software

this prevents accidental rewriting of non-dedup symlinks
2024-09-07 19:17:32 +00:00
ed
4401de0413 fix mv with --no-dedup in volumes with dupes;
if --no-dedup was enabled in a volume which already contained
symlinked duplicate files, renaming/moving folders could fail

this is due to folder contents being moved one file at a time
(which is how symlink breakage is prevented) except the links
are moved assuming the final directory layout, meaning they
may be intermittently broken during the move

with no-dedup, the symlinks are converted into full files as
each symlink is encountered, but a temporarily broken symlink
would crash the procedure

fix this by giving `_symlink` a new parameter `fsrc`
which is a known valid inode for data copying purposes
2024-09-07 00:47:12 +00:00
ed
6e671c5245 verify on-disk contents before dedup;
previously, the assumption was made that the database and filesystem
would not desync, and that an upload could safely be substituted with
a symlink to an existing copy on-disk, assuming said copy still
existed on-disk at all

this is fine if copyparty is the only software that makes changes to
the filesystem, but that is a shitty assumption to make in hindsight

add `--safe-dedup` which takes a "safety level", and by default (50)
it will no longer blindly expect that the filesystem has not been
altered through other means; the file contents will now be hashed
and compared to the database

deduplication can be much slower as a result, but definitely worth it
as this avoids some potentially very unpleasant surprises

the previous behavior can be restored with `--safe-dedup 1`
2024-09-06 19:08:14 +00:00
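A sketch of the verify-before-dedup check: re-hash the existing copy on disk and only substitute a symlink if the digest still matches the database. Function name, digest choice, and chunk size are assumptions for illustration, not copyparty's implementation:

```python
import hashlib
import os

def ok_to_dedup(existing_path: str, db_digest: str, chunk=1024 * 1024) -> bool:
    if not os.path.isfile(existing_path):
        return False  # index/fs desync: the indexed copy is gone
    h = hashlib.sha512()
    with open(existing_path, "rb") as f:
        while True:
            buf = f.read(chunk)
            if not buf:
                break
            h.update(buf)
    return h.hexdigest() == db_digest  # only then is a symlink substituted
```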
ed
08848be784 u2c: add hashgen mode + fix shutdown lag 2024-09-06 00:31:25 +00:00
ed
b599fbae97 use local timezone in log messages; closes #96
timezone can be changed with `export TZ=Europe/Oslo` before launch

using naive timestamps like this appears to be safe as of 3.13-rc1,
no deprecation warnings, just a tiny bit slower than assuming UTC
2024-09-05 19:31:33 +00:00
ed
a8dabc99f6 add more translations 2024-09-04 23:46:32 +00:00
ed
f1130db131 fix confusing message when uploading dupes
due to deduplication, it is intentionally impossible to
upload several identical copies of a file in parallel

by default, the up2k client will upload files sorted by
size, which usually leads to dupes being grouped together,
and it will try to do just that

this is by design, as it improves performance on average,
but it also shows the confusing (but technically-correct)
message "resume the partial upload into the original path"

fix this with a more appropriate message

note that this approach was selected in favor of pausing
handshakes while the initial copy finishes uploading,
because that could severely reduce upload performance
by preventing optimal use of multiple connections
2024-09-04 22:03:26 +00:00
ed
735ec35546 update pkgs to 1.14.4 2024-09-02 01:21:07 +00:00
ed
5a009a2a64 v1.14.4 2024-09-02 01:08:41 +00:00
ed
d9e9526247 fix js typo (could panic on network glitches) 2024-09-02 00:58:15 +00:00
ed
5a8c3b8be0 optimize test_httpcli.py too, from 1.64 to 1.51s 2024-08-31 22:03:06 +00:00
ed
1c9c17fb9b optimize test_dedup.py
* 7.71s originally
* 4.51s with fstab reuse
* 4.34s without db_wd
* 4.02s with no pp start
* 3.73s with Cfg reuse
2024-08-31 21:54:47 +00:00
ed
7f82449179 changelog: cleanup historic entries 2024-08-31 20:39:37 +00:00
ed
e455ec994e logo tweaks (kerning, footer-slant) 2024-08-31 20:37:58 +00:00
ed
c111027420 update pkgs to 1.14.3 2024-08-30 23:29:47 +00:00
ed
abcdf479e6 v1.14.3 2024-08-30 23:11:22 +00:00
ed
ad2371f810 shares: add revival and expiration extension 2024-08-30 22:25:50 +00:00
ed
c4e2b0f95f doc-viewer: always wordwrap code 2024-08-30 22:13:10 +00:00
ed
3da62ec234 fix dedup bug as of v1.13.8:
* v1.13.8 broke collision resolving for non-identical files;
   the correct filename was reserved but not symlinked to
   the original file, leaving a zerobyte file instead.
   See v1.14.3 github release notes for remediation info

* add sanchecks for early detection of index/fs desync;
   saves performance and gives less confusing logs
2024-08-30 22:06:25 +00:00
ed
01233991f3 tftp: support unmapped root 2024-08-30 16:08:50 +00:00
ed
ee35974273 readme hacking 2024-08-29 22:17:13 +00:00
ed
7037e7365e add logo 2024-08-29 22:00:08 +00:00
ed
03b13e8a1c sfx-customizer:
* better translation stripping
* add support in bruteforcer
* add examples

and fix login-banner usage example
2024-08-28 05:53:26 +00:00
ed
cdd2da0208 update pkgs to 1.14.2 2024-08-23 23:43:46 +00:00
ed
cec0e0cf02 v1.14.2 2024-08-23 23:07:18 +00:00
ed
8122ddedfe share multiple files (#84);
if files (one or more) are selected for sharing, then
a virtual folder is created to hold the selected files

if a single file is selected for sharing, then
the returned URL will point directly to that file

and fix some shares-related bugs:
* password coalescing
* log-spam on reload
2024-08-23 22:55:31 +00:00
ultwcz
55a77c5e89 Chinese translation fixes (#95)
* fix: translation: changing from `" "` to `' '` for some strings;
	using `./scripts/tlcheck.sh eng chi copyparty/web/browser.js`

* fix: translation: Check the newly added Chinese translation
2024-08-23 08:14:24 +00:00
ed
461f31582d add IDs for ricing (#93) + fix a11y bleed 2024-08-22 20:14:08 +00:00
ed
f356faa278 u2c: support multiple exclusion patterns 2024-08-22 20:03:25 +00:00
ed
9f034d9c4c fix confusing logmsg for zerobyte files 2024-08-22 19:54:10 +00:00
ed
ba52590ae4 translation tweaks 2024-08-22 19:52:20 +00:00
ultwcz
92edea1de5 add translation: Chinese (#94) 2024-08-22 17:19:16 +00:00
ed
7ff46966da fix some issues with shares mentioned in #84;
* crash when root volume is unmapped
* rephrase login-page for shares
* add chrome support (lol)
* fix confusing helptext
* improve ux
  * placeholders in share creator
  * button to disable expiration in share creator
  * human-readable timestamps in share listing
2024-08-19 21:38:47 +00:00
ed
fca70b3508 update pkgs to 1.14.1 2024-08-19 00:24:52 +00:00
ed
70009cd984 v1.14.1 2024-08-19 00:14:44 +00:00
ed
8d8b88c4fd update pkgs to 1.14.0 2024-08-18 23:36:57 +00:00
ed
c4b0cccefd v1.14.0 2024-08-18 23:11:36 +00:00
ed
7c2beba555 add file/folder sharing; closes #84 2024-08-18 22:49:13 +00:00
ed
7d8d94388b invert volume scrollwheel
<daniiooo> also iirc some time ago we were talking about the scroll for volume ed
<daniiooo> and how its reversed
<ed> is it reversed though? most people said it worked the way they expected
<daniiooo> fuck maybe i agreed back then too
<daniiooo> its the opposite in both aimp and mpv though
<ed> is it w
<tatsu> its a feature
<Devices> it's to keep you on your toes
<Devices> consciously use copyparty
<ed> i can invert it no problem
<ed> would be a nice surprise for anyone who's used it
<Flaminator> Scroll down turns the audio down right?
<daniiooo> ye it makes it louder in cpp
<Devices> why would scrolling down make something louder
<Vin> yeah that's odd
<Vin> scrolling up should make it louder
<Flaminator> It's what it does for me in winamp, mpc-hc and foobar2000.
<daniiooo> so now the question is who itc agreed to whats currently in cpp
<daniiooo> haha
<ed> idk but i'm inverting it
<ed> let's invert it every 6 months
2024-08-17 20:36:59 +00:00
ed
0b46b1a614 fix some vproxy issues (#93):
* navpane would always feed the vproxy paths into the tree
   instead of only when necessary (the initial load)

* mkdir would return `X-New-Dir` without the `rp-loc` prefix
  * chpw and some other redirects also sent raw vpaths

Reported-by: @iridial
2024-08-17 18:17:40 +00:00
ed
5153db6bff ux: login margin; theme2: yellow buttons
the red buttons from protonmail's monokai theme look better,
but they're confusing because intuitively red means off
2024-08-17 15:55:55 +00:00
ed
b0af4b3712 hook/reloc: dupe in one vol doesn't mean dupe in another 2024-08-16 21:08:22 +00:00
ed
c8f4aeaefa hook/reloc: fix up2k jank
* wark landed in the wrong registry when moved to another volume
   (harmless; upload would succeed on the next handshake)

* dedup did not apply correctly when moved into another volume,
   since all the checks were done based on the previous vol;
   fix this by recursing the whole thing

also update the reloc example after some real-world experience

Reported-by: @daniiooo
2024-08-15 19:26:06 +00:00
ed
00da74400c password-changer fixes:
* fix `--chpw-no` which did nothing
* print list of users with unchanged passwords by default
* more granular verbosity levels
2024-08-15 17:30:01 +00:00
ed
83fb569d61 make passwords user-changeable; closes #92 2024-08-14 20:09:57 +00:00
ed
5a62cb4869 fix custom fonts in sandboxed docs;
`@import` must be at the very start of a `<style>` tag

Reported-by: @thaddeuskkr (thx!)
2024-08-14 15:30:04 +00:00
ed
687df2fabd unix-socket fixes:
* support x-forwarded-for
* option to specify socket permissions and group
* in containers, avoid collision during restart
* add --help-bind with examples
2024-08-14 04:47:10 +00:00
ed
cdd0794d6e update pkgs to 1.13.8 2024-08-13 00:20:04 +00:00
ed
dcc988135e v1.13.8 2024-08-13 00:08:23 +00:00
ed
3db117d85f list status of optional dependencies 2024-08-12 22:48:53 +00:00
ed
ee9aad82dd support listening on unix sockets 2024-08-12 21:58:02 +00:00
ed
2d6eb63fce scripts/uncomment: python 3.12 support;
`tokenize.FSTRING_MIDDLE` was introduced, changing the
representation of `f"x{{y"` from `STRING(f"x{{y")` to:

* `FSTRING_START('f"')`
* `FSTRING_MIDDLE('x{')`
* `FSTRING_MIDDLE('y')`
* `FSTRING_END('"')`

each literal `{` (encoded as `{{` in the input) now appears as a
single `{` as the final character of its `FSTRING_MIDDLE`, with
additional consecutive `FSTRING_MIDDLE` tokens if necessary

regular interpolating `{` are encoded as separate `OP` tokens

the fact that the literal `{` is encoded as a single `{` instead
of `{{` breaks the assumption that the string-value of each token
maps directly to the original code

fix this by replacing `{` with `{{` and `}` with `}}` in
`FSTRING_MIDDLE` tokens, and not adding whitespace after
`FSTRING_MIDDLE` tokens
2024-08-12 19:55:17 +00:00
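The brace handling in isolation, as a hedged sketch (requires python 3.12 for the `FSTRING_*` token types; this is not scripts/uncomment.py itself):

```python
import io
import token
import tokenize

src = 'x = f"x{{y{z}"\n'

for tok in tokenize.generate_tokens(io.StringIO(src).readline):
    val = tok.string
    if tok.type == getattr(token, "FSTRING_MIDDLE", -1):
        # the tokenizer reports each literal brace as a single "{" or "}",
        # so double them again to restore the original source spelling
        val = val.replace("{", "{{").replace("}", "}}")
    print(token.tok_name[tok.type], repr(val))
```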
ed
ca001c8504 update deps (pyftpdlib, win10-python) 2024-08-12 18:51:52 +00:00
ed
4e581c59da fix s390x w/a, up2k name-randomizer 2024-08-12 17:45:19 +00:00
ed
dbd42bc6bf add option to load custom js on all pages 2024-08-11 23:51:17 +00:00
ed
c862ec1b64 up2k.js: optimal pipelining 2024-08-11 21:15:44 +00:00
ed
f709140571 hook/reloc: helptext mentioned jank that doesn't exist anymore 2024-08-11 15:07:21 +00:00
ed
ef1c4b7a20 this guy didn't make it in 2024-08-11 14:55:51 +00:00
ed
6c94a63f1c add hook side-effects; closes #86
hooks can now interrupt or redirect actions, and initiate
related actions, by printing json on stdout with commands

mainly to mitigate limitations such as sharex/sharex#3992

xbr/xau can redirect uploads to other destinations with `reloc`
and most hooks can initiate indexing or deletion of additional
files by giving a list of vpaths in json-keys `idx` or `del`

there are limitations;
* xbu/xau effects don't apply to ftp, tftp, smb
* xau will intentionally fail if a reloc destination exists
* xau effects do not apply to up2k

also provides more details for hooks:
* xbu/xau: basic-uploader vpath with filename
* xbr/xar: add client ip
2024-08-11 14:52:32 +00:00
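A minimal sketch of what such a hook could look like: an xbu/xau hook that redirects image uploads into a different folder by printing a json command on stdout. The `reloc`/`vp` keys follow the commit message and the bundled examples, but the exact schema and the flags needed to make copyparty read the hook's stdout should be verified against ./bin/hooks/ and `--help-hooks`:

```python
#!/usr/bin/env python3
import json
import sys

def main():
    # assumption: the hook receives the upload's path as argv[1] (non-json flavor)
    fname = sys.argv[1] if len(sys.argv) > 1 else ""
    if fname.lower().endswith((".png", ".jpg", ".jpeg", ".webp")):
        # ask copyparty to place this upload in /screenshots instead
        print(json.dumps({"reloc": {"vp": "/screenshots"}}))

if __name__ == "__main__":
    main()
```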
ed
20669c73d3 rm dead code (gridview conditional dl/play)
and maybe fix negative eta when a chunk gets eaten by the network
2024-08-09 21:57:42 +00:00
ed
0da719f4c2 up2k: shrink request headers
v1.13.5 made some proxies angry with its massive chunklists

when stitching chunks, only list the first chunk hash in full,
and include a truncated hash for the consecutive chunks

should be enough for logfiles to make sense
and to smoketest that clients are behaving
2024-08-08 18:24:18 +00:00
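The header shrinking, roughly: the first chunk hash is included in full and each following sibling chunk only contributes a short prefix. The hash and prefix lengths below are assumptions for the sketch:

```python
def stitch_header(chunk_hashes, keep=11):
    head, rest = chunk_hashes[0], chunk_hashes[1:]
    return ",".join([head] + [h[:keep] for h in rest])

# full first hash, short prefixes for the consecutive chunks
print(stitch_header(["a" * 44, "b" * 44, "c" * 44]))
```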
ed
373194c38a better up2k stitching on fat32 servers:
* the batches don't need to be window-aligned
* improve js backoff (in case of more funnies)
2024-08-05 19:52:50 +00:00
ed
3d245431fc linter fixes 2024-08-05 18:48:16 +00:00
ed
250c8c56f0 fix deadlock on IBM mainframes (s390x) 2024-08-02 23:05:44 +00:00
ed
e136231c8e docker: add portainer howto 2024-08-02 23:01:32 +00:00
ed
98ffaadf52 docker: use less RAM at runtime
compile to bytecode so cpython doesn't have to keep it in memory

ram usage reduced by:
* min: 5.4 MiB (32.6 to 27.2)
* ac/im: 5.2 MiB (39.0 to 33.8)
* dj/iv: 10.6 MiB (67.3 to 56.7)

startup time reduced from:
* min: 1.3s to 0.6s
* ac/im: 1.6s to 0.9s
* dj/iv: 2.0s to 1.1s

image size increased by 4 MiB (min), 6 MiB (ac/im/iv), 9 MiB (dj)

ram usage measured on idle with:
while true; do ps aux | grep -E 'R[S]S|no[-]crt'; read -n1; echo; done

startup time measured with:
time podman run --rm -it localhost/copyparty-min-amd64 --exit=idx
2024-08-02 22:11:23 +00:00
ed
ebb1981803 py2: reduce ram usage 2024-08-01 20:01:42 +00:00
ed
72361c99e1 add import chickenbits 2024-08-01 18:29:25 +00:00
ed
d5c9c8ebbd make it 5% faster 2024-07-31 17:51:53 +00:00
ed
746229846d add test for zip-download 2024-07-30 22:44:29 +00:00
ed
ffd7cd3ca8 update pkgs to 1.13.6 2024-07-29 20:56:00 +00:00
ed
b3cecabca3 v1.13.6 2024-07-29 20:28:51 +00:00
ed
662541c64c audio-player: show status while loading 2024-07-29 20:14:39 +00:00
ed
225bd80ea8 up2k.js: fix overshoot in chunk stitcher 2024-07-29 19:19:22 +00:00
ed
85e54980cc up2k.js: set timeouts for uploads
in the event that an upload chunk gets stuck, the js would
never stop waiting for a response, requiring a page reload

improves reliability when running behind a reverse-proxy
which is configured to never timeout requests (can make
sense when combined with other services on the same box)
2024-07-29 19:17:03 +00:00
ed
a19a0fa9f3 fix modal wordwrap in firefox;
with overflow:auto, firefox picks the div-width before estimating
the height, causing it to undershoot by the scrollbar width
and then messing up the text alignment

fix: conditionally set overflow-y:scroll using js
2024-07-29 18:04:35 +00:00
ed
9bb6e0dc62 misc ux:
* wait until page (au) has loaded to register hotkeys
* hotkey `m` would grow sidebar if tree was minimized
* more exact warning about num.parallel uploads
* keep more console logs in memory
* message phrasing
2024-07-29 17:59:34 +00:00
ed
15ddcf53e7 add bsod theme 2024-07-26 22:09:59 +00:00
ed
6b54972ec0 update comparison vs similar software:
* general changes:
  * upload speed comparisons considering v1.13.5

* hfs2:
  * dead project with unfixed vulnerabilities

* hfs3:
  * has replaced hfs2
  * uploads are now resumable
  * add new functionality:
    * write-only folders
    * unmap subfolders
    * move and delete files
    * folder-rproxy
    * themes
    * basic audio player, image viewer

* filebrowser:
  * uploads are now parallelized, resumable, segmented
    * but single large files are not accelerated
  * can listen on unix sockets
  * folder-rproxy is supported
  * more cpu efficient than copyparty
2024-07-26 19:46:03 +00:00
ed
0219eada23 cleanup: strip trailing whitespace 2024-07-26 19:33:56 +00:00
ed
8916bce306 u2c fixes:
* `--sz` was num.chunks, not the intended MiB
* crash on exit with `-z` and no modified files
* summary upload elapsed-time could exceed wallclock
2024-07-26 19:28:47 +00:00
ed
99edba4fd9 change xm examples to reject users without write-access; #68 2024-07-25 19:23:08 +00:00
ed
64de3e01e8 update pkgs to 1.13.5 2024-07-22 23:48:24 +00:00
ed
8222ccc40b v1.13.5 2024-07-22 23:23:53 +00:00
ed
dc449bf8b0 fix grid toolbar undocking after viewing a pic/vid 2024-07-22 23:09:25 +00:00
ed
ef0ecf878b recommend rclone over davfs2; closes #90 2024-07-22 22:46:24 +00:00
ed
53f1e3c91d ui option to play video as audio
audio extraction happens serverside to opus or mp3
depending on browser support

remuxing (extracting audio without transcoding)
is currently not supported, and is not planned
2024-07-22 22:30:21 +00:00
ed
eeef80919f css-fix for firefox52 (centos6) 2024-07-22 20:59:05 +00:00
ed
987bce2182 u2c fixes:
* don't stitch across deduplicated blocks
* print speed/time for hash/upload
* more compact json in handshakes
2024-07-22 20:55:32 +00:00
ed
b511d686f0 up2k fixes:
* progress donuts should include inflight bytes
* changes to stitch-size in settings didn't apply until next refresh
* serverlog was too verbose; truncate chunk hashes
* mention absolute cloudflare limit in readme
2024-07-22 19:06:01 +00:00
ed
132a83501e add chunk stitching; twice as fast long-distance uploads:
rather than sending each file chunk as a separate HTTP request,
sibling chunks will now be fused together into larger HTTP POSTs
which results in unreasonably huge speed boosts on some routes
( `2.6x` from Norway to US-East,  `1.6x` from US-West to Finland )

the `x-up2k-hash` request header now takes a comma-separated list
of chunk hashes, which must all be sibling chunks, resulting in
one large consecutive range of file data as the post body

a new global-option `--u2sz`, default `1,64,96`, sets the target
request size as 64 MiB, allowing the settings ui to specify any
value between 1 and 96 MiB, which is cloudflare's max value

this does not cause any issues for resumable uploads; thanks to the
streaming HTTP POST parser, each chunk will be verified and written
to disk as they arrive, meaning only the untransmitted chunks will
have to be resent in the event of a connection drop -- of course
assuming there are no misconfigured WAFs or caching-proxies

the previous up2k approach of uploading each chunk in a separate HTTP
POST was inefficient in many real-world scenarios, mainly due to TCP
window-scaling behaving erratically in some IXPs / along some routes

a particular link from Norway to Virginia,US is unusably slow for
the first 4 MiB, only reaching optimal speeds after 100 MiB, and
then immediately resets the scale when the request has been sent;
connection reuse does not help in this case

on this route, the basic-uploader was somehow faster than up2k
with 6 parallel uploads; only time i've seen this
2024-07-21 23:35:37 +00:00
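A hedged, stdlib-only sketch of the client side of chunk stitching: a run of sibling chunks is read as one consecutive byte range and sent in a single POST, with all of their hashes in the comma-separated `x-up2k-hash` header. This is a toy, not up2k.js or u2c.py, and it omits the rest of the up2k handshake:

```python
import urllib.request

def upload_stitched(url, f, chunks, chunksz):
    """chunks: list of (chunk_index, chunk_hash) for consecutive chunks of one file"""
    first_idx = chunks[0][0]
    f.seek(first_idx * chunksz)
    body = f.read(len(chunks) * chunksz)  # one consecutive range of file data
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("x-up2k-hash", ",".join(h for _, h in chunks))
    return urllib.request.urlopen(req)
```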
ed
e565ad5f55 better errors through broker 2024-07-21 20:36:50 +00:00
ed
f955d2bd58 dangit 2024-07-20 22:28:40 +00:00
ed
5953399090 add helptext exporters (html, txt) 2024-07-17 23:06:01 +00:00
ed
d26a944d95 hooks: add cache-warmer 2024-07-17 21:00:59 +00:00
ed
50dac15568 update pkgs to 1.13.4 2024-07-16 05:48:45 +00:00
123 changed files with 9809 additions and 2088 deletions

1
.gitignore vendored

@@ -30,6 +30,7 @@ copyparty/res/COPYING.txt
copyparty/web/deps/
srv/
scripts/docker/i/
scripts/deps-docker/uncomment.py
contrib/package/arch/pkg/
contrib/package/arch/src/

24
.vscode/settings.json vendored

@@ -22,6 +22,9 @@
"terminal.ansiBrightCyan": "#9cf0ed",
"terminal.ansiBrightWhite": "#ffffff",
},
"python.terminal.activateEnvironment": false,
"python.analysis.enablePytestSupport": false,
"python.analysis.typeCheckingMode": "standard",
"python.testing.pytestEnabled": false,
"python.testing.unittestEnabled": true,
"python.testing.unittestArgs": [
@@ -31,23 +34,8 @@
"-p",
"test_*.py"
],
"python.linting.pylintEnabled": true,
"python.linting.flake8Enabled": true,
"python.linting.banditEnabled": true,
"python.linting.mypyEnabled": true,
"python.linting.flake8Args": [
"--max-line-length=120",
"--ignore=E722,F405,E203,W503,W293,E402,E501,E128,E226",
],
"python.linting.banditArgs": [
"--ignore=B104,B110,B112"
],
// python3 -m isort --py=27 --profile=black copyparty/
"python.formatting.provider": "none",
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter"
},
"editor.formatOnSave": true,
// python3 -m isort --py=27 --profile=black ~/dev/copyparty/{copyparty,tests}/*.py && python3 -m black -t py27 ~/dev/copyparty/{copyparty,tests,bin}/*.py $(find ~/dev/copyparty/copyparty/stolen -iname '*.py')
"editor.formatOnSave": false,
"[html]": {
"editor.formatOnSave": false,
"editor.autoIndent": "keep",
@@ -58,6 +46,4 @@
"files.associations": {
"*.makefile": "makefile"
},
"python.linting.enabled": true,
"python.pythonPath": "/usr/bin/python3"
}

208
README.md

@@ -1,4 +1,6 @@
# 💾🎉 copyparty
<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>
### 💾🎉 copyparty
turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser
@@ -41,7 +43,9 @@ turn almost any device into a file server with resumable uploads/downloads using
* [unpost](#unpost) - undo/delete accidental uploads
* [self-destruct](#self-destruct) - uploads can be given a lifetime
* [race the beam](#race-the-beam) - download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm))
* [incoming files](#incoming-files) - the control-panel shows the ETA for all incoming files
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [shares](#shares) - share a file or folder by creating a temporary link
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
* [media player](#media-player) - plays almost every audio format there is
* [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
@@ -62,7 +66,8 @@ turn almost any device into a file server with resumable uploads/downloads using
* [smb server](#smb-server) - unsafe, slow, not recommended for wan
* [browser ux](#browser-ux) - tweaking the ui
* [opengraph](#opengraph) - discord and social-media embeds
* [file indexing](#file-indexing) - enables dedup and music search ++
* [file deduplication](#file-deduplication) - enable symlink-based upload deduplication
* [file indexing](#file-indexing) - enable music search, upload-undo, and better dedup
* [exclude-patterns](#exclude-patterns) - to save some time
* [filesystem guards](#filesystem-guards) - avoid traversing into other filesystems
* [periodic rescan](#periodic-rescan) - filesystem monitoring
@@ -76,6 +81,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [upload events](#upload-events) - the older, more powerful approach ([examples](./bin/mtag/))
* [handlers](#handlers) - redefine behavior with plugins ([examples](./bin/handlers/))
* [identity providers](#identity-providers) - replace copyparty passwords with oauth and such
* [user-changeable passwords](#user-changeable-passwords) - if permitted, users can change their own passwords
* [using the cloud as storage](#using-the-cloud-as-storage) - connecting to an aws s3 bucket and similar
* [hiding from google](#hiding-from-google) - tell search engines you dont wanna be indexed
* [themes](#themes)
@@ -85,6 +91,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [prometheus](#prometheus) - metrics/stats can be enabled
* [other extremely specific features](#other-extremely-specific-features) - you'll never find a use for these
* [custom mimetypes](#custom-mimetypes) - change the association of a file extension
* [feature chickenbits](#feature-chickenbits) - buggy feature? rip it out
* [packages](#packages) - the party might be closer than you think
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
* [fedora package](#fedora-package) - does not exist yet
@@ -111,6 +118,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
* [dependencies](#dependencies) - mandatory deps
* [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [dependency chickenbits](#dependency-chickenbits) - prevent loading an optional dependency
* [optional gpl stuff](#optional-gpl-stuff)
* [sfx](#sfx) - the self-contained "binary" (recommended!)
* [copyparty.exe](#copypartyexe) - download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
@@ -124,7 +132,7 @@ turn almost any device into a file server with resumable uploads/downloads using
just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
* or install through pypi: `python3 -m pip install --user -U copyparty`
* or install through [pypi](https://pypi.org/project/copyparty/): `python3 -m pip install --user -U copyparty`
* or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
* or install [on arch](#arch-package) [on NixOS](#nixos-module) [through nix](#nix-package)
* or if you are on android, [install copyparty in termux](#install-on-android)
@@ -194,7 +202,7 @@ firewall-cmd --reload
also see [comparison to similar software](./docs/versus.md)
* backend stuff
* ☑ IPv6
* ☑ IPv6 + unix-sockets
* ☑ [multiprocessing](#performance) (actual multithreading)
* ☑ volumes (mountpoints)
* ☑ [accounts](#accounts-and-volumes)
@@ -209,7 +217,7 @@ also see [comparison to similar software](./docs/versus.md)
* upload
* ☑ basic: plain multipart, ie6 support
* ☑ [up2k](#uploading): js, resumable, multithreaded
* unaffected by cloudflare's max-upload-size (100 MiB)
* **no filesize limit!** ...unless you use Cloudflare, then it's 383.9 GiB
* ☑ stash: simple PUT filedropper
* ☑ filename randomizer
* ☑ write-only folders
@@ -225,6 +233,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ [navpane](#navpane) (directory tree sidebar)
* ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus/mp3 transcoding)
* ☑ play video files as audio (converted on server)
* ☑ image gallery with webm player
* ☑ textfile browser with syntax highlighting
* ☑ [thumbnails](#thumbnails)
@@ -232,7 +241,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ ...of videos using FFmpeg
* ☑ ...of audio (spectrograms) using FFmpeg
* ☑ cache eviction (max-age; maybe max-size eventually)
* ☑ multilingual UI (english, norwegian, [add your own](./docs/rice/#translations)))
* ☑ multilingual UI (english, norwegian, chinese, [add your own](./docs/rice/#translations)))
* ☑ SPA (browse while uploading)
* server indexing
* ☑ [locate files by contents](#file-search)
@@ -578,9 +587,6 @@ images with the following names (see `--th-covers`) become the thumbnail of the
* the order is significant, so if both `cover.png` and `folder.jpg` exist in a folder, it will pick the first matching `--th-covers` entry (`folder.jpg`)
* and, if you enable [file indexing](#file-indexing), it will also try those names as dotfiles (`.folder.jpg` and so), and then fallback on the first picture in the folder (if it has any pictures at all)
in the grid/thumbnail view, if the audio player panel is open, songs will start playing when clicked
* indicated by the audio files having the ▶ icon instead of 💾
enabling `multiselect` lets you click files to select them, and then shift-click another file for range-select
* `multiselect` is mostly intended for phones/tablets, but the `sel` option in the `[⚙️] settings` tab is better suited for desktop use, allowing selection by CTRL-clicking and range-selection with SHIFT-click, all without affecting regular clicking
* the `sel` option can be made default globally with `--gsel` or per-volume with volflag `gsel`
@@ -646,6 +652,7 @@ up2k has several advantages:
* uploads resume if you reboot your browser or pc, just upload the same files again
* server detects any corruption; the client reuploads affected chunks
* the client doesn't upload anything that already exists on the server
* no filesize limit unless imposed by a proxy, for example Cloudflare, which blocks uploads over 383.9 GiB
* much higher speeds than ftp/scp/tarpipe on some internet connections (mainly american ones) thanks to parallel connections
* the last-modified timestamp of the file is preserved
@@ -725,6 +732,13 @@ download files while they're still uploading ([demo video](http://a.ocv.me/pub/g
requires the file to be uploaded using up2k (which is the default drag-and-drop uploader), alternatively the command-line program
### incoming files
the control-panel shows the ETA for all incoming files, but only for files being uploaded into volumes where you have read-access
![copyparty-cpanel-upload-eta-or8](https://github.com/user-attachments/assets/fd275ffa-698c-4fca-a307-4d2181269a6a)
## file manager
cut/paste, rename, and delete files/folders (if you have permission)
@@ -743,6 +757,42 @@ file selection: click somewhere on the line (not the link itself), then:
you can move files across browser tabs (cut in one tab, paste in another)
## shares
share a file or folder by creating a temporary link
when enabled in the server settings (`--shr`), click the bottom-right `share` button to share the folder you're currently in, or alternatively:
* select a folder first to share that folder instead
* select one or more files to share only those files
this feature was made with [identity providers](#identity-providers) in mind -- configure your reverseproxy to skip the IdP's access-control for a given URL prefix and use that to safely share specific files/folders sans the usual auth checks
when creating a share, the creator can choose any of the following options:
* password-protection
* expire after a certain time; `0` or blank means infinite
* allow visitors to upload (if the user who creates the share has write-access)
semi-intentional limitations:
* cleanup of expired shares only works when global option `e2d` is set, and/or at least one volume on the server has volflag `e2d`
* only folders from the same volume are shared; if you are sharing a folder which contains other volumes, then the contents of those volumes will not be available
* no option to "delete after first access" because tricky
* when linking something to discord (for example) it'll get accessed by their scraper and that would count as a hit
* browsers wouldn't be able to resume a broken download unless the requester's IP gets allowlisted for X minutes (ref. tricky)
specify `--shr /foobar` to enable this feature; a toplevel virtual folder named `foobar` is then created, and that's where all the shares will be served from
* you can name it whatever, `foobar` is just an example
* if you're using config files, put `shr: /foobar` inside the `[global]` section instead
users can delete their own shares in the controlpanel, and a list of privileged users (`--shr-adm`) are allowed to see and/or delete any share on the server
after a share has expired, it remains visible in the controlpanel for `--shr-rt` minutes (default is 1 day), and the owner can revive it by extending the expiration time there
**security note:** using this feature does not mean that you can skip the [accounts and volumes](#accounts-and-volumes) section -- you still need to restrict access to volumes that you do not intend to share with unauthenticated users! it is not sufficient to use rules in the reverseproxy to restrict access to just the `/share` folder.
## batch rename
select some files and press `F2` to bring up the rename UI
@@ -800,6 +850,7 @@ some hilights:
* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
* shows the audio waveform in the seekbar
* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
* videos can be played as audio, without wasting bandwidth on the video
click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
@@ -987,7 +1038,7 @@ some recommended FTP / FTPS clients; `wark` = example password:
## webdav server
with read-write support, supports winXP and later, macos, nautilus/gvfs
with read-write support, supports winXP and later, macos, nautilus/gvfs ... a great way to [access copyparty straight from the file explorer in your OS](#mount-as-drive)
click the [connect](http://127.0.0.1:3923/?hc) button in the control-panel to see connection instructions for windows, linux, macos
@@ -1064,12 +1115,12 @@ some **BIG WARNINGS** specific to SMB/CIFS, in decreasing importance:
* [shadowing](#shadowing) probably works as expected but no guarantees
and some minor issues,
* clients only see the first ~400 files in big folders; [impacket#1433](https://github.com/SecureAuthCorp/impacket/issues/1433)
* clients only see the first ~400 files in big folders;
* this was originally due to [impacket#1433](https://github.com/SecureAuthCorp/impacket/issues/1433) which was fixed in impacket-0.12, so you can disable the workaround with `--smb-nwa-1` but then you get unacceptably poor performance instead
* hot-reload of server config (`/?reload=cfg`) does not include the `[global]` section (commandline args)
* listens on the first IPv4 `-i` interface only (default = :: = 0.0.0.0 = all)
* login doesn't work on winxp, but anonymous access is ok -- remove all accounts from copyparty config for that to work
* win10 onwards does not allow connecting anonymously / without accounts
* on windows, creating a new file through rightclick --> new --> textfile throws an error due to impacket limitations -- hit OK and F5 to get your file
* python3 only
* slow (the builtin webdav support in windows is 5x faster, and rclone-webdav is 30x faster)
@@ -1108,14 +1159,44 @@ note that this disables hotlinking because the opengraph spec demands it; to sne
you can also hotlink files regardless by appending `?raw` to the url
NOTE: because discord (and maybe others) strip query args such as `?raw` in opengraph tags, any links which require a filekey or dirkey will not work
if you want to entirely replace the copyparty response with your own jinja2 template, give the template filepath to `--og-tpl` or volflag `og_tpl` (all members of `HttpCli` are available through the `this` object)
## file deduplication
enable symlink-based upload deduplication globally with `--dedup` or per-volume with volflag `dedup`
when someone tries to upload a file that already exists on the server, the upload will be politely declined and a symlink is created instead, pointing to the nearest copy on disk, thus reducing disk space usage
**warning:** when enabling dedup, you should also:
* enable indexing with `-e2dsa` or volflag `e2dsa` (see [file indexing](#file-indexing) section below); strongly recommended
* ...and/or `--hardlink-only` to use hardlink-based deduplication instead of symlinks; see explanation below
it will not be safe to rename/delete files if you only enable dedup and none of the above; if you enable indexing then it is not *necessary* to also do hardlinks (but you may still want to)
by default, deduplication is done based on symlinks (symbolic links); these are tiny files which are pointers to the nearest full copy of the file
you can choose to use hardlinks instead of softlinks, globally with `--hardlink-only` or volflag `hardlinkonly`;
advantages of using hardlinks:
* hardlinks are more compatible with other software; they behave entirely like regular files
* you can safely move and rename files using other file managers
* symlinks need to be managed by copyparty to ensure the destinations remain correct
advantages of using symlinks (default):
* each symlink can have its own last-modified timestamp, but a single timestamp is shared by all hardlinks
* symlinks make it more obvious to other software that the file is not a regular file, so this can be less dangerous
* hardlinks look like regular files, so other software may assume they are safe to edit without affecting the other copies
**warning:** if you edit the contents of a deduplicated file, then you will also edit all other copies of that file! This is especially surprising with hardlinks, because they look like regular files, but that same file exists in multiple locations
global-option `--xlink` / volflag `xlink` additionally enables deduplication across volumes, but this is probably buggy and not recommended
## file indexing
enables dedup and music search ++
enable music search, upload-undo, and better dedup
file indexing relies on two database tables, the up2k filetree (`-e2d`) and the metadata tags (`-e2t`), stored in `.hist/up2k.db`. Configuration can be done through arguments, volflags, or a mix of both.
@@ -1129,7 +1210,6 @@ through arguments:
* `-e2v` verifies file integrity at startup, comparing hashes from the db
* `-e2vu` patches the database with the new hashes from the filesystem
* `-e2vp` panics and kills copyparty instead
* `--xlink` enables deduplication across volumes
the same arguments can be set as volflags, in addition to `d2d`, `d2ds`, `d2t`, `d2ts`, `d2v` for disabling:
* `-v ~/music::r:c,e2ds,e2tsr` does a full reindex of everything on startup
@@ -1142,7 +1222,6 @@ note:
* upload-times can be displayed in the file listing by enabling the `.up_at` metadata key, either globally with `-e2d -mte +.up_at` or per-volume with volflags `e2d,mte=+.up_at` (will have a ~17% performance impact on directory listings)
* `e2tsr` is probably always overkill, since `e2ds`/`e2dsa` would pick up any file modifications and `e2ts` would then reindex those, unless there is a new copyparty version with new parsers and the release note says otherwise
* the rescan button in the admin panel has no effect unless the volume has `-e2ds` or higher
* deduplication is possible on windows if you run copyparty as administrator (not saying you should!)
### exclude-patterns
@@ -1311,6 +1390,8 @@ you can set hooks before and/or after an event happens, and currently you can ho
there's a bunch of flags and stuff, see `--help-hooks`
if you want to write your own hooks, see [devnotes](./docs/devnotes.md#event-hooks)
### upload events
@@ -1351,6 +1432,29 @@ there is a [docker-compose example](./docs/examples/docker/idp-authelia-traefik)
a more complete example of the copyparty configuration options [look like this](./docs/examples/docker/idp/copyparty.conf)
but if you just want to let users change their own passwords, then you probably want [user-changeable passwords](#user-changeable-passwords) instead
## user-changeable passwords
if permitted, users can change their own passwords in the control-panel
* not compatible with [identity providers](#identity-providers)
* must be enabled with `--chpw` because account-sharing is a popular usecase
* if you want to enable the feature but deny password-changing for a specific list of accounts, you can do that with `--chpw-no name1,name2,name3,...`
* to perform a password reset, edit the server config and give the user another password there, then do a [config reload](#server-config) or server restart
* the custom passwords are kept in a textfile at filesystem-path `--chpw-db`, by default `chpw.json` in the copyparty config folder
* if you run multiple copyparty instances with different users you *almost definitely* want to specify separate DBs for each instance
* if [password hashing](#password-hashing) is enabled, the passwords in the db are also hashed
* ...which means that all user-defined passwords will be forgotten if you change password-hashing settings
## using the cloud as storage
@@ -1451,10 +1555,14 @@ you can either:
* or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
when running behind a reverse-proxy (this includes services like cloudflare), it is important to configure real-ip correctly, as many features rely on knowing the client's IP. Look out for red and yellow log messages which explain how to do this. But basically, set `--xff-hdr` to the name of the http header to read the IP from (usually `x-forwarded-for`, but cloudflare uses `cf-connecting-ip`), and then `--xff-src` to the IP of the reverse-proxy so copyparty will trust the xff-hdr. Note that `--rp-loc` in particular will not work at all unless you do this
some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which *could* be a nice speed boost, depending on a lot of factors
* **warning:** nginx-QUIC (HTTP/3) is still experimental and can make uploads much slower, so HTTP/1.1 is recommended for now
* depending on server/client, HTTP/1.1 can also be 5x faster than HTTP/2
for improved security (and a 10% performance boost) consider listening on a unix-socket with `-i unix:770:www:/tmp/party.sock` (permission `770` means only members of group `www` can access it)
example webserver configs:
* [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain
@@ -1555,6 +1663,23 @@ in a config-file, this is the same as:
run copyparty with `--mimes` to list all the default mappings
### feature chickenbits
buggy feature? rip it out by setting any of the following environment variables to disable its associated bell or whistle,
| env-var | what it does |
| -------------------- | ------------ |
| `PRTY_NO_IFADDR` | disable ip/nic discovery by poking into your OS with ctypes |
| `PRTY_NO_IPV6` | disable some ipv6 support (should not be necessary since windows 2000) |
| `PRTY_NO_LZMA` | disable streaming xz compression of incoming uploads |
| `PRTY_NO_MP` | disable all use of the python `multiprocessing` module (actual multithreading, cpu-count for parsers/thumbnailers) |
| `PRTY_NO_SQLITE` | disable all database-related functionality (file indexing, metadata indexing, most file deduplication logic) |
| `PRTY_NO_TLS` | disable native HTTPS support; if you still want to accept HTTPS connections then TLS must now be terminated by a reverse-proxy |
| `PRTY_NO_TPOKE` | disable systemd-tmpfilesd avoider |
example: `PRTY_NO_IFADDR=1 python3 copyparty-sfx.py`
# packages
the party might be closer than you think
@@ -1758,10 +1883,12 @@ interact with copyparty using non-browser clients
* FUSE: mount a copyparty server as a local filesystem
* cross-platform python client available in [./bin/](bin/)
* able to mount nginx and iis directory listings too, not just copyparty
* can be downloaded from copyparty: controlpanel -> connect -> [partyfuse.py](http://127.0.0.1:3923/.cpr/a/partyfuse.py)
* [rclone](https://rclone.org/) as client can give ~5x performance, see [./docs/rclone.md](docs/rclone.md)
* sharex (screenshot utility): see [./contrib/sharex.sxcu](contrib/#sharexsxcu)
* and for screenshots on linux, see [./contrib/flameshot.sh](./contrib/flameshot.sh)
* contextlet (web browser integration); see [contrib contextlet](contrib/#send-to-cppcontextletjson)
@@ -1796,9 +1923,9 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE (rclone v1.63 or later)
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [partyfuse.py](./bin/#partyfusepy) (26s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
* davfs2 (103s), read/WRITE, *very fast* on small files
* davfs2 (103s), read/WRITE
* [win10-webdav](#webdav-server) (138s), read/WRITE
* [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1838,6 +1965,9 @@ below are some tweaks roughly ordered by usefulness:
* `-q` disables logging and can help a bunch, even when combined with `-lo` to redirect logs to file
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* and also makes thumbnails load faster, regardless of e2d/e2t
* `--dedup` enables deduplication and thus avoids writing to the HDD if someone uploads a dupe
* `--safe-dedup 1` makes deduplication much faster during upload by skipping verification of file contents; safe if there is no other software editing/moving the files in the volumes
* `--no-dirsz` shows the size of folder inodes instead of the total size of the contents, giving about 30% faster folder listings
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* if your volumes are on a network-disk such as NFS / SMB / s3, specifying larger values for `--iobuf` and/or `--s-rd-sz` and/or `--s-wr-sz` may help; try setting all of them to `524288` or `1048576` or `4194304`
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
@@ -1877,6 +2007,7 @@ some notes on hardening
* cors doesn't work right otherwise
* if you allow anonymous uploads or otherwise don't trust the contents of a volume, you can prevent XSS with volflag `nohtml`
* this returns html documents as plaintext, and also disables markdown rendering
* when running behind a reverse-proxy, listen on a unix-socket for tighter access control (and more performance); see [reverse-proxy](#reverse-proxy) or `--help-bind`
safety profiles:
@@ -1891,7 +2022,7 @@ safety profiles:
* `--hardlink` creates hardlinks instead of symlinks when deduplicating uploads, which is less maintenance
* however note if you edit one file it will also affect the other copies
* `--vague-403` returns a "404 not found" instead of "401 unauthorized" which is a common enterprise meme
* `--nih` removes the server hostname from directory listings
* `-nih` removes the server hostname from directory listings
* option `-sss` is a shortcut for the above plus:
* `--no-dav` disables webdav support
@@ -1954,6 +2085,8 @@ volflag `dky` disables the actual key-check, meaning anyone can see the contents
volflag `dks` lets people enter subfolders as well, and also enables download-as-zip/tar
if you enable dirkeys, it is probably a good idea to enable filekeys too, otherwise it will be impossible to hotlink files from a folder which was accessed using a dirkey
dirkeys are generated based on another salt (`--dk-salt`) + filesystem-path and have a few limitations:
* the key does not change if the contents of the folder is modified
* if you need a new dirkey, either change the salt or rename the folder
@@ -2036,11 +2169,42 @@ enable [thumbnails](#thumbnails) of...
* **JPEG XL pictures:** `pyvips` or `ffmpeg`
enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.11.0`
* `impacket==0.12.0`
`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`
### dependency chickenbits
prevent loading an optional dependency, for example if:
* you have an incompatible version installed and it causes problems
* you just don't want copyparty to use it, maybe to save ram
set any of the following environment variables to disable the associated optional feature:
| env-var | what it does |
| -------------------- | ------------ |
| `PRTY_NO_ARGON2` | disable argon2-cffi password hashing |
| `PRTY_NO_CFSSL` | never attempt to generate self-signed certificates using [cfssl](https://github.com/cloudflare/cfssl) |
| `PRTY_NO_FFMPEG` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips |
| `PRTY_NO_FFPROBE` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips, **metadata-scanning** must be handled by mutagen |
| `PRTY_NO_MUTAGEN` | do not use [mutagen](https://pypi.org/project/mutagen/) for reading metadata from media files; will fallback to ffprobe |
| `PRTY_NO_PIL` | disable all [Pillow](https://pypi.org/project/pillow/)-based thumbnail support; will fallback to libvips or ffmpeg |
| `PRTY_NO_PILF` | disable Pillow `ImageFont` text rendering, used for folder thumbnails |
| `PRTY_NO_PIL_AVIF` | disable 3rd-party Pillow plugin for [AVIF support](https://pypi.org/project/pillow-avif-plugin/) |
| `PRTY_NO_PIL_HEIF` | disable 3rd-party Pillow plugin for [HEIF support](https://pypi.org/project/pyheif-pillow-opener/) |
| `PRTY_NO_PIL_WEBP` | disable use of native webp support in Pillow |
| `PRTY_NO_PSUTIL` | do not use [psutil](https://pypi.org/project/psutil/) for reaping stuck hooks and plugins on Windows |
| `PRTY_NO_VIPS` | disable all [libvips](https://pypi.org/project/pyvips/)-based thumbnail support; will fallback to Pillow or ffmpeg |
example: `PRTY_NO_PIL=1 python3 copyparty-sfx.py`
* `PRTY_NO_PIL` saves ram
* `PRTY_NO_VIPS` saves ram and startup time
* python2.7 on windows: `PRTY_NO_FFMPEG` + `PRTY_NO_FFPROBE` saves startup time
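for reference, each chickenbit is presumably just a guard around the optional import -- the same pattern that is visible for `PRTY_NO_TLS` in the `copyparty/__main__.py` hunk further down; a minimal sketch with illustrative names (not copyparty's actual internals):

```python
import os

try:
    if os.environ.get("PRTY_NO_PIL"):
        # chickenbit set -> pretend the dependency is missing
        raise Exception("disabled by env-var")
    from PIL import Image  # noqa: F401
    HAVE_PIL = True
except Exception:
    # either the env-var was set or Pillow is not installed;
    # thumbnailing falls back to libvips or ffmpeg
    HAVE_PIL = False
```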
## optional gpl stuff
some bundled tools have copyleft dependencies, see [./bin/#mtag](bin/#mtag)
@@ -2079,7 +2243,7 @@ then again, if you are already into downloading shady binaries from the internet
## zipapp
another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) has less features, requires python 3.7 or newer, worse compression, and more importantly is unable to benefit from more recent versions of jinja2 and such (which makes it less secure)... lots of drawbacks with this one really -- but it *may* just work if the regular sfx fails to start because the computer is messed up in certain funky ways, so it's worth a shot if all else fails
another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) has fewer features, is slow, requires python 3.7 or newer, has worse compression, and more importantly is unable to benefit from more recent versions of jinja2 and such (which makes it less secure)... lots of drawbacks with this one really -- but it does not unpack any temporary files to disk, so it *may* just work if the regular sfx fails to start because the computer is messed up in certain funky ways, so it's worth a shot if all else fails
run it by doubleclicking it, or try typing `python copyparty.pyz` in your terminal/console/commandline/telex if that fails

View File

@@ -15,22 +15,18 @@ produces a chronological list of all uploads by collecting info from up2k databa
# [`partyfuse.py`](partyfuse.py)
* mount a copyparty server as a local filesystem (read-only)
* **supports Windows!** -- expect `194 MiB/s` sequential read
* **supports Linux** -- expect `117 MiB/s` sequential read
* **supports Linux** -- expect `600 MiB/s` sequential read
* **supports macos** -- expect `85 MiB/s` sequential read
filecache is default-on for windows and macos;
* macos readsize is 64kB, so speed ~32 MiB/s without the cache
* windows readsize varies by software; explorer=1M, pv=32k
note that copyparty should run with `-ed` to enable dotfiles (hidden otherwise)
also consider using [../docs/rclone.md](../docs/rclone.md) instead for 5x performance
and consider using [../docs/rclone.md](../docs/rclone.md) instead; usually a bit faster, especially on windows
## to run this on windows:
* install [winfsp](https://github.com/billziss-gh/winfsp/releases/latest) and [python 3](https://www.python.org/downloads/)
* [x] add python 3.x to PATH (it asks during install)
* `python -m pip install --user fusepy`
* `python -m pip install --user fusepy` (or grab a copy of `fuse.py` from the `connect` page on your copyparty, and keep it in the same folder)
* `python ./partyfuse.py n: http://192.168.1.69:3923/`
10% faster in [msys2](https://www.msys2.org/), 700% faster if debug prints are enabled:

View File

@@ -2,7 +2,7 @@ standalone programs which are executed by copyparty when an event happens (uploa
these programs either take zero arguments, or a filepath (the affected file), or a json message with filepath + additional info
run copyparty with `--help-hooks` for usage details / hook type explanations (xbu/xau/xiu/xbr/xar/xbd/xad)
run copyparty with `--help-hooks` for usage details / hook type explanations (xm/xbu/xau/xiu/xbr/xar/xbd/xad/xban)
> **note:** in addition to event hooks (the stuff described here), copyparty has another api to run your programs/scripts while providing way more information such as audio tags / video codecs / etc and optionally daisychaining data between scripts in a processing pipeline; if that's what you want then see [mtp plugins](../mtag/) instead
@@ -13,6 +13,7 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xb
* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable
* [into-the-cache-it-goes.py](into-the-cache-it-goes.py) avoids bugs in caching proxies by immediately downloading each file that is uploaded
# upload batches
@@ -23,6 +24,7 @@ these are `--xiu` hooks; unlike `xbu` and `xau` (which get executed on every sin
# before upload
* [reject-extension.py](reject-extension.py) rejects uploads if they match a list of file extensions
* [reloc-by-ext.py](reloc-by-ext.py) redirects an upload to another destination based on the file extension
# on message

View File

@@ -0,0 +1,140 @@
#!/usr/bin/env python3
import sys
import json
import shutil
import platform
import subprocess as sp
from urllib.parse import quote
_ = r"""
try to avoid race conditions in caching proxies
(primarily cloudflare, but probably others too)
by means of the most obvious solution possible:
just as each file has finished uploading, use
the server's external URL to download the file
so that it ends up in the cache, warm and snug
this intentionally delays the upload response
as it waits for the file to finish downloading
before copyparty is allowed to return the URL
NOTE: you must edit this script before use,
replacing https://example.com with your URL
NOTE: if the files are only accessible with a
password and/or filekey, you must also add
a cromulent password in the PASSWORD field
NOTE: needs either wget, curl, or "requests":
python3 -m pip install --user -U requests
example usage as global config:
--xau j,t10,bin/hooks/into-the-cache-it-goes.py
parameters explained,
xau = execute after upload
j = this hook needs upload information as json (not just the filename)
t10 = abort download and continue if it takes longer than 10sec
example usage as a volflag (per-volume config):
-v srv/inc:inc:r:rw,ed:c,xau=j,t10,bin/hooks/into-the-cache-it-goes.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with params explained above)
example usage as a volflag in a copyparty config file:
[/inc]
srv/inc
accs:
r: *
rw: ed
flags:
xau: j,t10,bin/hooks/into-the-cache-it-goes.py
"""
# replace this with your site's external URL
# (including the :portnumber if necessary)
SITE_URL = "https://example.com"
# if downloading is protected by passwords or filekeys,
# specify a valid password between the quotes below:
PASSWORD = ""
# if file is larger than this, skip download
MAX_MEGABYTES = 8
# =============== END OF CONFIG ===============
WINDOWS = platform.system() == "Windows"
def main():
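# pick a downloader: prefer curl, then wget, otherwise fall back to the requests module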
fun = download_with_python
if shutil.which("curl"):
fun = download_with_curl
elif shutil.which("wget"):
fun = download_with_wget
inf = json.loads(sys.argv[1])
if inf["sz"] > 1024 * 1024 * MAX_MEGABYTES:
print("[into-the-cache] file is too large; will not download")
return
file_url = "/"
if inf["vp"]:
file_url += inf["vp"] + "/"
file_url += inf["ap"].replace("\\", "/").split("/")[-1]
file_url = SITE_URL.rstrip("/") + quote(file_url, safe=b"/")
print("[into-the-cache] %s(%s)" % (fun.__name__, file_url))
fun(file_url, PASSWORD.strip())
print("[into-the-cache] Download OK")
def download_with_curl(url, pw):
cmd = ["curl"]
if pw:
cmd += ["-HPW:%s" % (pw,)]
nah = sp.DEVNULL
sp.check_call(cmd + [url], stdout=nah, stderr=nah)
def download_with_wget(url, pw):
cmd = ["wget", "-O"]
cmd += ["nul" if WINDOWS else "/dev/null"]
if pw:
cmd += ["--header=PW:%s" % (pw,)]
nah = sp.DEVNULL
sp.check_call(cmd + [url], stdout=nah, stderr=nah)
def download_with_python(url, pw):
import requests
headers = {}
if pw:
headers["PW"] = pw
with requests.get(url, headers=headers, stream=True) as r:
r.raise_for_status()
for _ in r.iter_content(chunk_size=1024 * 256):
pass
if __name__ == "__main__":
main()

View File

@@ -23,17 +23,18 @@ because the keyword "anime" is in the DESTS config below
needs python3
example usage as global config (not a good idea):
python copyparty-sfx.py --xm f,j,t60,bin/hooks/qbittorrent-magnet.py
python copyparty-sfx.py --xm aw,f,j,t60,bin/hooks/qbittorrent-magnet.py
parameters explained,
xm = execute on message (📟)
aw = only users with write-access can use this
f = fork; don't delay other hooks while this is running
j = provide message information as json (not just the text)
t60 = abort if qbittorrent has to think about it for more than 1 min
example usage as a volflag (per-volume config, much better):
-v srv/qb:qb:A,ed:c,xm=f,j,t60,bin/hooks/qbittorrent-magnet.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-v srv/qb:qb:A,ed:c,xm=aw,f,j,t60,bin/hooks/qbittorrent-magnet.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/qb as volume /qb with Admin for user 'ed',
running this plugin on all messages with the params explained above)
@@ -44,7 +45,7 @@ example usage as a volflag in a copyparty config file:
accs:
A: ed
flags:
xm: f,j,t60,bin/hooks/qbittorrent-magnet.py
xm: aw,f,j,t60,bin/hooks/qbittorrent-magnet.py
the volflag examples only kick in if you send the torrent magnet
while you're in the /qb folder (or any folder below there)

bin/hooks/reloc-by-ext.py Normal file
View File

@@ -0,0 +1,127 @@
#!/usr/bin/env python3
import json
import os
import re
import sys
_ = r"""
relocate/redirect incoming uploads according to file extension or name
example usage as global config:
--xbu j,c1,bin/hooks/reloc-by-ext.py
parameters explained,
xbu = execute before upload
j = this hook needs upload information as json (not just the filename)
c1 = this hook returns json on stdout, so tell copyparty to read that
example usage as a volflag (per-volume config):
-v srv/inc:inc:r:rw,ed:c,xbu=j,c1,bin/hooks/reloc-by-ext.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params explained above)
example usage as a volflag in a copyparty config file:
[/inc]
srv/inc
accs:
r: *
rw: ed
flags:
xbu: j,c1,bin/hooks/reloc-by-ext.py
note: this could also work as an xau hook (after-upload), but
because it doesn't need to read the file contents it's better
as xbu (before-upload) since that's safer / less buggy,
and only xbu works with up2k (dragdrop into browser)
"""
PICS = "avif bmp gif heic heif jpeg jpg jxl png psd qoi tga tif tiff webp"
VIDS = "3gp asf avi flv mkv mov mp4 mpeg mpeg2 mpegts mpg mpg2 nut ogm ogv rm ts vob webm wmv"
MUSIC = "aac aif aiff alac amr ape dfpwm flac m4a mp3 ogg opus ra tak tta wav wma wv"
def main():
inf = json.loads(sys.argv[1])
vdir, fn = os.path.split(inf["vp"])
try:
fn, ext = fn.rsplit(".", 1)
except:
# no file extension; pretend it's "bin"
ext = "bin"
ext = ext.lower()
# this function must end by printing the action to perform;
# that's handled by the print(json.dumps(... at the bottom
#
# the action can contain the following keys:
# "vp" is the folder URL to move the upload to,
# "ap" is the filesystem-path to move it to (but "vp" is safer),
# "fn" overrides the final filename to use
##
## some example actions to take; pick one by
## selecting it inside the print at the end:
##
# create a subfolder named after the filetype and move it into there
into_subfolder = {"vp": ext}
# move it into a toplevel folder named after the filetype
into_toplevel = {"vp": "/" + ext}
# move it into a filetype-named folder next to the target folder
into_sibling = {"vp": "../" + ext}
# move images into "/just/pics", vids into "/just/vids",
# music into "/just/tunes", and anything else as-is
if ext in PICS.split():
by_category = {"vp": "/just/pics"}
elif ext in VIDS.split():
by_category = {"vp": "/just/vids"}
elif ext in MUSIC.split():
by_category = {"vp": "/just/tunes"}
else:
by_category = {} # no action
# now choose the default effect to apply; can be any of these:
# into_subfolder into_toplevel into_sibling by_category
effect = {"vp": "/junk"}
##
## but we can keep going, adding more specific rules
## which can take precedence, replacing the fallback
## effect we just specified:
##
fn = fn.lower() # lowercase filename to make this easier
if "screenshot" in fn:
effect = {"vp": "/ss"}
if "mpv_" in fn:
effect = {"vp": "/anishots"}
elif "debian" in fn or "biebian" in fn:
effect = {"vp": "/linux-ISOs"}
elif re.search(r"ep(isode |\.)?[0-9]", fn):
effect = {"vp": "/podcasts"}
# regex lets you grab a part of the matching
# text and use that in the upload path:
m = re.search(r"\b(op|ed)([^a-z]|$)", fn)
if m:
# the regex matched; use "anime-op" or "anime-ed"
effect = {"vp": "/anime-" + m[1]}
# aaand DO IT
print(json.dumps({"reloc": effect}))
if __name__ == "__main__":
main()

View File

@@ -12,18 +12,19 @@ application/x-www-form-urlencoded (for example using the
📟 message-to-server-log in the web-ui)
example usage as global config:
--xm f,j,t3600,bin/hooks/wget.py
--xm aw,f,j,t3600,bin/hooks/wget.py
parameters explained,
xm = execute on message-to-server-log
aw = only users with write-access can use this
f = fork; don't delay other hooks while this is running
j = provide message information as json (not just the text)
c3 = mute all output
t3600 = timeout and abort download after 1 hour
example usage as a volflag (per-volume config):
-v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-v srv/inc:inc:r:rw,ed:c,xm=aw,f,j,t3600,bin/hooks/wget.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
@@ -36,7 +37,7 @@ example usage as a volflag in a copyparty config file:
r: *
rw: ed
flags:
xm: f,j,t3600,bin/hooks/wget.py
xm: aw,f,j,t3600,bin/hooks/wget.py
the volflag examples only kick in if you send the message
while you're in the /inc folder (or any folder below there)

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -19,6 +19,9 @@
* the `act:bput` thing is optional since copyparty v1.9.29
* using an older sharex version, maybe sharex v12.1.1 for example? dw fam i got your back 👉😎👉 [`sharex12.sxcu`](sharex12.sxcu)
### [`flameshot.sh`](flameshot.sh)
* takes a screenshot with [flameshot](https://flameshot.org/) on Linux, uploads it, and writes the URL to clipboard
### [`send-to-cpp.contextlet.json`](send-to-cpp.contextlet.json)
* browser integration, kind of? custom rightclick actions and stuff
* rightclick a pic and send it to copyparty straight from your browser

contrib/flameshot.sh Executable file
View File

@@ -0,0 +1,14 @@
#!/bin/bash
set -e
# take a screenshot with flameshot and send it to copyparty;
# the image url will be placed on your clipboard
password=wark
url=https://a.ocv.me/up/
filename=$(date +%Y-%m%d-%H%M%S).png
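# flameshot -r writes the png to stdout; curl -T- uploads stdin,
# and the last line of the server reply (the file URL) ends up on the clipboard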
flameshot gui -s -r |
curl -T- $url$filename?pw=$password |
tail -n 1 |
xsel -ib

View File

@@ -1,14 +1,10 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# -i 127.0.0.1 only accept connections from nginx
#
# -nc must match or exceed the webserver's max number of concurrent clients;
# copyparty default is 1024 if OS permits it (see "max clients:" on startup),
# look for "max clients:" when starting copyparty, as nginx should
# not accept more concurrent clients than what copyparty is able to;
# nginx default is 512 (worker_processes 1, worker_connections 512)
#
# you may also consider adding -j0 for CPU-intensive configurations
# (5'000 requests per second, or 20gbps upload/download in parallel)
# rarely, in some extreme usecases, it can be good to add -j0
# (40'000 requests per second, or 20gbps upload/download in parallel)
# but this is usually counterproductive and slightly buggy
#
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
#
@@ -20,10 +16,33 @@
#
# and then enable it below by uncommenting the cloudflare-only.conf line
upstream cpp {
upstream cpp_tcp {
# alternative 1: connect to copyparty using tcp;
# cpp_uds is slightly faster and more secure, but
# cpp_tcp is easier to setup and "just works"
# ...you should however restrict copyparty to only
# accept connections from nginx by adding these args:
# -i 127.0.0.1
server 127.0.0.1:3923 fail_timeout=1s;
keepalive 1;
}
upstream cpp_uds {
# alternative 2: unix-socket, aka. "unix domain socket";
# 5-10% faster, and better isolation from other software,
# but there must be at least one unix-group which both
# nginx and copyparty are members of; if that group is
# "www" then run copyparty with the following args:
# -i unix:770:www:/tmp/party.sock
server unix:/tmp/party.sock fail_timeout=1s;
keepalive 1;
}
server {
listen 443 ssl;
listen [::]:443 ssl;
@@ -34,7 +53,8 @@ server {
#include /etc/nginx/cloudflare-only.conf;
location / {
proxy_pass http://cpp;
# recommendation: replace cpp_tcp with cpp_uds below
proxy_pass http://cpp_tcp;
proxy_redirect off;
# disable buffering (next 4 lines)
proxy_http_version 1.1;
@@ -52,6 +72,7 @@ server {
}
}
# default client_max_body_size (1M) blocks uploads larger than 256 MiB
client_max_body_size 1024M;
client_header_timeout 610m;

View File

@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.13.3"
pkgver="1.15.3"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")
@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("35845d6335fba4a13d153d7062f365dad529202bc865b93267d899e19a0a6da3")
sha256sums=("d4b02a8d618749c317161773fdd3b66992557f682b7cccd8c4c8583497c4cb24")
build() {
cd "${srcdir}/${pkgname}-${pkgver}"

View File

@@ -1,5 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.13.3/copyparty-sfx.py",
"version": "1.13.3",
"hash": "sha256-LtbdioAYtWGC4+5frzUjXwm0thubkyMhc86YU/rXIuo="
"url": "https://github.com/9001/copyparty/releases/download/v1.15.3/copyparty-sfx.py",
"version": "1.15.3",
"hash": "sha256-OmoLwakVaZM9QwkujT4wwhqC5KcaS5u81DDTjS2r7MI="
}

View File

@@ -20,6 +20,13 @@ point `--js-browser` to one of these by URL:
## example any-js
point `--js-browser` and/or `--js-other` to one of these by URL:
* [`banner.js`](banner.js) shows a very enterprise [legal-banner](https://github.com/user-attachments/assets/8ae8e087-b209-449c-b08d-74e040f0284b)
## example browser-css
point `--css-browser` to one of these by URL:

contrib/plugins/banner.js Normal file
View File

@@ -0,0 +1,93 @@
(function() {
// usage: copy this to '.banner.js' in your webroot,
// and run copyparty with the following arguments:
// --js-browser /.banner.js --js-other /.banner.js
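// note: helpers used below (mknod, QS, ebi, sread/swrite, modal) are globals
// provided by the copyparty pages this script gets loaded into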
// had to pick the most chuuni one as the default
var bannertext = '' +
'<h3>You are accessing a U.S. Government (USG) Information System (IS) that is provided for USG-authorized use only.</h3>' +
'<p>By using this IS (which includes any device attached to this IS), you consent to the following conditions:</p>' +
'<ul>' +
'<li>The USG routinely intercepts and monitors communications on this IS for purposes including, but not limited to, penetration testing, COMSEC monitoring, network operations and defense, personnel misconduct (PM), law enforcement (LE), and counterintelligence (CI) investigations.</li>' +
'<li>At any time, the USG may inspect and seize data stored on this IS.</li>' +
'<li>Communications using, or data stored on, this IS are not private, are subject to routine monitoring, interception, and search, and may be disclosed or used for any USG-authorized purpose.</li>' +
'<li>This IS includes security measures (e.g., authentication and access controls) to protect USG interests -- not for your personal benefit or privacy.</li>' +
'<li>Notwithstanding the above, using this IS does not constitute consent to PM, LE or CI investigative searching or monitoring of the content of privileged communications, or work product, related to personal representation or services by attorneys, psychotherapists, or clergy, and their assistants. Such communications and work product are private and confidential. See User Agreement for details.</li>' +
'</ul>';
// fancy div to insert into pages
function bannerdiv(border) {
var ret = mknod('div', null, bannertext);
if (border)
ret.setAttribute("style", "border:1em solid var(--fg); border-width:.3em 0; margin:3em 0");
return ret;
}
// keep all of these false and then selectively enable them in the if-blocks below
var show_msgbox = false,
login_top = false,
top = false,
bottom = false,
top_bordered = false,
bottom_bordered = false;
if (QS("h1#cc") && QS("a#k")) {
// this is the controlpanel
// (you probably want to keep just one of these enabled)
show_msgbox = true;
login_top = true;
bottom = true;
}
else if (ebi("swin") && ebi("smac")) {
// this is the connect-page, same deal here
show_msgbox = true;
top_bordered = true;
bottom_bordered = true;
}
else if (ebi("op_cfg") || ebi("div#mw") ) {
// we're running in the main filebrowser (op_cfg) or markdown-viewer/editor (div#mw),
// fragile pages which break if you do something too fancy
show_msgbox = true;
}
// shows a fullscreen messagebox; works on all pages
if (show_msgbox) {
var now = Math.floor(Date.now() / 1000),
last_shown = sread("bannerts") || 0;
// 60 * 60 * 17 = 17 hour cooldown
if (now - last_shown > 60 * 60 * 17) {
swrite("bannerts", now);
modal.confirm(bannertext, null, function () {
location = 'https://this-page-intentionally-left-blank.org/';
});
}
}
// show a message at the top of the page; only works on the connect-page
if (top || top_bordered) {
var dst = ebi('wrap');
dst.insertBefore(bannerdiv(top_bordered), dst.firstChild);
}
// show a message on the page footer; only works on the controlpanel and connect-page
if (bottom || bottom_bordered) {
ebi('wrap').appendChild(bannerdiv(bottom_bordered));
}
// show a message on the top of the page; only works on the controlpanel
if (login_top) {
var dst = QS('h1');
dst.parentNode.insertBefore(bannerdiv(false), dst);
}
})();

contrib/themes/bsod.css Normal file
View File

@@ -0,0 +1,116 @@
/* copy bsod.* into a folder named ".themes" in your webroot and then
--themes=10 --theme=9 --css-browser=/.themes/bsod.css
*/
html.ey {
--w2: #3d7bbc;
--w3: #5fcbec;
--fg: #fff;
--fg-max: #fff;
--fg-weak: var(--w3);
--bg: #2067b2;
--bg-d3: var(--bg);
--bg-d2: var(--w2);
--bg-d1: var(--fg-weak);
--bg-u2: var(--bg);
--bg-u3: var(--bg);
--bg-u5: var(--w2);
--tab-alt: var(--fg-weak);
--row-alt: var(--w2);
--scroll: var(--w3);
--a: #fff;
--a-b: #fff;
--a-hil: #fff;
--a-h-bg: var(--fg-weak);
--a-dark: var(--a);
--a-gray: var(--fg-weak);
--btn-fg: var(--a);
--btn-bg: var(--w2);
--btn-h-fg: var(--w2);
--btn-1-fg: var(--bg);
--btn-1-bg: var(--a);
--txt-sh: a;
--txt-bg: var(--w2);
--u2-b1-bg: var(--w2);
--u2-b2-bg: var(--w2);
--u2-txt-bg: var(--w2);
--u2-tab-bg: a;
--u2-tab-1-bg: var(--w2);
--sort-1: var(--a);
--sort-1: var(--fg-weak);
--tree-bg: var(--bg);
--g-b1: a;
--g-b2: a;
--g-f-bg: var(--w2);
--f-sh1: 0.1;
--f-sh2: 0.02;
--f-sh3: 0.1;
--f-h-b1: a;
--srv-1: var(--a);
--srv-3: var(--a);
--mp-sh: a;
}
html.ey {
background: url('bsod.png') top 5em right 4.5em no-repeat fixed var(--bg);
}
html.ey body#b {
background: var(--bg); /*sandbox*/
}
html.ey #ops {
margin: 1.7em 1.5em 0 1.5em;
border-radius: .3em;
border-width: 1px 0;
}
html.ey #ops a {
text-shadow: 1px 1px 0 rgba(0,0,0,0.5);
}
html.ey .opbox {
margin: 1.5em 0 0 0;
}
html.ey #tree {
box-shadow: none;
}
html.ey #tt {
border-color: var(--w2);
background: var(--w2);
}
html.ey .mdo a {
background: none;
text-decoration: underline;
}
html.ey .mdo pre,
html.ey .mdo code {
color: #fff;
background: var(--w2);
border: none;
}
html.ey .mdo h1,
html.ey .mdo h2 {
background: none;
border-color: var(--w2);
}
html.ey .mdo ul ul,
html.ey .mdo ul ol,
html.ey .mdo ol ul,
html.ey .mdo ol ol {
border-color: var(--w2);
}
html.ey .mdo p>em,
html.ey .mdo li>em,
html.ey .mdo td>em {
color: #fd0;
}

contrib/themes/bsod.png Normal file

Binary file not shown.


View File

@@ -16,9 +16,12 @@ except:
TYPE_CHECKING = False
if True:
from typing import Any, Callable
from types import ModuleType
from typing import Any, Callable, Optional
PY2 = sys.version_info < (3,)
PY36 = sys.version_info > (3, 6)
if not PY2:
unicode: Callable[[Any], str] = str
else:
@@ -50,9 +53,64 @@ try:
except:
CORES = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
# all embedded resources to be retrievable over http
zs = """
web/a/partyfuse.py
web/a/u2c.py
web/a/webdav-cfg.bat
web/baguettebox.js
web/browser.css
web/browser.html
web/browser.js
web/browser2.html
web/cf.html
web/copyparty.gif
web/dd/2.png
web/dd/3.png
web/dd/4.png
web/dd/5.png
web/deps/busy.mp3
web/deps/easymde.css
web/deps/easymde.js
web/deps/marked.js
web/deps/fuse.py
web/deps/mini-fa.css
web/deps/mini-fa.woff
web/deps/prism.css
web/deps/prism.js
web/deps/prismd.css
web/deps/scp.woff2
web/deps/sha512.ac.js
web/deps/sha512.hw.js
web/md.css
web/md.html
web/md.js
web/md2.css
web/md2.js
web/mde.css
web/mde.html
web/mde.js
web/msg.css
web/msg.html
web/shares.css
web/shares.html
web/shares.js
web/splash.css
web/splash.html
web/splash.js
web/svcs.html
web/svcs.js
web/ui.css
web/up2k.js
web/util.js
web/w.hash.js
"""
RES = set(zs.strip().split("\n"))
class EnvParams(object):
def __init__(self) -> None:
self.pkg: Optional[ModuleType] = None
self.t0 = time.time()
self.mod = ""
self.cfg = ""

View File

@@ -27,6 +27,7 @@ from .__init__ import (
EXE,
MACOS,
PY2,
PY36,
VT100,
WINDOWS,
E,
@@ -54,7 +55,10 @@ from .util import (
Daemon,
align_tab,
ansi_re,
b64enc,
dedent,
has_resource,
load_resource,
min_ex,
pybin,
termsize,
@@ -67,7 +71,13 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional
if PY2:
range = xrange # type: ignore
try:
if os.environ.get("PRTY_NO_TLS"):
raise Exception()
HAVE_SSL = True
import ssl
except:
@@ -198,7 +208,7 @@ def init_E(EE: EnvParams) -> None:
errs.append("Using [%s] instead" % (p,))
if errs:
print("WARNING: " + ". ".join(errs))
warn(". ".join(errs))
return p # type: ignore
except Exception as ex:
@@ -208,6 +218,8 @@ def init_E(EE: EnvParams) -> None:
raise Exception("could not find a writable path for config")
assert __package__ # !rm
E.pkg = sys.modules[__package__]
E.mod = os.path.dirname(os.path.realpath(__file__))
if E.mod.endswith("__init__"):
E.mod = os.path.dirname(E.mod)
@@ -228,7 +240,7 @@ def init_E(EE: EnvParams) -> None:
raise
def get_srvname() -> str:
def get_srvname(verbose) -> str:
try:
ret: str = unicode(socket.gethostname()).split(".")[0]
except:
@@ -238,7 +250,8 @@ def get_srvname() -> str:
return ret
fp = os.path.join(E.cfg, "name.txt")
lprint("using hostname from {}\n".format(fp))
if verbose:
lprint("using hostname from {}\n".format(fp))
try:
with open(fp, "rb") as f:
ret = f.read().decode("utf-8", "replace").strip()
@@ -260,7 +273,7 @@ def get_fk_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
ret = b64enc(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -273,7 +286,7 @@ def get_dk_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(30))
ret = b64enc(os.urandom(30))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -286,7 +299,7 @@ def get_ah_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
ret = b64enc(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -316,8 +329,7 @@ def ensure_locale() -> None:
def ensure_webdeps() -> None:
ap = os.path.join(E.mod, "web/deps/mini-fa.woff")
if os.path.exists(ap):
if has_resource(E, "web/deps/mini-fa.woff"):
return
warn(
@@ -344,7 +356,7 @@ def configure_ssl_ver(al: argparse.Namespace) -> None:
# oh man i love openssl
# check this out
# hold my beer
assert ssl
assert ssl # type: ignore # !rm
ptn = re.compile(r"^OP_NO_(TLS|SSL)v")
sslver = terse_sslver(al.ssl_ver).split(",")
flags = [k for k in ssl.__dict__ if ptn.match(k)]
@@ -378,7 +390,7 @@ def configure_ssl_ver(al: argparse.Namespace) -> None:
def configure_ssl_ciphers(al: argparse.Namespace) -> None:
assert ssl
assert ssl # type: ignore # !rm
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
if al.ssl_ver:
ctx.options &= ~al.ssl_flags_en
@@ -491,27 +503,75 @@ def disable_quickedit() -> None:
def sfx_tpoke(top: str):
files = [os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df]
if os.environ.get("PRTY_NO_TPOKE"):
return
files = [top] + [
os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df
]
while True:
t = int(time.time())
for f in [top] + files:
os.utime(f, (t, t))
for f in list(files):
try:
os.utime(f, (t, t))
except Exception as ex:
lprint("<TPOKE> [%s] %r" % (f, ex))
files.remove(f)
time.sleep(78123)
def showlic() -> None:
p = os.path.join(E.mod, "res", "COPYING.txt")
if not os.path.exists(p):
try:
with load_resource(E, "res/COPYING.txt") as f:
buf = f.read()
except:
buf = b""
if buf:
print(buf.decode("utf-8", "replace"))
else:
print("no relevant license info to display")
return
with open(p, "rb") as f:
print(f.read().decode("utf-8", "replace"))
def get_sects():
return [
[
"bind",
"configure listening",
dedent(
"""
\033[33m-i\033[0m takes a comma-separated list of interfaces to listen on;
IP-addresses and/or unix-sockets (Unix Domain Sockets)
the default (\033[32m-i ::\033[0m) means all IPv4 and IPv6 addresses
\033[32m-i 0.0.0.0\033[0m listens on all IPv4 NICs/subnets
\033[32m-i 127.0.0.1\033[0m listens on IPv4 localhost only
\033[32m-i 127.1\033[0m listens on IPv4 localhost only
\033[32m-i 127.1,192.168.123.1\033[0m = IPv4 localhost and 192.168.123.1
\033[33m-p\033[0m takes a comma-separated list of tcp ports to listen on;
the default is \033[32m-p 3923\033[0m but as root you can \033[32m-p 80,443,3923\033[0m
when running behind a reverse-proxy, it's recommended to
use unix-sockets for improved performance and security;
\033[32m-i unix:770:www:\033[33m/tmp/a.sock\033[0m listens on \033[33m/tmp/a.sock\033[0m with
permissions \033[33m0770\033[0m; only accessible to members of the \033[33mwww\033[0m
group. This is the best approach. Alternatively,
\033[32m-i unix:777:\033[33m/tmp/a.sock\033[0m sets perms \033[33m0777\033[0m so anyone can
access it; bad unless it's inside a restricted folder
\033[32m-i unix:\033[33m/tmp/a.sock\033[0m keeps umask-defined permissions
(usually \033[33m0600\033[0m) and the same user/group as copyparty
\033[33m-p\033[0m (tcp ports) is ignored for unix sockets
"""
),
],
[
"accounts",
"accounts and volumes",
@@ -689,6 +749,11 @@ def get_sects():
\033[36mxban\033[0m can be used to overrule / cancel a user ban event;
if the program returns 0 (true/OK) then the ban will NOT happen
effects can be used to redirect uploads into other
locations, and to delete or index other files based
on new uploads, but with certain limitations. See
bin/hooks/reloc* and docs/devnotes.md#hook-effects
except for \033[36mxm\033[0m, only one hook / one action can run at a time,
so it's recommended to use the \033[36mf\033[0m flag unless you really need
to wait for the hook to finish before continuing (without \033[36mf\033[0m
@@ -917,6 +982,16 @@ def add_fs(ap):
ap2.add_argument("--mtab-age", metavar="SEC", type=int, default=60, help="rebuild mountpoint cache every \033[33mSEC\033[0m to keep track of sparse-files support; keep low on servers with removable media")
def add_share(ap):
db_path = os.path.join(E.cfg, "shares.db")
ap2 = ap.add_argument_group('share-url options')
ap2.add_argument("--shr", metavar="DIR", type=u, default="", help="toplevel virtual folder for shared files/folders, for example [\033[32m/share\033[0m]")
ap2.add_argument("--shr-db", metavar="FILE", type=u, default=db_path, help="database to store shares in")
ap2.add_argument("--shr-adm", metavar="U,U", type=u, default="", help="comma-separated list of users allowed to view/delete any share")
ap2.add_argument("--shr-rt", metavar="MIN", type=int, default=1440, help="shares can be revived by their owner if they expired less than MIN minutes ago; [\033[32m60\033[0m]=hour, [\033[32m1440\033[0m]=day, [\033[32m10080\033[0m]=week")
ap2.add_argument("--shr-v", action="store_true", help="debug")
def add_upload(ap):
ap2 = ap.add_argument_group('upload options')
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless \033[33m-ed\033[0m")
@@ -927,9 +1002,10 @@ def add_upload(ap):
ap2.add_argument("--reg-cap", metavar="N", type=int, default=38400, help="max number of uploads to keep in memory when running without \033[33m-e2d\033[0m; roughly 1 MiB RAM per 600")
ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload (bad idea to enable this on windows and/or cow filesystems)")
ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even when it might be dangerous (multiprocessing, filesystems lacking sparse-files support, ...)")
ap2.add_argument("--hardlink", action="store_true", help="prefer hardlinks instead of symlinks when possible (within same filesystem) (volflag=hardlink)")
ap2.add_argument("--never-symlink", action="store_true", help="do not fallback to symlinks when a hardlink cannot be made (volflag=neversymlink)")
ap2.add_argument("--no-dedup", action="store_true", help="disable symlink/hardlink creation; copy file contents instead (volflag=copydupes)")
ap2.add_argument("--dedup", action="store_true", help="enable symlink-based upload deduplication (volflag=dedup)")
ap2.add_argument("--safe-dedup", metavar="N", type=int, default=50, help="how careful to be when deduplicating files; [\033[32m1\033[0m] = just verify the filesize, [\033[32m50\033[0m] = verify file contents have not been altered (volflag=safededup)")
ap2.add_argument("--hardlink", action="store_true", help="enable hardlink-based dedup; will fallback on symlinks when that is impossible (across filesystems) (volflag=hardlink)")
ap2.add_argument("--hardlink-only", action="store_true", help="do not fallback to symlinks when a hardlink cannot be made (volflag=hardlinkonly)")
ap2.add_argument("--no-dupe", action="store_true", help="reject duplicate files during upload; only matches within the same volume (volflag=nodupe)")
ap2.add_argument("--no-snap", action="store_true", help="disable snapshots -- forget unfinished uploads on shutdown; don't create .hist/up2k.snap files -- abandoned/interrupted uploads must be cleaned up manually")
ap2.add_argument("--snap-wri", metavar="SEC", type=int, default=300, help="write upload state to ./hist/up2k.snap every \033[33mSEC\033[0m seconds; allows resuming incomplete uploads after a server crash")
@@ -942,14 +1018,15 @@ def add_upload(ap):
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16 for cross-atlantic (max=64)")
ap2.add_argument("--u2sz", metavar="N,N,N", type=u, default="1,64,96", help="web-client: default upload chunksize (MiB); sets \033[33mmin,default,max\033[0m in the settings gui. Each HTTP POST will aim for this size. Cloudflare max is 96. Big values are good for cross-atlantic but may increase HDD fragmentation on some FS. Disable this optimization with [\033[32m1,1,1\033[0m]")
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")
def add_network(ap):
ap2 = ap.add_argument_group('network options')
ap2.add_argument("-i", metavar="IP", type=u, default="::", help="ip to bind (comma-sep.), default: all IPv4 and IPv6")
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to bind (comma/range)")
ap2.add_argument("-i", metavar="IP", type=u, default="::", help="IPs and/or unix-sockets to listen on (see \033[33m--help-bind\033[0m). Default: all IPv4 and IPv6")
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to listen on (comma/range); ignored for unix-sockets")
ap2.add_argument("--ll", action="store_true", help="include link-local IPv4/IPv6 in mDNS replies, even if the NIC has routable IPs (breaks some mDNS clients)")
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to associate clients with; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd, unsafe), [\033[32m2\033[0m]=outermost-proxy, [\033[32m3\033[0m]=second-proxy, [\033[32m-1\033[0m]=closest-proxy")
ap2.add_argument("--xff-hdr", metavar="NAME", type=u, default="x-forwarded-for", help="if reverse-proxied, which http header to read the client's real ip from")
@@ -1000,6 +1077,7 @@ def add_cert(ap, cert_path):
def add_auth(ap):
ses_db = os.path.join(E.cfg, "sessions.db")
ap2 = ap.add_argument_group('IdP / identity provider / user authentication options')
ap2.add_argument("--idp-h-usr", metavar="HN", type=u, default="", help="bypass the copyparty authentication checks and assume the request-header \033[33mHN\033[0m contains the username of the requesting user (for use with authentik/oauth/...)\n\033[1;31mWARNING:\033[0m if you enable this, make sure clients are unable to specify this header themselves; must be washed away and replaced by a reverse-proxy")
ap2.add_argument("--idp-h-grp", metavar="HN", type=u, default="", help="assume the request-header \033[33mHN\033[0m contains the groupname of the requesting user; can be referenced in config files for group-based access control")
@@ -1007,6 +1085,19 @@ def add_auth(ap):
ap2.add_argument("--idp-gsep", metavar="RE", type=u, default="|:;+,", help="if there are multiple groups in \033[33m--idp-h-grp\033[0m, they are separated by one of the characters in \033[33mRE\033[0m")
ap2.add_argument("--no-bauth", action="store_true", help="disable basic-authentication support; do not accept passwords from the 'Authenticate' header at all. NOTE: This breaks support for the android app")
ap2.add_argument("--bauth-last", action="store_true", help="keeps basic-authentication enabled, but only as a last-resort; if a cookie is also provided then the cookie wins")
ap2.add_argument("--ses-db", metavar="PATH", type=u, default=ses_db, help="where to store the sessions database (if you run multiple copyparty instances, make sure they use different DBs)")
ap2.add_argument("--ses-len", metavar="CHARS", type=int, default=20, help="session key length; default is 120 bits ((20//4)*4*6)")
ap2.add_argument("--no-ses", action="store_true", help="disable sessions; use plaintext passwords in cookies")
def add_chpw(ap):
db_path = os.path.join(E.cfg, "chpw.json")
ap2 = ap.add_argument_group('user-changeable passwords options')
ap2.add_argument("--chpw", action="store_true", help="allow users to change their own passwords")
ap2.add_argument("--chpw-no", metavar="U,U,U", type=u, action="append", help="do not allow password-changes for this comma-separated list of usernames")
ap2.add_argument("--chpw-db", metavar="PATH", type=u, default=db_path, help="where to store the passwords database (if you run multiple copyparty instances, make sure they use different DBs)")
ap2.add_argument("--chpw-len", metavar="N", type=int, default=8, help="minimum password length")
ap2.add_argument("--chpw-v", metavar="LVL", type=int, default=2, help="verbosity of summary on config load [\033[32m0\033[0m] = nothing at all, [\033[32m1\033[0m] = number of users, [\033[32m2\033[0m] = list users with default-pw, [\033[32m3\033[0m] = list all users")
def add_zeroconf(ap):
@@ -1090,7 +1181,7 @@ def add_smb(ap):
ap2.add_argument("--smbw", action="store_true", help="enable write support (please dont)")
ap2.add_argument("--smb1", action="store_true", help="disable SMBv2, only enable SMBv1 (CIFS)")
ap2.add_argument("--smb-port", metavar="PORT", type=int, default=445, help="port to listen on -- if you change this value, you must NAT from TCP:445 to this port using iptables or similar")
ap2.add_argument("--smb-nwa-1", action="store_true", help="disable impacket#1433 workaround (truncate directory listings to 64kB)")
ap2.add_argument("--smb-nwa-1", action="store_true", help="truncate directory listings to 64kB (~400 files); avoids impacket-0.11 bug, fixes impacket-0.12 performance")
ap2.add_argument("--smb-nwa-2", action="store_true", help="disable impacket workaround for filecopy globs")
ap2.add_argument("--smba", action="store_true", help="small performance boost: disable per-account permissions, enables account coalescing instead (if one user has write/delete-access, then everyone does)")
ap2.add_argument("--smbv", action="store_true", help="verbose")
@@ -1116,6 +1207,7 @@ def add_hooks(ap):
ap2.add_argument("--xad", metavar="CMD", type=u, action="append", help="execute \033[33mCMD\033[0m after a file delete")
ap2.add_argument("--xm", metavar="CMD", type=u, action="append", help="execute \033[33mCMD\033[0m on message")
ap2.add_argument("--xban", metavar="CMD", type=u, action="append", help="execute \033[33mCMD\033[0m if someone gets banned (pw/404/403/url)")
ap2.add_argument("--hook-v", action="store_true", help="verbose hooks")
def add_stats(ap):
@@ -1148,6 +1240,7 @@ def add_optouts(ap):
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
ap2.add_argument("--no-up-list", action="store_true", help="don't show list of incoming files in controlpanel")
ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader IPs into the database")
@@ -1208,6 +1301,7 @@ def add_logging(ap):
ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
ap2.add_argument("--no-logflush", action="store_true", help="don't flush the logfile after each write; tiny bit faster")
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
ap2.add_argument("--log-utc", action="store_true", help="do not use local timezone; assume the TZ env-var is UTC (tiny bit faster)")
ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log failed login attempt passwords: 0=terse, 1=plaintext, 2=hashed")
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
@@ -1266,7 +1360,7 @@ def add_transcoding(ap):
def add_db_general(ap, hcores):
noidx = APPLESAN_TXT if MACOS else ""
ap2 = ap.add_argument_group('general db options')
ap2.add_argument("-e2d", action="store_true", help="enable up2k database, making files searchable + enables upload deduplication")
ap2.add_argument("-e2d", action="store_true", help="enable up2k database; this enables file search, upload-undo, improves deduplication")
ap2.add_argument("-e2ds", action="store_true", help="scan writable folders for new files on startup; sets \033[33m-e2d\033[0m")
ap2.add_argument("-e2dsa", action="store_true", help="scans all folders on startup; sets \033[33m-e2ds\033[0m")
ap2.add_argument("-e2v", action="store_true", help="verify file integrity; rehash all files and compare with db")
@@ -1275,11 +1369,13 @@ def add_db_general(ap, hcores):
ap2.add_argument("--hist", metavar="PATH", type=u, default="", help="where to store volume data (db, thumbs); default is a folder named \".hist\" inside each volume (volflag=hist)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, default="", help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dirsz", action="store_true", help="do not show total recursive size of folders in listings, show inode size instead; slightly faster (volflag=nodirsz)")
ap2.add_argument("--re-dirsz", action="store_true", help="if the directory-sizes in the UI are bonkers, use this along with \033[33m-e2dsa\033[0m to rebuild the index from scratch")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="force a cache rebuild on startup; enable this once if it gets out of sync (should never be necessary)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")
ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see \033[33m--help-dbd\033[0m (volflag=dbd)")
ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (probably buggy, not recommended) (volflag=xlink)")
ap2.add_argument("--hash-mt", metavar="CORES", type=int, default=hcores, help="num cpu cores to use for file hashing; set 0 or 1 for single-core hashing")
ap2.add_argument("--re-maxage", metavar="SEC", type=int, default=0, help="rescan filesystem for changes every \033[33mSEC\033[0m seconds; 0=off (volflag=scan)")
ap2.add_argument("--db-act", metavar="SEC", type=float, default=10.0, help="defer any scheduled volume reindexing until \033[33mSEC\033[0m seconds after last db write (uploads, renames, ...)")
@@ -1336,7 +1432,7 @@ def add_ui(ap, retry):
ap2 = ap.add_argument_group('ui options')
ap2.add_argument("--grid", action="store_true", help="show grid/thumbnails by default (volflag=grid)")
ap2.add_argument("--gsel", action="store_true", help="select files in grid by ctrl-click (volflag=gsel)")
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language; one of the following: \033[32meng nor\033[0m")
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language; one of the following: \033[32meng nor chi\033[0m")
ap2.add_argument("--theme", metavar="NUM", type=int, default=0, help="default theme to use (0..7)")
ap2.add_argument("--themes", metavar="NUM", type=int, default=8, help="number of themes installed")
ap2.add_argument("--au-vol", metavar="0-100", type=int, default=50, choices=range(0, 101), help="default audio/video volume percent")
@@ -1344,9 +1440,10 @@ def add_ui(ap, retry):
ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching \033[33mREGEX\033[0m in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
ap2.add_argument("--mpmc", metavar="URL", type=u, default="", help="change the mediaplayer-toggle mouse cursor; URL to a folder with {2..5}.png inside (or disable with [\033[32m.\033[0m])")
ap2.add_argument("--js-browser", metavar="L", type=u, default="", help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, default="", help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages; can be @PATH to send the contents of a file at PATH, and/or begin with %% to render as jinja2 template (volflag=html_head)")
ap2.add_argument("--css-browser", metavar="L", type=u, default="", help="URL to additional CSS to include in the filebrowser html")
ap2.add_argument("--js-browser", metavar="L", type=u, default="", help="URL to additional JS to include in the filebrowser html")
ap2.add_argument("--js-other", metavar="L", type=u, default="", help="URL to additional JS to include in all other pages")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages (except for basic-browser); can be @PATH to send the contents of a file at PATH, and/or begin with %% to render as jinja2 template (volflag=html_head)")
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI, or add ?v to URL for override)")
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
@@ -1365,12 +1462,14 @@ def add_debug(ap):
ap2 = ap.add_argument_group('debug options')
ap2.add_argument("--vc", action="store_true", help="verbose config file parser (explain config)")
ap2.add_argument("--cgen", action="store_true", help="generate config file from current config (best-effort; probably buggy)")
ap2.add_argument("--deps", action="store_true", help="list information about detected optional dependencies")
if hasattr(select, "poll"):
ap2.add_argument("--no-poll", action="store_true", help="kernel-bug workaround: disable poll; use select instead (limits max num clients to ~700)")
ap2.add_argument("--no-sendfile", action="store_true", help="kernel-bug workaround: disable sendfile; do a safe and slow read-send-loop instead")
ap2.add_argument("--no-scandir", action="store_true", help="kernel-bug workaround: disable scandir; do a listdir + stat on each file instead")
ap2.add_argument("--no-fastboot", action="store_true", help="wait for initial filesystem indexing before accepting client requests")
ap2.add_argument("--no-htp", action="store_true", help="disable httpserver threadpool, create threads as-needed instead")
ap2.add_argument("--rm-sck", action="store_true", help="when listening on unix-sockets, do a basic delete+bind instead of the default atomic bind")
ap2.add_argument("--srch-dbg", action="store_true", help="explain search processing, and do some extra expensive sanity checks")
ap2.add_argument("--rclone-mdns", action="store_true", help="use mdns-domain instead of server-ip on /?hc")
ap2.add_argument("--stackmon", metavar="P,S", type=u, default="", help="write stacktrace to \033[33mP\033[0math every \033[33mS\033[0m second, for example --stackmon=\033[32m./st/%%Y-%%m/%%d/%%H%%M.xz,60")
@@ -1385,10 +1484,11 @@ def add_debug(ap):
def run_argparse(
argv: list[str], formatter: Any, retry: bool, nc: int
argv: list[str], formatter: Any, retry: bool, nc: int, verbose=True
) -> argparse.Namespace:
ap = argparse.ArgumentParser(
formatter_class=formatter,
usage=argparse.SUPPRESS,
prog="copyparty",
description="http file sharing hub v{} ({})".format(S_VERSION, S_BUILD_DT),
)
@@ -1406,18 +1506,20 @@ def run_argparse(
tty = os.environ.get("TERM", "").lower() == "linux"
srvname = get_srvname()
srvname = get_srvname(verbose)
add_general(ap, nc, srvname)
add_network(ap)
add_tls(ap, cert_path)
add_cert(ap, cert_path)
add_auth(ap)
add_chpw(ap)
add_qr(ap, tty)
add_zeroconf(ap)
add_zc_mdns(ap)
add_zc_ssdp(ap)
add_fs(ap)
add_share(ap)
add_upload(ap)
add_db_general(ap, hcores)
add_db_metadata(ap)
@@ -1471,16 +1573,13 @@ def run_argparse(
return ret
def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
def main(argv: Optional[list[str]] = None) -> None:
time.strptime("19970815", "%Y%m%d") # python#7980
if WINDOWS:
os.system("rem") # enables colors
init_E(E)
if rsrc: # pyz
E.mod = rsrc
if argv is None:
argv = sys.argv
@@ -1537,6 +1636,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
("--hdr-au-usr", "--idp-h-usr"),
("--idp-h-sep", "--idp-gsep"),
("--th-no-crop", "--th-crop=n"),
("--never-symlink", "--hardlink-only"),
]
for dk, nk in deprecated:
idx = -1
@@ -1561,7 +1661,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
argv.extend(["--qr"])
if ANYWIN or not os.geteuid():
# win10 allows symlinks if admin; can be unexpected
argv.extend(["-p80,443,3923", "--ign-ebind", "--no-dedup"])
argv.extend(["-p80,443,3923", "--ign-ebind"])
except:
pass
@@ -1583,7 +1683,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
for fmtr in [RiceFormatter, RiceFormatter, Dodge11874, BasicDodge11874]:
try:
al = run_argparse(argv, fmtr, retry, nc)
dal = run_argparse([], fmtr, retry, nc)
dal = run_argparse([], fmtr, retry, nc, False)
break
except SystemExit:
raise
@@ -1667,7 +1767,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
print("error: python2 cannot --smb")
return
if sys.version_info < (3, 6):
if not PY36:
al.no_scandir = True
if not hasattr(os, "sendfile"):

View File

@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 13, 4)
CODENAME = "race the beam"
BUILD_DT = (2024, 7, 16)
VERSION = (1, 15, 4)
CODENAME = "fill the drives"
BUILD_DT = (2024, 10, 4)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -4,6 +4,7 @@ from __future__ import print_function, unicode_literals
import argparse
import base64
import hashlib
import json
import os
import re
import stat
@@ -12,7 +13,7 @@ import threading
import time
from datetime import datetime
from .__init__ import ANYWIN, TYPE_CHECKING, WINDOWS, E
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, WINDOWS, E
from .bos import bos
from .cfg import flagdescs, permdescs, vf_bmap, vf_cmap, vf_vmap
from .pwhash import PWHash
@@ -34,9 +35,11 @@ from .util import (
odfusion,
relchk,
statdir,
ub64enc,
uncyg,
undot,
unhumanize,
vjoin,
vsplit,
)
@@ -56,6 +59,9 @@ if TYPE_CHECKING:
# Vflags: TypeAlias = dict[str, Any]
# Mflags: TypeAlias = dict[str, Vflags]
if PY2:
range = xrange # type: ignore
LEELOO_DALLAS = "leeloo_dallas"
@@ -338,6 +344,8 @@ class VFS(object):
self.histtab: dict[str, str] = {} # all realpath->histpath
self.dbv: Optional[VFS] = None # closest full/non-jump parent
self.lim: Optional[Lim] = None # upload limits; only set for dbv
self.shr_src: Optional[tuple[VFS, str]] = None # source vfs+rem of a share
self.shr_files: set[str] = set() # filenames to include from shr_src
self.aread: dict[str, list[str]] = {}
self.awrite: dict[str, list[str]] = {}
self.amove: dict[str, list[str]] = {}
@@ -362,6 +370,9 @@ class VFS(object):
self.all_aps = []
self.all_vps = []
self.get_dbv = self._get_dbv
self.ls = self._ls
def __repr__(self) -> str:
return "VFS(%s)" % (
", ".join(
@@ -441,7 +452,7 @@ class VFS(object):
def _find(self, vpath: str) -> tuple["VFS", str]:
"""return [vfs,remainder]"""
if vpath == "":
if not vpath:
return self, ""
if "/" in vpath:
@@ -451,7 +462,7 @@ class VFS(object):
rem = ""
if name in self.nodes:
return self.nodes[name]._find(undot(rem))
return self.nodes[name]._find(rem)
return self, vpath
@@ -518,12 +529,20 @@ class VFS(object):
t = "{} has no {} in [{}] => [{}] => [{}]"
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
t = 'you don\'t have %s-access in "/%s"'
raise Pebkac(err, t % (msg, cvpath))
t = 'you don\'t have %s-access in "/%s" or below "/%s"'
raise Pebkac(err, t % (msg, cvpath, vn.vpath))
return vn, rem
def get_dbv(self, vrem: str) -> tuple["VFS", str]:
def _get_share_src(self, vrem: str) -> tuple["VFS", str]:
src = self.shr_src
if not src:
return self._get_dbv(vrem)
shv, srem = src
return shv, vjoin(srem, vrem)
def _get_dbv(self, vrem: str) -> tuple["VFS", str]:
dbv = self.dbv
if not dbv:
return self, vrem
@@ -549,7 +568,26 @@ class VFS(object):
ad, fn = os.path.split(ap)
return os.path.join(absreal(ad), fn)
def ls(
def _ls_nope(
self, *a, **ka
) -> tuple[str, list[tuple[str, os.stat_result]], dict[str, "VFS"]]:
raise Pebkac(500, "nope.avi")
def _ls_shr(
self,
rem: str,
uname: str,
scandir: bool,
permsets: list[list[bool]],
lstat: bool = False,
) -> tuple[str, list[tuple[str, os.stat_result]], dict[str, "VFS"]]:
"""replaces _ls for certain shares (single-file, or file selection)"""
vn, rem = self.shr_src # type: ignore
abspath, real, _ = vn.ls(rem, "\n", scandir, permsets, lstat)
real = [x for x in real if os.path.basename(x[0]) in self.shr_files]
return abspath, real, {}
def _ls(
self,
rem: str,
uname: str,
@@ -802,8 +840,11 @@ class AuthSrv(object):
# fwd-decl
self.vfs = VFS(log_func, "", "", AXS(), {})
self.acct: dict[str, str] = {}
self.iacct: dict[str, str] = {}
self.acct: dict[str, str] = {} # uname->pw
self.iacct: dict[str, str] = {} # pw->uname
self.ases: dict[str, str] = {} # uname->session
self.sesa: dict[str, str] = {} # session->uname
self.defpw: dict[str, str] = {}
self.grps: dict[str, list[str]] = {}
self.re_pwd: Optional[re.Pattern] = None
@@ -814,6 +855,7 @@ class AuthSrv(object):
self.idp_accs: dict[str, list[str]] = {} # username->groupnames
self.idp_usr_gh: dict[str, str] = {} # username->group-header-value (cache)
self.hid_cache: dict[str, str] = {}
self.mutex = threading.Lock()
self.reload()
@@ -1349,7 +1391,7 @@ class AuthSrv(object):
flags[name] = vals
self._e("volflag [{}] += {} ({})".format(name, vals, desc))
def reload(self) -> None:
def reload(self, verbosity: int = 9) -> None:
"""
construct a flat list of mountpoints and usernames
first from the commandline arguments
@@ -1357,9 +1399,9 @@ class AuthSrv(object):
before finally building the VFS
"""
with self.mutex:
self._reload()
self._reload(verbosity)
def _reload(self) -> None:
def _reload(self, verbosity: int = 9) -> None:
acct: dict[str, str] = {} # username:password
grps: dict[str, list[str]] = {} # groupname:usernames
daxs: dict[str, AXS] = {}
@@ -1437,6 +1479,8 @@ class AuthSrv(object):
raise
self.setup_pwhash(acct)
defpw = acct.copy()
self.setup_chpw(acct)
# case-insensitive; normalize
if WINDOWS:
@@ -1452,9 +1496,8 @@ class AuthSrv(object):
vfs = VFS(self.log_func, absreal("."), "", axs, {})
elif "" not in mount:
# there's volumes but no root; make root inaccessible
vfs = VFS(self.log_func, "", "", AXS(), {})
vfs.flags["tcolor"] = self.args.tcolor
vfs.flags["d2d"] = True
zsd = {"d2d": True, "tcolor": self.args.tcolor}
vfs = VFS(self.log_func, "", "", AXS(), zsd)
maxdepth = 0
for dst in sorted(mount.keys(), key=lambda x: (x.count("/"), len(x))):
@@ -1483,6 +1526,56 @@ class AuthSrv(object):
vol.all_vps.sort(key=lambda x: len(x[0]), reverse=True)
vol.root = vfs
enshare = self.args.shr
shr = enshare[1:-1]
shrs = enshare[1:]
if enshare:
import sqlite3
shv = VFS(self.log_func, "", shr, AXS(), {})
db_path = self.args.shr_db
db = sqlite3.connect(db_path)
cur = db.cursor()
cur2 = db.cursor()
now = time.time()
for row in cur.execute("select * from sh"):
s_k, s_pw, s_vp, s_pr, s_nf, s_un, s_t0, s_t1 = row
if s_t1 and s_t1 < now:
continue
if self.args.shr_v:
t = "loading %s share [%s] by [%s] => [%s]"
self.log(t % (s_pr, s_k, s_un, s_vp))
if s_pw:
# gotta reuse the "account" for all shares with this pw,
# so do a light scramble as this appears in the web-ui
zb = hashlib.sha512(s_pw.encode("utf-8")).digest()
sun = "s_%s" % (ub64enc(zb)[4:16].decode("ascii"),)
acct[sun] = s_pw
else:
sun = "*"
s_axs = AXS(
[sun] if "r" in s_pr else [],
[sun] if "w" in s_pr else [],
[sun] if "m" in s_pr else [],
[sun] if "d" in s_pr else [],
)
# don't know the abspath yet + wanna ensure the user
# still has the privs they granted, so nullmap it
shv.nodes[s_k] = VFS(
self.log_func, "", "%s/%s" % (shr, s_k), s_axs, shv.flags.copy()
)
vfs.nodes[shr] = vfs.all_vols[shr] = shv
for vol in shv.nodes.values():
vfs.all_vols[vol.vpath] = vol
vol.get_dbv = vol._get_share_src
vol.ls = vol._ls_nope
zss = set(acct)
zss.update(self.idp_accs)
zss.discard("*")
@@ -1501,7 +1594,7 @@ class AuthSrv(object):
for usr in unames:
for vp, vol in vfs.all_vols.items():
zx = getattr(vol.axs, axs_key)
if usr in zx:
if usr in zx and (not enshare or not vp.startswith(shrs)):
umap[usr].append(vp)
umap[usr].sort()
setattr(vfs, "a" + perm, umap)
@@ -1551,6 +1644,8 @@ class AuthSrv(object):
for usr in acct:
if usr not in associated_users:
if enshare and usr.startswith("s_"):
continue
if len(vfs.all_vols) > 1:
# user probably familiar enough that the verbose message is not necessary
t = "account [%s] is not mentioned in any volume definitions; see --help-accounts"
@@ -1562,8 +1657,12 @@ class AuthSrv(object):
promote = []
demote = []
for vol in vfs.all_vols.values():
zb = hashlib.sha512(afsenc(vol.realpath)).digest()
hid = base64.b32encode(zb).decode("ascii").lower()
hid = self.hid_cache.get(vol.realpath)
if not hid:
zb = hashlib.sha512(afsenc(vol.realpath)).digest()
hid = base64.b32encode(zb).decode("ascii").lower()
self.hid_cache[vol.realpath] = hid
vflag = vol.flags.get("hist")
if vflag == "-":
pass
@@ -1799,6 +1898,11 @@ class AuthSrv(object):
if len(zs) == 3: # fc5 => ffcc55
vol.flags["tcolor"] = "".join([x * 2 for x in zs])
if vol.flags.get("neversymlink"):
vol.flags["hardlinkonly"] = True # was renamed
if vol.flags.get("hardlinkonly"):
vol.flags["hardlink"] = True
for k1, k2 in IMPLICATIONS:
if k1 in vol.flags:
vol.flags[k2] = True
@@ -1895,7 +1999,7 @@ class AuthSrv(object):
self.log(t.format(vol.vpath), 1)
del vol.flags["lifetime"]
needs_e2d = [x for x in hooks if x != "xm"]
needs_e2d = [x for x in hooks if x in ("xau", "xiu")]
drop = [x for x in needs_e2d if vol.flags.get(x)]
if drop:
t = 'removing [{}] from volume "/{}" because e2d is disabled'
@@ -1903,9 +2007,6 @@ class AuthSrv(object):
for x in drop:
vol.flags.pop(x)
if vol.flags.get("neversymlink") and not vol.flags.get("hardlink"):
vol.flags["copydupes"] = True
# verify tags mentioned by -mt[mp] are used by -mte
local_mtp = {}
local_only_mtp = {}
@@ -1984,11 +2085,16 @@ class AuthSrv(object):
have_e2d = False
have_e2t = False
have_dedup = False
unsafe_dedup = []
t = "volumes and permissions:\n"
for zv in vfs.all_vols.values():
if not self.warn_anonwrite:
if not self.warn_anonwrite or verbosity < 5:
break
if enshare and (zv.vpath == shr or zv.vpath.startswith(shrs)):
continue
t += '\n\033[36m"/{}" \033[33m{}\033[0m'.format(zv.vpath, zv.realpath)
for txt, attr in [
[" read", "uread"],
@@ -2013,9 +2119,14 @@ class AuthSrv(object):
if "e2t" in zv.flags:
have_e2t = True
if "dedup" in zv.flags:
have_dedup = True
if "e2d" not in zv.flags and "hardlink" not in zv.flags:
unsafe_dedup.append("/" + zv.vpath)
t += "\n"
if self.warn_anonwrite:
if self.warn_anonwrite and verbosity > 4:
if not self.args.no_voldump:
self.log(t)
@@ -2025,10 +2136,17 @@ class AuthSrv(object):
self.log("\n\033[{}\033[0m\n".format(t))
if not have_e2t:
t = "hint: argument -e2ts enables multimedia indexing (artist/title/...)"
t = "hint: enable multimedia indexing (artist/title/...) with argument -e2ts"
self.log(t, 6)
else:
t = "hint: argument -e2dsa enables searching, upload-undo, and better deduplication"
t = "hint: enable searching and upload-undo with argument -e2dsa"
self.log(t, 6)
if unsafe_dedup:
t = "WARNING: symlink-based deduplication is enabled for some volumes, but without indexing. Please enable -e2dsa and/or --hardlink to avoid problems when moving/renaming files. Affected volumes: %s"
self.log(t % (", ".join(unsafe_dedup)), 3)
elif not have_dedup:
t = "hint: enable upload deduplication with --dedup (but see readme for consequences)"
self.log(t, 6)
zv, _ = vfs.get("/", "*", False, False)
@@ -2039,7 +2157,7 @@ class AuthSrv(object):
try:
zv, _ = vfs.get("", "*", False, True, err=999)
if self.warn_anonwrite and os.getcwd() == zv.realpath:
if self.warn_anonwrite and verbosity > 4 and os.getcwd() == zv.realpath:
t = "anyone can write to the current directory: {}\n"
self.log(t.format(zv.realpath), c=1)
@@ -2066,11 +2184,15 @@ class AuthSrv(object):
self.vfs = vfs
self.acct = acct
self.defpw = defpw
self.grps = grps
self.iacct = {v: k for k, v in acct.items()}
self.load_sessions()
self.re_pwd = None
pwds = [re.escape(x) for x in self.iacct.keys()]
pwds.extend(list(self.sesa))
if pwds:
if self.ah.on:
zs = r"(\[H\] pw:.*|[?&]pw=)([^&]+)"
@@ -2086,6 +2208,235 @@ class AuthSrv(object):
MIMES[ext] = mime
EXTS.update({v: k for k, v in MIMES.items()})
if enshare:
# hide shares from controlpanel
vfs.all_vols = {
x: y
for x, y in vfs.all_vols.items()
if x != shr and not x.startswith(shrs)
}
assert db and cur and cur2 and shv # type: ignore
for row in cur.execute("select * from sh"):
s_k, s_pw, s_vp, s_pr, s_nf, s_un, s_t0, s_t1 = row
shn = shv.nodes.get(s_k, None)
if not shn:
continue
try:
s_vfs, s_rem = vfs.get(
s_vp, s_un, "r" in s_pr, "w" in s_pr, "m" in s_pr, "d" in s_pr
)
except Exception as ex:
t = "removing share [%s] by [%s] to [%s] due to %r"
self.log(t % (s_k, s_un, s_vp, ex), 3)
shv.nodes.pop(s_k)
continue
fns = []
if s_nf:
q = "select vp from sf where k = ?"
for (s_fn,) in cur2.execute(q, (s_k,)):
fns.append(s_fn)
shn.shr_files = set(fns)
shn.ls = shn._ls_shr
else:
shn.ls = shn._ls
shn.shr_src = (s_vfs, s_rem)
shn.realpath = s_vfs.canonical(s_rem)
if self.args.shr_v:
t = "mapped %s share [%s] by [%s] => [%s] => [%s]"
self.log(t % (s_pr, s_k, s_un, s_vp, shn.realpath))
# transplant shadowing into shares
for vn in shv.nodes.values():
svn, srem = vn.shr_src # type: ignore
if srem:
continue # free branch, safe
ap = svn.canonical(srem)
if bos.path.isfile(ap):
continue # also fine
for zs in svn.nodes.keys():
# hide subvolume
vn.nodes[zs] = VFS(self.log_func, "", "", AXS(), {})
cur2.close()
cur.close()
db.close()
def load_sessions(self, quiet=False) -> None:
# mutex me
if self.args.no_ses:
self.ases = {}
self.sesa = {}
return
import sqlite3
ases = {}
blen = (self.args.ses_len // 4) * 4 # 3 bytes in 4 chars
blen = (blen * 3) // 4 # bytes needed for ses_len chars
db = sqlite3.connect(self.args.ses_db)
cur = db.cursor()
for uname, sid in cur.execute("select un, si from us"):
if uname in self.acct:
ases[uname] = sid
n = []
q = "insert into us values (?,?,?)"
for uname in self.acct:
if uname not in ases:
sid = ub64enc(os.urandom(blen)).decode("ascii")
cur.execute(q, (uname, sid, int(time.time())))
ases[uname] = sid
n.append(uname)
if n:
db.commit()
cur.close()
db.close()
self.ases = ases
self.sesa = {v: k for k, v in ases.items()}
if n and not quiet:
t = ", ".join(n[:3])
if len(n) > 3:
t += "..."
self.log("added %d new sessions (%s)" % (len(n), t))
def forget_session(self, broker: Optional["BrokerCli"], uname: str) -> None:
with self.mutex:
self._forget_session(uname)
if broker:
broker.ask("_reload_sessions").get()
def _forget_session(self, uname: str) -> None:
if self.args.no_ses:
return
import sqlite3
db = sqlite3.connect(self.args.ses_db)
cur = db.cursor()
cur.execute("delete from us where un = ?", (uname,))
db.commit()
cur.close()
db.close()
self.sesa.pop(self.ases.get(uname, ""), "")
self.ases.pop(uname, "")
def chpw(self, broker: Optional["BrokerCli"], uname, pw) -> tuple[bool, str]:
if not self.args.chpw:
return False, "feature disabled in server config"
if uname == "*" or uname not in self.defpw:
return False, "not logged in"
if uname in self.args.chpw_no:
return False, "not allowed for this account"
if len(pw) < self.args.chpw_len:
t = "minimum password length: %d characters"
return False, t % (self.args.chpw_len,)
hpw = self.ah.hash(pw) if self.ah.on else pw
if hpw == self.acct[uname]:
return False, "that's already your password my dude"
if hpw in self.iacct or hpw in self.sesa:
return False, "password is taken"
with self.mutex:
ap = self.args.chpw_db
if not bos.path.exists(ap):
pwdb = {}
else:
with open(ap, "r", encoding="utf-8") as f:
pwdb = json.load(f)
pwdb = [x for x in pwdb if x[0] != uname]
pwdb.append((uname, self.defpw[uname], hpw))
with open(ap, "w", encoding="utf-8") as f:
json.dump(pwdb, f, separators=(",\n", ": "))
self.log("reinitializing due to password-change for user [%s]" % (uname,))
if not broker:
# only true for tests
self._reload()
return True, "new password OK"
broker.ask("_reload_blocking", False, False).get()
return True, "new password OK"
def setup_chpw(self, acct: dict[str, str]) -> None:
ap = self.args.chpw_db
if not self.args.chpw or not bos.path.exists(ap):
return
with open(ap, "r", encoding="utf-8") as f:
pwdb = json.load(f)
useen = set()
urst = set()
uok = set()
for usr, orig, mod in pwdb:
useen.add(usr)
if usr not in acct:
# previous user, no longer known
continue
if acct[usr] != orig:
urst.add(usr)
continue
uok.add(usr)
acct[usr] = mod
if not self.args.chpw_v:
return
for usr in acct:
if usr not in useen:
urst.add(usr)
for zs in uok:
urst.discard(zs)
if self.args.chpw_v == 1 or (self.args.chpw_v == 2 and not urst):
t = "chpw: %d changed, %d unchanged"
self.log(t % (len(uok), len(urst)))
return
elif self.args.chpw_v == 2:
t = "chpw: %d changed" % (len(uok))
if urst:
t += ", \033[0munchanged:\033[35m %s" % (", ".join(list(urst)))
self.log(t, 6)
return
msg = ""
if uok:
t = "\033[0mchanged: \033[32m%s"
msg += t % (", ".join(list(uok)),)
if urst:
t = "%s\033[0munchanged: \033[35m%s"
msg += t % (
", " if msg else "",
", ".join(list(urst)),
)
self.log("chpw: " + msg, 6)
def setup_pwhash(self, acct: dict[str, str]) -> None:
self.ah = PWHash(self.args)
if not self.ah.on:

View File

@@ -9,14 +9,14 @@ import queue
from .__init__ import CORES, TYPE_CHECKING
from .broker_mpw import MpWorker
from .broker_util import ExceptionalQueue, try_exec
from .broker_util import ExceptionalQueue, NotExQueue, try_exec
from .util import Daemon, mp
if TYPE_CHECKING:
from .svchub import SvcHub
if True: # pylint: disable=using-constant-test
from typing import Any
from typing import Any, Union
class MProcess(mp.Process):
@@ -76,6 +76,10 @@ class BrokerMp(object):
for _, proc in enumerate(self.procs):
proc.q_pend.put((0, "reload", []))
def reload_sessions(self) -> None:
for _, proc in enumerate(self.procs):
proc.q_pend.put((0, "reload_sessions", []))
def collector(self, proc: MProcess) -> None:
"""receive message from hub in other process"""
while True:
@@ -104,7 +108,7 @@ class BrokerMp(object):
if retq_id:
proc.q_pend.put((retq_id, "retq", rv))
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
# new non-ipc invoking managed service in hub
obj = self.hub

View File

@@ -11,7 +11,7 @@ import queue
from .__init__ import ANYWIN
from .authsrv import AuthSrv
from .broker_util import BrokerCli, ExceptionalQueue
from .broker_util import BrokerCli, ExceptionalQueue, NotExQueue
from .httpsrv import HttpSrv
from .util import FAKE_MP, Daemon, HMaccas
@@ -94,6 +94,10 @@ class MpWorker(BrokerCli):
self.asrv.reload()
self.logw("mpw.asrv reloaded")
elif dest == "reload_sessions":
with self.asrv.mutex:
self.asrv.load_sessions()
elif dest == "listen":
self.httpsrv.listen(args[0], args[1])
@@ -110,7 +114,7 @@ class MpWorker(BrokerCli):
else:
raise Exception("what is " + str(dest))
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
retq = ExceptionalQueue(1)
retq_id = id(retq)
with self.retpend_mutex:

View File

@@ -5,7 +5,7 @@ import os
import threading
from .__init__ import TYPE_CHECKING
from .broker_util import BrokerCli, ExceptionalQueue, try_exec
from .broker_util import BrokerCli, ExceptionalQueue, NotExQueue
from .httpsrv import HttpSrv
from .util import HMaccas
@@ -13,7 +13,7 @@ if TYPE_CHECKING:
from .svchub import SvcHub
if True: # pylint: disable=using-constant-test
from typing import Any
from typing import Any, Union
class BrokerThr(BrokerCli):
@@ -34,6 +34,7 @@ class BrokerThr(BrokerCli):
self.iphash = HMaccas(os.path.join(self.args.E.cfg, "iphash"), 8)
self.httpsrv = HttpSrv(self, None)
self.reload = self.noop
self.reload_sessions = self.noop
def shutdown(self) -> None:
# self.log("broker", "shutting down")
@@ -42,19 +43,14 @@ class BrokerThr(BrokerCli):
def noop(self) -> None:
pass
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
# new ipc invoking managed service in hub
obj = self.hub
for node in dest.split("."):
obj = getattr(obj, node)
rv = try_exec(True, obj, *args)
# pretend we're broker_mp
retq = ExceptionalQueue(1)
retq.put(rv)
return retq
return NotExQueue(obj(*args)) # type: ignore
def say(self, dest: str, *args: Any) -> None:
if dest == "listen":
@@ -70,4 +66,4 @@ class BrokerThr(BrokerCli):
for node in dest.split("."):
obj = getattr(obj, node)
try_exec(False, obj, *args)
obj(*args) # type: ignore

View File

@@ -28,11 +28,23 @@ class ExceptionalQueue(Queue, object):
if rv[1] == "pebkac":
raise Pebkac(*rv[2:])
else:
raise Exception(rv[2])
raise rv[2]
return rv
class NotExQueue(object):
"""
BrokerThr uses this instead of ExceptionalQueue; 7x faster
"""
def __init__(self, rv: Any) -> None:
self.rv = rv
def get(self) -> Any:
return self.rv
class BrokerCli(object):
"""
helps mypy understand httpsrv.broker but still fails a few levels deeper,
@@ -48,7 +60,7 @@ class BrokerCli(object):
def __init__(self) -> None:
pass
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
return ExceptionalQueue(1)
def say(self, dest: str, *args: Any) -> None:
@@ -65,8 +77,8 @@ def try_exec(want_retval: Union[bool, int], func: Any, *args: list[Any]) -> Any:
return ["exception", "pebkac", ex.code, str(ex)]
except:
except Exception as ex:
if not want_retval:
raise
return ["exception", "stack", traceback.format_exc()]
return ["exception", "stack", ex]

View File

@@ -7,9 +7,9 @@ import shutil
import time
from .__init__ import ANYWIN
from .util import Netdev, runcmd, wrename, wunlink
from .util import Netdev, load_resource, runcmd, wrename, wunlink
HAVE_CFSSL = True
HAVE_CFSSL = not os.environ.get("PRTY_NO_CFSSL")
if True: # pylint: disable=using-constant-test
from .util import NamedLogger, RootLogger
@@ -29,13 +29,15 @@ def ensure_cert(log: "RootLogger", args) -> None:
i feel awful about this and so should they
"""
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
with load_resource(args.E, "res/insecure.pem") as f:
cert_insec = f.read()
cert_appdata = os.path.join(args.E.cfg, "cert.pem")
if not os.path.isfile(args.cert):
if cert_appdata != args.cert:
raise Exception("certificate file does not exist: " + args.cert)
shutil.copy(cert_insec, args.cert)
with open(args.cert, "wb") as f:
f.write(cert_insec)
with open(args.cert, "rb") as f:
buf = f.read()
@@ -50,7 +52,9 @@ def ensure_cert(log: "RootLogger", args) -> None:
raise Exception(m + "private key must appear before server certificate")
try:
if filecmp.cmp(args.cert, cert_insec):
with open(args.cert, "rb") as f:
active_cert = f.read()
if active_cert == cert_insec:
t = "using default TLS certificate; https will be insecure:\033[36m {}"
log("cert", t.format(args.cert), 3)
except:
@@ -151,14 +155,22 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
raise Exception("no useable cert found")
expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.5 > expiry
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
if expired:
raise Exception("old server-cert has expired")
for n in names:
if n not in inf["sans"]:
raise Exception("does not have {}".format(n))
if expired:
raise Exception("old server-cert has expired")
if not filecmp.cmp(args.cert, cert_insec):
with load_resource(args.E, "res/insecure.pem") as f:
cert_insec = f.read()
with open(args.cert, "rb") as f:
active_cert = f.read()
if active_cert and active_cert != cert_insec:
return
except Exception as ex:
log("cert", "will create new server-cert; {}".format(ex))

View File

@@ -2,7 +2,7 @@
from __future__ import print_function, unicode_literals
# awk -F\" '/add_argument\("-[^-]/{print(substr($2,2))}' copyparty/__main__.py | sort | tr '\n' ' '
zs = "a c e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vp e2vu ed emp i j lo mcr mte mth mtm mtp nb nc nid nih nw p q s ss sss v z zv"
zs = "a c e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vp e2vu ed emp i j lo mcr mte mth mtm mtp nb nc nid nih nth nw p q s ss sss v z zv"
onedash = set(zs.split())
@@ -12,8 +12,8 @@ def vf_bmap() -> dict[str, str]:
"dav_auth": "davauth",
"dav_rt": "davrt",
"ed": "dots",
"never_symlink": "neversymlink",
"no_dedup": "copydupes",
"hardlink_only": "hardlinkonly",
"no_dirsz": "nodirsz",
"no_dupe": "nodupe",
"no_forget": "noforget",
"no_pipe": "nopipe",
@@ -23,6 +23,7 @@ def vf_bmap() -> dict[str, str]:
"no_athumb": "dathumb",
}
for k in (
"dedup",
"dotsrch",
"e2d",
"e2ds",
@@ -58,6 +59,7 @@ def vf_vmap() -> dict[str, str]:
"no_hash": "nohash",
"no_idx": "noidx",
"re_maxage": "scan",
"safe_dedup": "safededup",
"th_convt": "convt",
"th_size": "thsize",
"th_crop": "crop",
@@ -129,10 +131,11 @@ permdescs = {
flagcats = {
"uploads, general": {
"dedup": "enable symlink-based file deduplication",
"hardlink": "enable hardlink-based file deduplication,\nwith fallback on symlinks when that is impossible",
"hardlinkonly": "dedup with hardlink only, never symlink;\nmake a full copy if hardlink is impossible",
"safededup": "verify on-disk data before using it for dedup",
"nodupe": "rejects existing files (instead of symlinking them)",
"hardlink": "does dedup with hardlinks instead of symlinks",
"neversymlink": "disables symlink fallback; full copy instead",
"copydupes": "disables dedup, always saves full copies of dupes",
"sparse": "force use of sparse files, mainly for s3-backed storage",
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
"nosub": "forces all uploads into the top folder of the vfs",
@@ -159,7 +162,7 @@ flagcats = {
"lifetime=3600": "uploads are deleted after 1 hour",
},
"database, general": {
"e2d": "enable database; makes files searchable + enables upload dedup",
"e2d": "enable database; makes files searchable + enables upload-undo",
"e2ds": "scan writable folders for new files on startup; also sets -e2d",
"e2dsa": "scans all folders for new files on startup; also sets -e2d",
"e2t": "enable multimedia indexing; makes it possible to search for tags",
@@ -177,7 +180,7 @@ flagcats = {
"noforget": "don't forget files when deleted from disk",
"fat32": "avoid excessive reindexing on android sdcardfs",
"dbd=[acid|swal|wal|yolo]": "database speed-durability tradeoff",
"xlink": "cross-volume dupe detection / linking",
"xlink": "cross-volume dupe detection / linking (dangerous)",
"xdev": "do not descend into other filesystems",
"xvol": "do not follow symlinks leaving the volume root",
"dotsrch": "show dotfiles in search results",

View File

@@ -9,12 +9,12 @@ import time
from .__init__ import ANYWIN, MACOS
from .authsrv import AXS, VFS
from .bos import bos
from .util import chkcmd, min_ex
from .util import chkcmd, min_ex, undot
if True: # pylint: disable=using-constant-test
from typing import Optional, Union
from .util import RootLogger
from .util import RootLogger, undot
class Fstab(object):
@@ -52,7 +52,7 @@ class Fstab(object):
self.log(msg.format(path, fs, min_ex()), 3)
return fs
path = path.lstrip("/")
path = undot(path)
try:
return self.cache[path]
except:
@@ -119,12 +119,12 @@ class Fstab(object):
self.srctab = srctab
def relabel(self, path: str, nval: str) -> None:
assert self.tab
assert self.tab # !rm
self.cache = {}
if ANYWIN:
path = self._winpath(path)
path = path.lstrip("/")
path = undot(path)
ptn = re.compile(r"^[^\\/]*")
vn, rem = self.tab._find(path)
if not self.trusted:
@@ -156,7 +156,7 @@ class Fstab(object):
self.log("failed to build tab:\n{}".format(min_ex()), 3)
self.build_fallback()
assert self.tab
assert self.tab # !rm
ret = self.tab._find(path)[0]
if self.trusted or path == ret.vpath:
return ret.realpath.split("/")[0]
@@ -167,6 +167,6 @@ class Fstab(object):
if not self.tab:
self.build_fallback()
assert self.tab
assert self.tab # !rm
ret = self.tab._find(path)[0]
return ret.realpath

View File

@@ -41,6 +41,9 @@ if True: # pylint: disable=using-constant-test
import typing
from typing import Any, Optional, Union
if PY2:
range = xrange # type: ignore
class FSE(FilesystemError):
def __init__(self, msg: str, severity: int = 0) -> None:
@@ -160,7 +163,7 @@ class FtpFs(AbstractedFS):
t = "Unsupported characters in [{}]"
raise FSE(t.format(vpath), 1)
fn = sanitize_fn(fn or "", "", [".prologue.html", ".epilogue.html"])
fn = sanitize_fn(fn or "", "")
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
if not vfs.realpath:
@@ -350,7 +353,7 @@ class FtpFs(AbstractedFS):
svp = join(self.cwd, src).lstrip("/")
dvp = join(self.cwd, dst).lstrip("/")
try:
self.hub.up2k.handle_mv(self.uname, svp, dvp)
self.hub.up2k.handle_mv(self.uname, self.h.cli_ip, svp, dvp)
except Exception as ex:
raise FSE(str(ex))
@@ -468,6 +471,9 @@ class FtpHandler(FTPHandler):
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
None,
None,
self.hub.up2k,
"xbu.ftpd",
xbu,
ap,
vp,
@@ -477,7 +483,7 @@ class FtpHandler(FTPHandler):
0,
0,
self.cli_ip,
0,
time.time(),
"",
):
raise FSE("Upload blocked by xbu server config")
@@ -580,9 +586,15 @@ class Ftpd(object):
if "::" in ips:
ips.append("0.0.0.0")
ips = [x for x in ips if "unix:" not in x]
if self.args.ftp4:
ips = [x for x in ips if ":" not in x]
if not ips:
lgr.fatal("cannot start ftp-server; no compatible IPs in -i")
return
ips = list(ODict.fromkeys(ips)) # dedup
ioloop = IOLoop()

File diff suppressed because it is too large

View File

@@ -9,6 +9,9 @@ import threading # typechk
import time
try:
if os.environ.get("PRTY_NO_TLS"):
raise Exception()
HAVE_SSL = True
import ssl
except:
@@ -100,9 +103,6 @@ class HttpConn(object):
self.log_src = ("%s \033[%dm%d" % (ip, color, self.addr[1])).ljust(26)
return self.log_src
def respath(self, res_name: str) -> str:
return os.path.join(self.E.mod, "web", res_name)
def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func(self.log_src, msg, c)
@@ -162,6 +162,7 @@ class HttpConn(object):
self.log_src = self.log_src.replace("[36m", "[35m")
try:
assert ssl # type: ignore # !rm
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.load_cert_chain(self.args.cert)
if self.args.ssl_ver:
@@ -187,7 +188,7 @@ class HttpConn(object):
if self.args.ssl_dbg and hasattr(self.s, "shared_ciphers"):
ciphers = self.s.shared_ciphers()
assert ciphers
assert ciphers # !rm
overlap = [str(y[::-1]) for y in ciphers]
self.log("TLS cipher overlap:" + "\n".join(overlap))
for k, v in [

View File

@@ -1,7 +1,6 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import base64
import math
import os
import re
@@ -12,7 +11,7 @@ import time
import queue
from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
from .__init__ import ANYWIN, CORES, EXE, MACOS, PY2, TYPE_CHECKING, EnvParams, unicode
try:
MNFE = ModuleNotFoundError
@@ -67,14 +66,16 @@ from .util import (
Magician,
Netdev,
NetMap,
absreal,
build_netmap,
has_resource,
ipnorm,
load_resource,
min_ex,
shut_socket,
spack,
start_log_thrs,
start_stackmon,
ub64enc,
)
if TYPE_CHECKING:
@@ -84,6 +85,17 @@ if TYPE_CHECKING:
if True: # pylint: disable=using-constant-test
from typing import Any, Optional
if PY2:
range = xrange # type: ignore
if not hasattr(socket, "AF_UNIX"):
setattr(socket, "AF_UNIX", -9001)
def load_jinja2_resource(E: EnvParams, name: str):
with load_resource(E, "web/" + name, "r") as f:
return f.read()
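jinja2.FunctionLoader accepts any callable that maps a template name to its source string (or None if unknown), which is what lets the templates come from bundled resources instead of a real web/ directory on disk. A minimal usage sketch of the same idea, reusing the load_jinja2_resource helper defined just above:

import jinja2

env = jinja2.Environment()
env.loader = jinja2.FunctionLoader(lambda name: load_jinja2_resource(E, name))
tpl = env.get_template("splash.html")  # same names as the jn list below, plus ".html"
html = tpl.render()  # actual render arguments omitted; they vary per template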
class HttpSrv(object):
"""
@@ -146,22 +158,30 @@ class HttpSrv(object):
self.u2idx_free: dict[str, U2idx] = {}
self.u2idx_n = 0
assert jinja2 # type: ignore # !rm
env = jinja2.Environment()
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
env.loader = jinja2.FunctionLoader(lambda f: load_jinja2_resource(self.E, f))
jn = [
"splash",
"shares",
"svcs",
"browser",
"browser2",
"msg",
"md",
"mde",
"cf",
]
self.j2 = {x: env.get_template(x + ".html") for x in jn}
zs = os.path.join(self.E.mod, "web", "deps", "prism.js.gz")
self.prism = os.path.exists(zs)
self.prism = has_resource(self.E, "web/deps/prism.js.gz")
self.ipa_nm = build_netmap(self.args.ipa)
self.xff_nm = build_netmap(self.args.xff_src)
self.xff_lan = build_netmap("lan")
self.statics: set[str] = set()
self._build_statics()
self.ptn_cc = re.compile(r"[\x00-\x1f]")
self.ptn_hsafe = re.compile(r"[\x00-\x1f<>\"'&]")
self.uparam_cc_ok = set("doc move tree".split())
self.mallow = "GET HEAD POST PUT DELETE OPTIONS".split()
if not self.args.no_dav:
@@ -193,14 +213,6 @@ class HttpSrv(object):
except:
pass
def _build_statics(self) -> None:
for dp, _, df in os.walk(os.path.join(self.E.mod, "web")):
for fn in df:
ap = absreal(os.path.join(dp, fn))
self.statics.add(ap)
if ap.endswith(".gz"):
self.statics.add(ap[:-3])
def set_netdevs(self, netdevs: dict[str, Netdev]) -> None:
ips = set()
for ip, _ in self.bound:
@@ -221,7 +233,7 @@ class HttpSrv(object):
if self.args.log_htp:
self.log(self.name, "workers -= {} = {}".format(n, self.tp_nthr), 6)
assert self.tp_q
assert self.tp_q # !rm
for _ in range(n):
self.tp_q.put(None)
@@ -240,15 +252,24 @@ class HttpSrv(object):
return
def listen(self, sck: socket.socket, nlisteners: int) -> None:
tcp = sck.family != socket.AF_UNIX
if self.args.j != 1:
# lost in the pickle; redefine
if not ANYWIN or self.args.reuseaddr:
sck.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sck.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
if tcp:
sck.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
sck.settimeout(None) # < does not inherit, ^ opts above do
ip, port = sck.getsockname()[:2]
if tcp:
ip, port = sck.getsockname()[:2]
else:
ip = re.sub(r"\.[0-9]+$", "", sck.getsockname().split("/")[-1])
port = 0
self.srvs.append(sck)
self.bound.add((ip, port))
self.nclimax = math.ceil(self.args.nc * 1.0 / nlisteners)
@@ -260,10 +281,19 @@ class HttpSrv(object):
def thr_listen(self, srv_sck: socket.socket) -> None:
"""listens on a shared tcp server"""
ip, port = srv_sck.getsockname()[:2]
fno = srv_sck.fileno()
hip = "[{}]".format(ip) if ":" in ip else ip
msg = "subscribed @ {}:{} f{} p{}".format(hip, port, fno, os.getpid())
if srv_sck.family == socket.AF_UNIX:
ip = re.sub(r"\.[0-9]+$", "", srv_sck.getsockname())
msg = "subscribed @ %s f%d p%d" % (ip, fno, os.getpid())
ip = ip.split("/")[-1]
port = 0
tcp = False
else:
tcp = True
ip, port = srv_sck.getsockname()[:2]
hip = "[%s]" % (ip,) if ":" in ip else ip
msg = "subscribed @ %s:%d f%d p%d" % (hip, port, fno, os.getpid())
self.log(self.name, msg)
Daemon(self.broker.say, "sig-hsrv-up1", ("cb_httpsrv_up",))
@@ -335,11 +365,13 @@ class HttpSrv(object):
try:
sck, saddr = srv_sck.accept()
cip = unicode(saddr[0])
if cip.startswith("::ffff:"):
cip = cip[7:]
addr = (cip, saddr[1])
if tcp:
cip = unicode(saddr[0])
if cip.startswith("::ffff:"):
cip = cip[7:]
addr = (cip, saddr[1])
else:
addr = ("127.8.3.7", sck.fileno())
except (OSError, socket.error) as ex:
if self.stopping:
break
@@ -395,7 +427,7 @@ class HttpSrv(object):
)
def thr_poolw(self) -> None:
assert self.tp_q
assert self.tp_q # !rm
while True:
task = self.tp_q.get()
if not task:
@@ -507,8 +539,8 @@ class HttpSrv(object):
except:
pass
v = base64.urlsafe_b64encode(spack(b">xxL", int(v)))
self.cb_v = v.decode("ascii")[-4:]
# spack gives 4 lsb, take 3 lsb, get 4 ch
self.cb_v = ub64enc(spack(b">L", int(v))[1:]).decode("ascii")
self.cb_ts = time.time()
return self.cb_v

View File

@@ -74,7 +74,7 @@ class Ico(object):
try:
_, _, tw, th = pb.textbbox((0, 0), ext)
except:
tw, th = pb.textsize(ext)
tw, th = pb.textsize(ext) # type: ignore
tw += len(ext)
cw = tw // len(ext)

View File

@@ -88,7 +88,7 @@ class Metrics(object):
addg("cpp_total_bans", str(self.hsrv.nban), t)
if not args.nos_vst:
x = self.hsrv.broker.ask("up2k.get_state")
x = self.hsrv.broker.ask("up2k.get_state", True, "")
vs = json.loads(x.get())
nvidle = 0

View File

@@ -32,6 +32,17 @@ if True: # pylint: disable=using-constant-test
from .util import NamedLogger, RootLogger
try:
if os.environ.get("PRTY_NO_MUTAGEN"):
raise Exception()
from mutagen import version # noqa: F401
HAVE_MUTAGEN = True
except:
HAVE_MUTAGEN = False
def have_ff(scmd: str) -> bool:
if ANYWIN:
scmd += ".exe"
@@ -48,8 +59,8 @@ def have_ff(scmd: str) -> bool:
return bool(shutil.which(scmd))
HAVE_FFMPEG = have_ff("ffmpeg")
HAVE_FFPROBE = have_ff("ffprobe")
HAVE_FFMPEG = not os.environ.get("PRTY_NO_FFMPEG") and have_ff("ffmpeg")
HAVE_FFPROBE = not os.environ.get("PRTY_NO_FFPROBE") and have_ff("ffprobe")
class MParser(object):
@@ -336,9 +347,7 @@ class MTag(object):
if self.backend == "mutagen":
self._get = self.get_mutagen
try:
from mutagen import version # noqa: F401
except:
if not HAVE_MUTAGEN:
self.log("could not load Mutagen, trying FFprobe instead", c=3)
self.backend = "ffprobe"
@@ -578,7 +587,7 @@ class MTag(object):
continue
if k == ".aq":
v /= 1000
v /= 1000 # type: ignore
if k == "ac" and v.startswith("mp4a.40."):
v = "aac"

View File

@@ -4,11 +4,21 @@ from __future__ import print_function, unicode_literals
import argparse
import base64
import hashlib
import os
import sys
import threading
from .__init__ import unicode
try:
if os.environ.get("PRTY_NO_ARGON2"):
raise Exception()
HAVE_ARGON2 = True
from argon2 import __version__ as argon2ver
except:
HAVE_ARGON2 = False
class PWHash(object):
def __init__(self, args: argparse.Namespace):

View File

@@ -12,7 +12,7 @@ from types import SimpleNamespace
from .__init__ import ANYWIN, EXE, TYPE_CHECKING
from .authsrv import LEELOO_DALLAS, VFS
from .bos import bos
from .util import Daemon, min_ex, pybin, runhook
from .util import Daemon, absreal, min_ex, pybin, runhook, vjoin
if True: # pylint: disable=using-constant-test
from typing import Any, Union
@@ -151,6 +151,8 @@ class SMB(object):
def _uname(self) -> str:
if self.noacc:
return LEELOO_DALLAS
if not self.asrv.acct:
return "*"
try:
# you found it! my single worst bit of code so far
@@ -187,7 +189,9 @@ class SMB(object):
debug('%s("%s", %s) %s @%s\033[K\033[0m', caller, vpath, str(a), perms, uname)
vfs, rem = self.asrv.vfs.get(vpath, uname, *perms)
return vfs, vfs.canonical(rem)
if not vfs.realpath:
raise Exception("unmapped vfs")
return vfs, vjoin(vfs.realpath, rem)
def _listdir(self, vpath: str, *a: Any, **ka: Any) -> list[str]:
vpath = vpath.replace("\\", "/").lstrip("/")
@@ -195,6 +199,8 @@ class SMB(object):
uname = self._uname()
# debug('listdir("%s", %s) @%s\033[K\033[0m', vpath, str(a), uname)
vfs, rem = self.asrv.vfs.get(vpath, uname, False, False)
if not vfs.realpath:
raise Exception("unmapped vfs")
_, vfs_ls, vfs_virt = vfs.ls(
rem, uname, not self.args.no_scandir, [[False, False]]
)
@@ -209,7 +215,7 @@ class SMB(object):
sz = 112 * 2 # ['.', '..']
for n, fn in enumerate(ls):
if sz >= 64000:
t = "listing only %d of %d files (%d byte) in /%s; see impacket#1433"
t = "listing only %d of %d files (%d byte) in /%s for performance; see --smb-nwa-1"
warning(t, n, len(ls), sz, vpath)
break
@@ -238,9 +244,24 @@ class SMB(object):
t = "blocked write (no-write-acc %s): /%s @%s"
yeet(t % (vfs.axs.uwrite, vpath, uname))
ap = absreal(ap)
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
self.nlog, xbu, ap, vpath, "", "", "", 0, 0, "1.7.6.2", 0, ""
self.nlog,
None,
self.hub.up2k,
"xbu.smb",
xbu,
ap,
vpath,
"",
"",
"",
0,
0,
"1.7.6.2",
time.time(),
"",
):
yeet("blocked by xbu server config: " + vpath)
@@ -297,7 +318,7 @@ class SMB(object):
t = "blocked rename (no-move-acc %s): /%s @%s"
yeet(t % (vfs1.axs.umove, vp1, uname))
self.hub.up2k.handle_mv(uname, vp1, vp2)
self.hub.up2k.handle_mv(uname, "1.7.6.2", vp1, vp2)
try:
bos.makedirs(ap2)
except:

View File

@@ -5,11 +5,11 @@ import errno
import re
import select
import socket
from email.utils import formatdate
import time
from .__init__ import TYPE_CHECKING
from .multicast import MC_Sck, MCast
from .util import CachedSet, html_escape, min_ex
from .util import CachedSet, formatdate, html_escape, min_ex
if TYPE_CHECKING:
from .broker_util import BrokerCli
@@ -229,7 +229,7 @@ CONFIGID.UPNP.ORG: 1
"""
v4 = srv.ip.replace("::ffff:", "")
zs = zs.format(formatdate(usegmt=True), v4, srv.hport, self.args.zsid)
zs = zs.format(formatdate(), v4, srv.hport, self.args.zsid)
zb = zs[1:].replace("\n", "\r\n").encode("utf-8", "replace")
srv.sck.sendto(zb, addr[:2])

View File

@@ -12,6 +12,12 @@ from .label import DNSBuffer, DNSLabel
from .ranges import IP4, IP6, H, I, check_bytes
try:
range = xrange
except:
pass
class DNSError(Exception):
pass

View File

@@ -11,7 +11,21 @@ import os
from ._shared import IP, Adapter
if os.name == "nt":
def nope(include_unconfigured=False):
return []
try:
S390X = os.uname().machine == "s390x"
except:
S390X = False
if os.environ.get("PRTY_NO_IFADDR") or S390X:
# s390x deadlocks at libc.getifaddrs
get_adapters = nope
elif os.name == "nt":
from ._win32 import get_adapters
elif os.name == "posix":
from ._posix import get_adapters

View File

@@ -17,6 +17,7 @@ if not PY2:
U: Callable[[str], str] = str
else:
U = unicode # noqa: F821 # pylint: disable=undefined-variable,self-assigning-variable
range = xrange # noqa: F821 # pylint: disable=undefined-variable,self-assigning-variable
class Adapter(object):

View File

@@ -16,6 +16,11 @@ if True: # pylint: disable=using-constant-test
from typing import Callable, List, Optional, Tuple, Union
try:
range = xrange
except:
pass
def num_char_count_bits(ver: int) -> int:
return 16 if (ver + 7) // 17 else 8
@@ -589,3 +594,20 @@ def _get_bit(x: int, i: int) -> bool:
class DataTooLongError(ValueError):
pass
def qr2svg(qr: QrCode, border: int) -> str:
parts: list[str] = []
for y in range(qr.size):
sy = border + y
for x in range(qr.size):
if qr.modules[y][x]:
parts.append("M%d,%dh1v1h-1z" % (border + x, sy))
t = """\
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" version="1.1" viewBox="0 0 {0} {0}" stroke="none">
<rect width="100%" height="100%" fill="#F7F7F7"/>
<path d="{1}" fill="#111111"/>
</svg>
"""
return t.format(qr.size + border * 2, " ".join(parts))
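qr2svg walks the module matrix and emits one 1x1 path square per dark module inside a quiet-zone border. Assuming the usual qrcodegen API for building the code itself (encode_text is not shown in this hunk and is an assumption here), usage would be along the lines of:

qr = QrCode.encode_text("https://example.com/share/abc", QrCode.Ecc.MEDIUM)  # assumed API
svg = qr2svg(qr, 2)
with open("share.svg", "w") as f:
    f.write(svg)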

View File

@@ -6,7 +6,7 @@ import tempfile
from datetime import datetime
from .__init__ import CORES
from .authsrv import AuthSrv, VFS
from .authsrv import VFS, AuthSrv
from .bos import bos
from .th_cli import ThumbCli
from .util import UTC, vjoin, vol_san

View File

@@ -2,8 +2,6 @@
from __future__ import print_function, unicode_literals
import argparse
import base64
import calendar
import errno
import gzip
import logging
@@ -16,7 +14,7 @@ import string
import sys
import threading
import time
from datetime import datetime, timedelta
from datetime import datetime
# from inspect import currentframe
# print(currentframe().f_lineno)
@@ -28,18 +26,30 @@ if True: # pylint: disable=using-constant-test
import typing
from typing import Any, Optional, Union
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, E, EnvParams, unicode
from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, E, EnvParams, unicode
from .authsrv import BAD_CFG, AuthSrv
from .cert import ensure_cert
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, HAVE_MUTAGEN
from .pwhash import HAVE_ARGON2
from .tcpsrv import TcpSrv
from .th_srv import HAVE_PIL, HAVE_VIPS, HAVE_WEBP, ThumbSrv
from .th_srv import (
HAVE_AVIF,
HAVE_FFMPEG,
HAVE_FFPROBE,
HAVE_HEIF,
HAVE_PIL,
HAVE_VIPS,
HAVE_WEBP,
ThumbSrv,
)
from .up2k import Up2k
from .util import (
DEF_EXP,
DEF_MTE,
DEF_MTH,
FFMPEG_URL,
HAVE_PSUTIL,
HAVE_SQLITE3,
UTC,
VERSIONS,
Daemon,
@@ -56,6 +66,7 @@ from .util import (
pybin,
start_log_thrs,
start_stackmon,
ub64enc,
)
if TYPE_CHECKING:
@@ -65,6 +76,9 @@ if TYPE_CHECKING:
except:
pass
if PY2:
range = xrange # type: ignore
class SvcHub(object):
"""
@@ -89,8 +103,10 @@ class SvcHub(object):
self.argv = argv
self.E: EnvParams = args.E
self.no_ansi = args.no_ansi
self.tz = UTC if args.log_utc else None
self.logf: Optional[typing.TextIO] = None
self.logf_base_fn = ""
self.is_dut = False # running in unittest; always False
self.stop_req = False
self.stopping = False
self.stopped = False
@@ -102,7 +118,8 @@ class SvcHub(object):
self.httpsrv_up = 0
self.log_mutex = threading.Lock()
self.next_day = 0
self.cday = 0
self.cmon = 0
self.tstack = 0.0
self.iphash = HMaccas(os.path.join(self.E.cfg, "iphash"), 8)
@@ -193,6 +210,23 @@ class SvcHub(object):
t = "WARNING: --s-rd-sz (%d) is larger than --iobuf (%d); this may lead to reduced performance"
self.log("root", t % (args.s_rd_sz, args.iobuf), 3)
if args.chpw and args.idp_h_usr:
t = "ERROR: user-changeable passwords is incompatible with IdP/identity-providers; you must disable either --chpw or --idp-h-usr"
self.log("root", t, 1)
raise Exception(t)
noch = set()
for zs in args.chpw_no or []:
zsl = [x.strip() for x in zs.split(",")]
noch.update([x for x in zsl if x])
args.chpw_no = noch
if not self.args.no_ses:
self.setup_session_db()
if args.shr:
self.setup_share_db()
bri = "zy"[args.theme % 2 :][:1]
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
args.theme = "{0}{1} {0} {1}".format(ch, bri)
@@ -232,6 +266,8 @@ class SvcHub(object):
self.up2k = Up2k(self)
self._feature_test()
decs = {k: 1 for k in self.args.th_dec.split(",")}
if not HAVE_VIPS:
decs.pop("vips", None)
@@ -336,6 +372,151 @@ class SvcHub(object):
self.broker = Broker(self)
def setup_session_db(self) -> None:
if not HAVE_SQLITE3:
self.args.no_ses = True
t = "WARNING: sqlite3 not available; disabling sessions, will use plaintext passwords in cookies"
self.log("root", t, 3)
return
import sqlite3
create = True
db_path = self.args.ses_db
self.log("root", "opening sessions-db %s" % (db_path,))
for n in range(2):
try:
db = sqlite3.connect(db_path)
cur = db.cursor()
try:
cur.execute("select count(*) from us").fetchone()
create = False
break
except:
pass
except Exception as ex:
if n:
raise
t = "sessions-db corrupt; deleting and recreating: %r"
self.log("root", t % (ex,), 3)
try:
cur.close() # type: ignore
except:
pass
try:
db.close() # type: ignore
except:
pass
os.unlink(db_path)
sch = [
r"create table kv (k text, v int)",
r"create table us (un text, si text, t0 int)",
# username, session-id, creation-time
r"create index us_un on us(un)",
r"create index us_si on us(si)",
r"create index us_t0 on us(t0)",
r"insert into kv values ('sver', 1)",
]
assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
for cmd in sch:
cur.execute(cmd)
self.log("root", "created new sessions-db")
db.commit()
cur.close()
db.close()
def setup_share_db(self) -> None:
al = self.args
if not HAVE_SQLITE3:
self.log("root", "sqlite3 not available; disabling --shr", 1)
al.shr = ""
return
import sqlite3
al.shr = al.shr.strip("/")
if "/" in al.shr or not al.shr:
t = "config error: --shr must be the name of a virtual toplevel directory to put shares inside"
self.log("root", t, 1)
raise Exception(t)
al.shr = "/%s/" % (al.shr,)
create = True
modified = False
db_path = self.args.shr_db
self.log("root", "opening shares-db %s" % (db_path,))
for n in range(2):
try:
db = sqlite3.connect(db_path)
cur = db.cursor()
try:
cur.execute("select count(*) from sh").fetchone()
create = False
break
except:
pass
except Exception as ex:
if n:
raise
t = "shares-db corrupt; deleting and recreating: %r"
self.log("root", t % (ex,), 3)
try:
cur.close() # type: ignore
except:
pass
try:
db.close() # type: ignore
except:
pass
os.unlink(db_path)
sch1 = [
r"create table kv (k text, v int)",
r"create table sh (k text, pw text, vp text, pr text, st int, un text, t0 int, t1 int)",
# sharekey, password, src, perms, numFiles, owner, created, expires
]
sch2 = [
r"create table sf (k text, vp text)",
r"create index sf_k on sf(k)",
r"create index sh_k on sh(k)",
r"create index sh_t1 on sh(t1)",
]
assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
dver = 2
modified = True
for cmd in sch1 + sch2:
cur.execute(cmd)
self.log("root", "created new shares-db")
else:
(dver,) = cur.execute("select v from kv where k = 'sver'").fetchall()[0]
if dver == 1:
modified = True
for cmd in sch2:
cur.execute(cmd)
cur.execute("update sh set st = 0")
self.log("root", "shares-db schema upgrade ok")
if modified:
for cmd in [
r"delete from kv where k = 'sver'",
r"insert into kv values ('sver', %d)" % (2,),
]:
cur.execute(cmd)
db.commit()
cur.close()
db.close()
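With this schema, a share is one row in sh (sharekey, password, source path, perms, file-count, owner, created, expires) and any per-file selection lives in sf under the same sharekey. As a sketch, listing the shares that are still valid (assuming a non-expiring share stores 0 in t1, which is what the falsy-t1 check in authsrv implies) could look like:

import sqlite3, time

db = sqlite3.connect(args.shr_db)  # the same file managed above
q = "select k, vp, un, t1 from sh where t1 = 0 or t1 > ?"
for k, vp, un, t1 in db.execute(q, (int(time.time()),)):
    print("share %s by %s -> /%s" % (k, un, vp))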
def start_ftpd(self) -> None:
time.sleep(30)
@@ -420,6 +601,58 @@ class SvcHub(object):
Daemon(self.sd_notify, "sd-notify")
def _feature_test(self) -> None:
fok = []
fng = []
t_ff = "transcode audio, create spectrograms, video thumbnails"
to_check = [
(HAVE_SQLITE3, "sqlite", "sessions and file/media indexing"),
(HAVE_PIL, "pillow", "image thumbnails (plenty fast)"),
(HAVE_VIPS, "vips", "image thumbnails (faster, eats more ram)"),
(HAVE_WEBP, "pillow-webp", "create thumbnails as webp files"),
(HAVE_FFMPEG, "ffmpeg", t_ff + ", good-but-slow image thumbnails"),
(HAVE_FFPROBE, "ffprobe", t_ff + ", read audio/media tags"),
(HAVE_MUTAGEN, "mutagen", "read audio tags (ffprobe is better but slower)"),
(HAVE_ARGON2, "argon2", "secure password hashing (advanced users only)"),
(HAVE_HEIF, "pillow-heif", "read .heif images with pillow (rarely useful)"),
(HAVE_AVIF, "pillow-avif", "read .avif images with pillow (rarely useful)"),
]
if ANYWIN:
to_check += [
(HAVE_PSUTIL, "psutil", "improved plugin cleanup (rarely useful)")
]
verbose = self.args.deps
if verbose:
self.log("dependencies", "")
for have, feat, what in to_check:
lst = fok if have else fng
lst.append((feat, what))
if verbose:
zi = 2 if have else 5
sgot = "found" if have else "missing"
t = "%7s: %s \033[36m(%s)"
self.log("dependencies", t % (sgot, feat, what), zi)
if verbose:
self.log("dependencies", "")
return
sok = ", ".join(x[0] for x in fok)
sng = ", ".join(x[0] for x in fng)
t = ""
if sok:
t += "OK: \033[32m" + sok
if sng:
if t:
t += ", "
t += "\033[0mNG: \033[35m" + sng
t += "\033[0m, see --deps"
self.log("dependencies", t, 6)
def _check_env(self) -> None:
try:
files = os.listdir(E.cfg)
@@ -620,7 +853,7 @@ class SvcHub(object):
self.args.nc = min(self.args.nc, soft // 2)
def _logname(self) -> str:
dt = datetime.now(UTC)
dt = datetime.now(self.tz)
fn = str(self.args.lo)
for fs in "YmdHMS":
fs = "%" + fs
@@ -746,18 +979,21 @@ class SvcHub(object):
Daemon(self._reload, "reloading")
return "reload initiated"
def _reload(self, rescan_all_vols: bool = True) -> None:
def _reload(self, rescan_all_vols: bool = True, up2k: bool = True) -> None:
with self.up2k.mutex:
if self.reloading != 1:
return
self.reloading = 2
self.log("root", "reloading config")
self.asrv.reload()
self.up2k.reload(rescan_all_vols)
self.asrv.reload(9 if up2k else 4)
if up2k:
self.up2k.reload(rescan_all_vols)
else:
self.log("root", "reload done")
self.broker.reload()
self.reloading = 0
def _reload_blocking(self, rescan_all_vols: bool = True) -> None:
def _reload_blocking(self, rescan_all_vols: bool = True, up2k: bool = True) -> None:
while True:
with self.up2k.mutex:
if self.reloading < 2:
@@ -768,7 +1004,12 @@ class SvcHub(object):
# try to handle multiple pending IdP reloads at once:
time.sleep(0.2)
self._reload(rescan_all_vols=rescan_all_vols)
self._reload(rescan_all_vols=rescan_all_vols, up2k=up2k)
def _reload_sessions(self) -> None:
with self.asrv.mutex:
self.asrv.load_sessions(True)
self.broker.reload_sessions()
def stop_thr(self) -> None:
while not self.stop_req:
@@ -890,12 +1131,12 @@ class SvcHub(object):
return
with self.log_mutex:
zd = datetime.now(UTC)
dt = datetime.now(self.tz)
ts = self.log_dfmt % (
zd.year,
zd.month * 100 + zd.day,
(zd.hour * 100 + zd.minute) * 100 + zd.second,
zd.microsecond // self.log_div,
dt.year,
dt.month * 100 + dt.day,
(dt.hour * 100 + dt.minute) * 100 + dt.second,
dt.microsecond // self.log_div,
)
if c and not self.args.no_ansi:
@@ -916,41 +1157,26 @@ class SvcHub(object):
if not self.args.no_logflush:
self.logf.flush()
now = time.time()
if int(now) >= self.next_day:
self._set_next_day()
if dt.day != self.cday or dt.month != self.cmon:
self._set_next_day(dt)
def _set_next_day(self) -> None:
if self.next_day and self.logf and self.logf_base_fn != self._logname():
def _set_next_day(self, dt: datetime) -> None:
if self.cday and self.logf and self.logf_base_fn != self._logname():
self.logf.close()
self._setup_logfile("")
dt = datetime.now(UTC)
# unix timestamp of next 00:00:00 (leap-seconds safe)
day_now = dt.day
while dt.day == day_now:
dt += timedelta(hours=12)
dt = dt.replace(hour=0, minute=0, second=0)
try:
tt = dt.utctimetuple()
except:
# still makes me hella uncomfortable
tt = dt.timetuple()
self.next_day = calendar.timegm(tt)
self.cday = dt.day
self.cmon = dt.month
def _log_enabled(self, src: str, msg: str, c: Union[int, str] = 0) -> None:
"""handles logging from all components"""
with self.log_mutex:
now = time.time()
if int(now) >= self.next_day:
dt = datetime.fromtimestamp(now, UTC)
dt = datetime.now(self.tz)
if dt.day != self.cday or dt.month != self.cmon:
zs = "{}\n" if self.no_ansi else "\033[36m{}\033[0m\n"
zs = zs.format(dt.strftime("%Y-%m-%d"))
print(zs, end="")
self._set_next_day()
self._set_next_day(dt)
if self.logf:
self.logf.write(zs)
@@ -969,12 +1195,11 @@ class SvcHub(object):
else:
msg = "%s%s\033[0m" % (c, msg)
zd = datetime.fromtimestamp(now, UTC)
ts = self.log_efmt % (
zd.hour,
zd.minute,
zd.second,
zd.microsecond // self.log_div,
dt.hour,
dt.minute,
dt.second,
dt.microsecond // self.log_div,
)
msg = fmt % (ts, src, msg)
try:
@@ -1072,5 +1297,5 @@ class SvcHub(object):
zs = "{}\n{}".format(VERSIONS, alltrace())
zb = zs.encode("utf-8", "replace")
zb = gzip.compress(zb)
zs = base64.b64encode(zb).decode("ascii")
zs = ub64enc(zb).decode("ascii")
self.log("stacks", zs)

View File

@@ -37,9 +37,7 @@ def dostime2unix(buf: bytes) -> int:
def unixtime2dos(ts: int) -> bytes:
tt = time.gmtime(ts + 1)
dy, dm, dd, th, tm, ts = list(tt)[:6]
dy, dm, dd, th, tm, ts, _, _, _ = time.gmtime(ts + 1)
bd = ((dy - 1980) << 9) + (dm << 5) + dd
bt = (th << 11) + (tm << 5) + ts // 2
try:
@@ -107,7 +105,7 @@ def gen_hdr(
ret += spack(b"<LL", vsz, vsz)
# windows support (the "?" replace below too)
fn = sanitize_fn(fn, "/", [])
fn = sanitize_fn(fn, "/")
bfn = fn.encode("utf-8" if utf8 else "cp437", "replace").replace(b"?", b"_")
# add ntfs (0x24) and/or unix (0x10) extrafields for utc, add z64 if requested

View File

@@ -17,18 +17,23 @@ from .util import (
E_UNREACH,
HAVE_IPV6,
IP6ALL,
VF_CAREFUL,
Netdev,
atomic_move,
min_ex,
sunpack,
termsize,
)
if True:
from typing import Generator
from typing import Generator, Union
if TYPE_CHECKING:
from .svchub import SvcHub
if not hasattr(socket, "AF_UNIX"):
setattr(socket, "AF_UNIX", -9001)
if not hasattr(socket, "IPPROTO_IPV6"):
setattr(socket, "IPPROTO_IPV6", 41)
@@ -217,14 +222,41 @@ class TcpSrv(object):
if self.args.qr or self.args.qrs:
self.qr = self._qr(qr1, qr2)
def nlog(self, msg: str, c: Union[int, str] = 0) -> None:
self.log("tcpsrv", msg, c)
def _listen(self, ip: str, port: int) -> None:
ipv = socket.AF_INET6 if ":" in ip else socket.AF_INET
uds_perm = uds_gid = -1
if "unix:" in ip:
tcp = False
ipv = socket.AF_UNIX
uds = ip.split(":")
ip = uds[-1]
if len(uds) > 2:
uds_perm = int(uds[1], 8)
if len(uds) > 3:
try:
uds_gid = int(uds[2])
except:
import grp
uds_gid = grp.getgrnam(uds[2]).gr_gid
elif ":" in ip:
tcp = True
ipv = socket.AF_INET6
else:
tcp = True
ipv = socket.AF_INET
srv = socket.socket(ipv, socket.SOCK_STREAM)
if not ANYWIN or self.args.reuseaddr:
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
if tcp:
srv.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
srv.settimeout(None) # < does not inherit, ^ opts above do
try:
@@ -236,8 +268,25 @@ class TcpSrv(object):
srv.setsockopt(socket.SOL_IP, socket.IP_FREEBIND, 1)
try:
srv.bind((ip, port))
sport = srv.getsockname()[1]
if tcp:
srv.bind((ip, port))
else:
if ANYWIN or self.args.rm_sck:
if os.path.exists(ip):
os.unlink(ip)
srv.bind(ip)
else:
tf = "%s.%d" % (ip, os.getpid())
if os.path.exists(tf):
os.unlink(tf)
srv.bind(tf)
if uds_gid != -1:
os.chown(tf, -1, uds_gid)
if uds_perm != -1:
os.chmod(tf, uds_perm)
atomic_move(self.nlog, tf, ip, VF_CAREFUL)
sport = srv.getsockname()[1] if tcp else port
if port != sport:
# linux 6.0.16 lets you bind a port which is in use
# except it just gives you a random port instead
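The non --rm-sck branch above gives unix sockets an atomic replace: bind to a temporary "<path>.<pid>" name, apply ownership and mode while nothing can reach it yet, then move it over the real path in one step. Roughly the same pattern outside copyparty, with os.replace standing in for the internal atomic_move helper:

import os, socket

srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
tmp = "%s.%d" % (path, os.getpid())  # path = the target socket path from -i unix:...
srv.bind(tmp)
os.chmod(tmp, 0o770)     # perms/group applied before any client can see the socket
os.replace(tmp, path)    # atomic as long as both names are on the same filesystem
srv.listen(128)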
@@ -249,12 +298,23 @@ class TcpSrv(object):
except:
pass
e = ""
if ex.errno in E_ADDR_IN_USE:
e = "\033[1;31mport {} is busy on interface {}\033[0m".format(port, ip)
if not tcp:
e = "\033[1;31munix-socket {} is busy\033[0m".format(ip)
elif ex.errno in E_ADDR_NOT_AVAIL:
e = "\033[1;31minterface {} does not exist\033[0m".format(ip)
else:
if not e:
if not tcp:
t = "\n\n\n NOTE: this crash may be due to a unix-socket bug; try --rm-sck\n"
self.log("tcpsrv", t, 2)
raise
if not tcp and not self.args.rm_sck:
e += "; maybe this is a bug? try --rm-sck"
raise Exception(e)
def run(self) -> None:
@@ -262,7 +322,14 @@ class TcpSrv(object):
bound: list[tuple[str, int]] = []
srvs: list[socket.socket] = []
for srv in self.srv:
ip, port = srv.getsockname()[:2]
if srv.family == socket.AF_UNIX:
tcp = False
ip = re.sub(r"\.[0-9]+$", "", srv.getsockname())
port = 0
else:
tcp = True
ip, port = srv.getsockname()[:2]
if ip == IP6ALL:
ip = "::" # jython
@@ -294,8 +361,12 @@ class TcpSrv(object):
bound.append((ip, port))
srvs.append(srv)
fno = srv.fileno()
hip = "[{}]".format(ip) if ":" in ip else ip
msg = "listening @ {}:{} f{} p{}".format(hip, port, fno, os.getpid())
if tcp:
hip = "[{}]".format(ip) if ":" in ip else ip
msg = "listening @ {}:{} f{} p{}".format(hip, port, fno, os.getpid())
else:
msg = "listening @ {} f{} p{}".format(ip, fno, os.getpid())
self.log("tcpsrv", msg)
if self.args.q:
print(msg)
@@ -348,6 +419,8 @@ class TcpSrv(object):
def detect_interfaces(self, listen_ips: list[str]) -> dict[str, Netdev]:
from .stolen.ifaddr import get_adapters
listen_ips = [x for x in listen_ips if "unix:" not in x]
nics = get_adapters(True)
eps: dict[str, Netdev] = {}
for nic in nics:

View File

@@ -36,7 +36,7 @@ from partftpy.TftpShared import TftpException
from .__init__ import EXE, PY2, TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .util import BytesIO, Daemon, ODict, exclude_dotfiles, min_ex, runhook, undot
from .util import UTC, BytesIO, Daemon, ODict, exclude_dotfiles, min_ex, runhook, undot
if True: # pylint: disable=using-constant-test
from typing import Any, Union
@@ -44,6 +44,9 @@ if True: # pylint: disable=using-constant-test
if TYPE_CHECKING:
from .svchub import SvcHub
if PY2:
range = xrange # type: ignore
lg = logging.getLogger("tftp")
debug, info, warning, error = (lg.debug, lg.info, lg.warning, lg.error)
@@ -163,9 +166,16 @@ class Tftpd(object):
if "::" in ips:
ips.append("0.0.0.0")
ips = [x for x in ips if "unix:" not in x]
if self.args.tftp4:
ips = [x for x in ips if ":" not in x]
if not ips:
t = "cannot start tftp-server; no compatible IPs in -i"
self.nlog(t, 1)
return
ips = list(ODict.fromkeys(ips)) # dedup
for ip in ips:
@@ -241,6 +251,8 @@ class Tftpd(object):
debug('%s("%s", %s) %s\033[K\033[0m', caller, vpath, str(a), perms)
vfs, rem = self.asrv.vfs.get(vpath, "*", *perms)
if not vfs.realpath:
raise Exception("unmapped vfs")
return vfs, vfs.canonical(rem)
def _ls(self, vpath: str, raddress: str, rport: int, force=False) -> Any:
@@ -262,7 +274,7 @@ class Tftpd(object):
dirs1 = [(v.st_mtime, v.st_size, k + "/") for k, v in vfs_ls if k in dnames]
fils1 = [(v.st_mtime, v.st_size, k) for k, v in vfs_ls if k not in dnames]
real1 = dirs1 + fils1
realt = [(datetime.fromtimestamp(mt), sz, fn) for mt, sz, fn in real1]
realt = [(datetime.fromtimestamp(mt, UTC), sz, fn) for mt, sz, fn in real1]
reals = [
(
"%04d-%02d-%02d %02d:%02d:%02d"
@@ -328,7 +340,21 @@ class Tftpd(object):
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
self.nlog, xbu, ap, vpath, "", "", "", 0, 0, "8.3.8.7", 0, ""
self.nlog,
None,
self.hub.up2k,
"xbu.tftpd",
xbu,
ap,
vpath,
"",
"",
"",
0,
0,
"8.3.8.7",
time.time(),
"",
):
yeet("blocked by xbu server config: " + vpath)
@@ -336,7 +362,7 @@ class Tftpd(object):
return self._ls(vpath, "", 0, True)
if not a:
a = [self.args.iobuf]
a = (self.args.iobuf,)
return open(ap, mode, *a, **ka)
@@ -377,7 +403,7 @@ class Tftpd(object):
bos.stat(ap)
return True
except:
return False
return vpath == "/"
def _p_isdir(self, vpath: str) -> bool:
try:
@@ -385,7 +411,7 @@ class Tftpd(object):
ret = stat.S_ISDIR(st.st_mode)
return ret
except:
return False
return vpath == "/"
def _hook(self, *a: Any, **ka: Any) -> None:
src = inspect.currentframe().f_back.f_code.co_name


@@ -59,7 +59,8 @@ class ThumbCli(object):
want_opus = fmt in ("opus", "caf", "mp3")
is_au = ext in self.fmt_ffa
if is_au:
is_vau = want_opus and ext in self.fmt_ffv
if is_au or is_vau:
if want_opus:
if self.args.no_acode:
return None
@@ -107,7 +108,7 @@ class ThumbCli(object):
fmt = sfmt
elif fmt[:1] == "p" and not is_au:
elif fmt[:1] == "p" and not is_au and not is_vid:
t = "cannot thumbnail [%s]: png only allowed for waveforms"
self.log(t % (rem), 6)
return None


@@ -1,7 +1,6 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import base64
import hashlib
import logging
import os
@@ -12,7 +11,7 @@ import time
from queue import Queue
from .__init__ import ANYWIN, TYPE_CHECKING
from .__init__ import ANYWIN, PY2, TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, au_unpk, ffprobe
@@ -27,6 +26,7 @@ from .util import (
min_ex,
runcmd,
statdir,
ub64enc,
vsplit,
wrename,
wunlink,
@@ -38,6 +38,9 @@ if True: # pylint: disable=using-constant-test
if TYPE_CHECKING:
from .svchub import SvcHub
if PY2:
range = xrange # type: ignore
HAVE_PIL = False
HAVE_PILF = False
HAVE_HEIF = False
@@ -45,22 +48,34 @@ HAVE_AVIF = False
HAVE_WEBP = False
try:
if os.environ.get("PRTY_NO_PIL"):
raise Exception()
from PIL import ExifTags, Image, ImageFont, ImageOps
HAVE_PIL = True
try:
if os.environ.get("PRTY_NO_PILF"):
raise Exception()
ImageFont.load_default(size=16)
HAVE_PILF = True
except:
pass
try:
if os.environ.get("PRTY_NO_PIL_WEBP"):
raise Exception()
Image.new("RGB", (2, 2)).save(BytesIO(), format="webp")
HAVE_WEBP = True
except:
pass
try:
if os.environ.get("PRTY_NO_PIL_HEIF"):
raise Exception()
from pyheif_pillow_opener import register_heif_opener
register_heif_opener()
@@ -69,6 +84,9 @@ try:
pass
try:
if os.environ.get("PRTY_NO_PIL_AVIF"):
raise Exception()
import pillow_avif # noqa: F401 # pylint: disable=unused-import
HAVE_AVIF = True
@@ -80,6 +98,9 @@ except:
pass
try:
if os.environ.get("PRTY_NO_VIPS"):
raise Exception()
HAVE_VIPS = True
import pyvips
@@ -88,6 +109,9 @@ except:
HAVE_VIPS = False
th_dir_cache = {}
def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -> str:
# base16 = 16 = 256
# b64-lc = 38 = 1444
@@ -101,14 +125,20 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
if ext in ffa and fmt[:2] in ("wf", "jf"):
fmt = fmt.replace("f", "")
rd += "\n" + fmt
h = hashlib.sha512(afsenc(rd)).digest()
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
rd = ("%s/%s/" % (b64[:2], b64[2:4])).lower() + b64
dcache = th_dir_cache
rd_key = rd + "\n" + fmt
rd = dcache.get(rd_key)
if not rd:
h = hashlib.sha512(afsenc(rd_key)).digest()
b64 = ub64enc(h).decode("ascii")[:24]
rd = ("%s/%s/" % (b64[:2], b64[2:4])).lower() + b64
if len(dcache) > 9001:
dcache.clear()
dcache[rd_key] = rd
# could keep original filenames but this is safer re pathlen
h = hashlib.sha512(afsenc(fn)).digest()
fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
fn = ub64enc(h).decode("ascii")[:24]
if fmt in ("opus", "caf", "mp3"):
cat = "ac"
@@ -304,23 +334,31 @@ class ThumbSrv(object):
ap_unpk = abspath
if not bos.path.exists(tpath):
want_mp3 = tpath.endswith(".mp3")
want_opus = tpath.endswith(".opus") or tpath.endswith(".caf")
want_png = tpath.endswith(".png")
want_au = want_mp3 or want_opus
for lib in self.args.th_dec:
can_au = lib == "ff" and (
ext in self.fmt_ffa or ext in self.fmt_ffv
)
if lib == "pil" and ext in self.fmt_pil:
funs.append(self.conv_pil)
elif lib == "vips" and ext in self.fmt_vips:
funs.append(self.conv_vips)
elif lib == "ff" and ext in self.fmt_ffi or ext in self.fmt_ffv:
funs.append(self.conv_ffmpeg)
elif lib == "ff" and ext in self.fmt_ffa:
if tpath.endswith(".opus") or tpath.endswith(".caf"):
elif can_au and (want_png or want_au):
if want_opus:
funs.append(self.conv_opus)
elif tpath.endswith(".mp3"):
elif want_mp3:
funs.append(self.conv_mp3)
elif tpath.endswith(".png"):
elif want_png:
funs.append(self.conv_waves)
png_ok = True
else:
funs.append(self.conv_spec)
elif lib == "ff" and (ext in self.fmt_ffi or ext in self.fmt_ffv):
funs.append(self.conv_ffmpeg)
elif lib == "ff" and ext in self.fmt_ffa and not want_au:
funs.append(self.conv_spec)
tdir, tfn = os.path.split(tpath)
ttpath = os.path.join(tdir, "w", tfn)
@@ -450,7 +488,7 @@ class ThumbSrv(object):
if c == crops[-1]:
raise
assert img # type: ignore
assert img # type: ignore # !rm
img.write_to_file(tpath, Q=40)
def conv_ffmpeg(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:


@@ -8,7 +8,7 @@ import threading
import time
from operator import itemgetter
from .__init__ import ANYWIN, TYPE_CHECKING, unicode
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, unicode
from .authsrv import LEELOO_DALLAS, VFS
from .bos import bos
from .up2k import up2k_wark_from_hashlist
@@ -38,6 +38,9 @@ if True: # pylint: disable=using-constant-test
if TYPE_CHECKING:
from .httpsrv import HttpSrv
if PY2:
range = xrange # type: ignore
class U2idx(object):
def __init__(self, hsrv: "HttpSrv") -> None:
@@ -50,12 +53,16 @@ class U2idx(object):
self.log("your python does not have sqlite3; searching will be disabled")
return
assert sqlite3 # type: ignore # !rm
self.active_id = ""
self.active_cur: Optional["sqlite3.Cursor"] = None
self.cur: dict[str, "sqlite3.Cursor"] = {}
self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
self.mem_cur.execute(r"create table a (b text)")
self.sh_cur: Optional["sqlite3.Cursor"] = None
self.p_end = 0.0
self.p_dur = 0.0
@@ -92,17 +99,31 @@ class U2idx(object):
except:
raise Pebkac(500, min_ex())
def get_cur(self, vn: VFS) -> Optional["sqlite3.Cursor"]:
if not HAVE_SQLITE3:
def get_shr(self) -> Optional["sqlite3.Cursor"]:
if self.sh_cur:
return self.sh_cur
if not HAVE_SQLITE3 or not self.args.shr:
return None
assert sqlite3 # type: ignore # !rm
db = sqlite3.connect(self.args.shr_db, timeout=2, check_same_thread=False)
cur = db.cursor()
cur.execute('pragma table_info("sh")').fetchall()
self.sh_cur = cur
return cur
def get_cur(self, vn: VFS) -> Optional["sqlite3.Cursor"]:
cur = self.cur.get(vn.realpath)
if cur:
return cur
if "e2d" not in vn.flags:
if not HAVE_SQLITE3 or "e2d" not in vn.flags:
return None
assert sqlite3 # type: ignore # !rm
ptop = vn.realpath
histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath:
@@ -448,5 +469,5 @@ class U2idx(object):
return
if identifier == self.active_id:
assert self.active_cur
assert self.active_cur # !rm
self.active_cur.connection.interrupt()

File diff suppressed because it is too large


@@ -3,7 +3,8 @@ from __future__ import print_function, unicode_literals
import argparse
import base64
import contextlib
import binascii
import codecs
import errno
import hashlib
import hmac
@@ -26,18 +27,24 @@ import threading
import time
import traceback
from collections import Counter
from email.utils import formatdate
from ipaddress import IPv4Address, IPv4Network, IPv6Address, IPv6Network
from queue import Queue
from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__init__ import (
ANYWIN,
EXE,
MACOS,
PY2,
PY36,
TYPE_CHECKING,
VT100,
WINDOWS,
EnvParams,
)
from .__version__ import S_BUILD_DT, S_VERSION
from .stolen import surrogateescape
ub64dec = base64.urlsafe_b64decode
ub64enc = base64.urlsafe_b64encode
try:
from datetime import datetime, timezone
@@ -60,8 +67,12 @@ except:
UTC = _UTC()
if PY2:
range = xrange # type: ignore
if sys.version_info >= (3, 7) or (
sys.version_info >= (3, 6) and platform.python_implementation() == "CPython"
PY36 and platform.python_implementation() == "CPython"
):
ODict = dict
else:
@@ -99,6 +110,9 @@ except:
pass
try:
if os.environ.get("PRTY_NO_SQLITE"):
raise Exception()
HAVE_SQLITE3 = True
import sqlite3
@@ -107,6 +121,9 @@ except:
HAVE_SQLITE3 = False
try:
if os.environ.get("PRTY_NO_PSUTIL"):
raise Exception()
HAVE_PSUTIL = True
import psutil
except:
@@ -117,7 +134,7 @@ if True: # pylint: disable=using-constant-test
from collections.abc import Callable, Iterable
import typing
from typing import Any, Generator, Optional, Pattern, Protocol, Union
from typing import IO, Any, Generator, Optional, Pattern, Protocol, Union
try:
from typing import LiteralString
@@ -137,10 +154,15 @@ if TYPE_CHECKING:
import magic
from .authsrv import VFS
from .broker_util import BrokerCli
from .up2k import Up2k
FAKE_MP = False
try:
if os.environ.get("PRTY_NO_MP"):
raise ImportError()
import multiprocessing as mp
# import multiprocessing.dummy as mp
@@ -150,15 +172,14 @@ except ImportError:
if not PY2:
from io import BytesIO
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote
else:
from StringIO import StringIO as BytesIO # type: ignore
from urllib import quote # type: ignore # pylint: disable=no-name-in-module
from urllib import unquote # type: ignore # pylint: disable=no-name-in-module
try:
if os.environ.get("PRTY_NO_IPV6"):
raise Exception()
socket.inet_pton(socket.AF_INET6, "::1")
HAVE_IPV6 = True
except:
@@ -199,7 +220,7 @@ else:
FS_ENCODING = sys.getfilesystemencoding()
SYMTIME = sys.version_info > (3, 6) and os.utime in os.supports_follow_symlinks
SYMTIME = PY36 and os.utime in os.supports_follow_symlinks
META_NOBOTS = '<meta name="robots" content="noindex, nofollow">\n'
@@ -243,6 +264,8 @@ IMPLICATIONS = [
["e2vu", "e2v"],
["e2vp", "e2v"],
["e2v", "e2d"],
["hardlink_only", "hardlink"],
["hardlink", "dedup"],
["tftpvv", "tftpv"],
["smbw", "smb"],
["smb1", "smb"],
@@ -267,6 +290,30 @@ if ANYWIN:
UNPLICATIONS = [["no_dav", "daw"]]
DAV_ALLPROP_L = [
"contentclass",
"creationdate",
"defaultdocument",
"displayname",
"getcontentlanguage",
"getcontentlength",
"getcontenttype",
"getlastmodified",
"href",
"iscollection",
"ishidden",
"isreadonly",
"isroot",
"isstructureddocument",
"lastaccessed",
"name",
"parentname",
"resourcetype",
"supportedlock",
]
DAV_ALLPROPS = set(DAV_ALLPROP_L)
MIMES = {
"opus": "audio/ogg; codecs=opus",
}
@@ -319,7 +366,7 @@ MAGIC_MAP = {"jpeg": "jpg"}
DEF_EXP = "self.ip self.ua self.uname self.host cfg.name cfg.logout vf.scan vf.thsize hdr.cf_ipcountry srv.itime srv.htime"
DEF_MTE = "circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash"
DEF_MTE = ".files,circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash"
DEF_MTH = ".vq,.aq,vc,ac,fmt,res,.fps"
@@ -421,7 +468,7 @@ def py_desc() -> str:
def _sqlite_ver() -> str:
assert sqlite3 # type: ignore
assert sqlite3 # type: ignore # !rm
try:
co = sqlite3.connect(":memory:")
cur = co.cursor()
@@ -469,17 +516,36 @@ VERSIONS = (
)
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
__all__ = [
"mp",
"BytesIO",
"quote",
"unquote",
"SQLITE_VER",
"JINJA_VER",
"PYFTPD_VER",
"PARTFTPY_VER",
]
try:
_b64_enc_tl = bytes.maketrans(b"+/", b"-_")
_b64_dec_tl = bytes.maketrans(b"-_", b"+/")
def ub64enc(bs: bytes) -> bytes:
x = binascii.b2a_base64(bs, newline=False)
return x.translate(_b64_enc_tl)
def ub64dec(bs: bytes) -> bytes:
bs = bs.translate(_b64_dec_tl)
return binascii.a2b_base64(bs)
def b64enc(bs: bytes) -> bytes:
return binascii.b2a_base64(bs, newline=False)
def b64dec(bs: bytes) -> bytes:
return binascii.a2b_base64(bs)
zb = b">>>????"
zb2 = base64.urlsafe_b64encode(zb)
if zb2 != ub64enc(zb) or zb != ub64dec(zb2):
raise Exception("bad smoke")
except Exception as ex:
ub64enc = base64.urlsafe_b64encode # type: ignore
ub64dec = base64.urlsafe_b64decode # type: ignore
b64enc = base64.b64encode # type: ignore
b64dec = base64.b64decode # type: ignore
if not PY36:
print("using fallback base64 codec due to %r" % (ex,))
class Daemon(threading.Thread):
@@ -719,6 +785,7 @@ class _Unrecv(object):
self.buf = buf + self.buf
# !rm.yes>
class _LUnrecv(object):
"""
with expensive debug logging
@@ -775,6 +842,9 @@ class _LUnrecv(object):
print(t.format(buf, self.buf))
# !rm.no>
Unrecv = _Unrecv
@@ -794,7 +864,7 @@ class CachedSet(object):
c = self.c = {k: v for k, v in self.c.items() if now - v < self.maxage}
try:
self.oldest = c[min(c, key=c.get)]
self.oldest = c[min(c, key=c.get)] # type: ignore
except:
self.oldest = now
@@ -898,7 +968,6 @@ class ProgressPrinter(threading.Thread):
self.msg = ""
self.end = False
self.n = -1
self.start()
def run(self) -> None:
sigblock()
@@ -1012,7 +1081,7 @@ class MTHash(object):
if self.stop:
return nch, "", ofs0, chunk_sz
assert f
assert f # !rm
hashobj = hashlib.sha512()
while chunk_rem > 0:
with self.imutex:
@@ -1027,7 +1096,7 @@ class MTHash(object):
ofs += len(buf)
bdig = hashobj.digest()[:33]
udig = base64.urlsafe_b64encode(bdig).decode("utf-8")
udig = ub64enc(bdig).decode("ascii")
return nch, udig, ofs0, chunk_sz
@@ -1053,7 +1122,7 @@ class HMaccas(object):
self.cache = {}
zb = hmac.new(self.key, msg, hashlib.sha512).digest()
zs = base64.urlsafe_b64encode(zb)[: self.retlen].decode("utf-8")
zs = ub64enc(zb)[: self.retlen].decode("ascii")
self.cache[msg] = zs
return zs
@@ -1377,25 +1446,20 @@ def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
et, ev, tb = sys.exc_info()
stb = traceback.extract_tb(tb) if tb else traceback.extract_stack()[:-1]
fmt = "%s @ %d <%s>: %s"
fmt = "%s:%d <%s>: %s"
ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
if et or ev or tb:
ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev))
return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])
@contextlib.contextmanager
def ren_open(
fname: str, *args: Any, **kwargs: Any
) -> Generator[dict[str, tuple[typing.IO[Any], str]], None, None]:
def ren_open(fname: str, *args: Any, **kwargs: Any) -> tuple[typing.IO[Any], str]:
fun = kwargs.pop("fun", open)
fdir = kwargs.pop("fdir", None)
suffix = kwargs.pop("suffix", None)
if fname == os.devnull:
with fun(fname, *args, **kwargs) as f:
yield {"orz": (f, fname)}
return
return fun(fname, *args, **kwargs), fname
if suffix:
ext = fname.split(".")[-1]
@@ -1417,6 +1481,7 @@ def ren_open(
asciified = False
b64 = ""
while True:
f = None
try:
if fdir:
fpath = os.path.join(fdir, fname)
@@ -1428,19 +1493,20 @@ def ren_open(
fname += suffix
ext += suffix
with fun(fsenc(fpath), *args, **kwargs) as f:
if b64:
assert fdir
fp2 = "fn-trunc.%s.txt" % (b64,)
fp2 = os.path.join(fdir, fp2)
with open(fsenc(fp2), "wb") as f2:
f2.write(orig_name.encode("utf-8"))
f = fun(fsenc(fpath), *args, **kwargs)
if b64:
assert fdir # !rm
fp2 = "fn-trunc.%s.txt" % (b64,)
fp2 = os.path.join(fdir, fp2)
with open(fsenc(fp2), "wb") as f2:
f2.write(orig_name.encode("utf-8"))
yield {"orz": (f, fname)}
return
return f, fname
except OSError as ex_:
ex = ex_
if f:
f.close()
# EPERM: android13
if ex.errno in (errno.EINVAL, errno.EPERM) and not asciified:
@@ -1461,8 +1527,7 @@ def ren_open(
if not b64:
zs = ("%s\n%s" % (orig_name, suffix)).encode("utf-8", "replace")
zs = hashlib.sha512(zs).digest()[:12]
b64 = base64.urlsafe_b64encode(zs).decode("utf-8")
b64 = ub64enc(hashlib.sha512(zs).digest()[:12]).decode("ascii")
badlen = len(fname)
while len(fname) >= badlen:
@@ -1695,7 +1760,7 @@ class MultipartParser(object):
returns the value of the next field in the multipart body,
raises if the field name is not as expected
"""
assert self.gen
assert self.gen # !rm
p_field, p_fname, p_data = next(self.gen)
if p_field != field_name:
raise WrongPostKey(field_name, p_field, p_fname, p_data)
@@ -1704,7 +1769,7 @@ class MultipartParser(object):
def drop(self) -> None:
"""discards the remaining multipart body"""
assert self.gen
assert self.gen # !rm
for _, _, data in self.gen:
for _ in data:
pass
@@ -1743,7 +1808,7 @@ def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:
ofs = ret.find(b"\r\n\r\n")
if ofs < 0:
if len(ret) > 1024 * 64:
if len(ret) > 1024 * 32:
raise Pebkac(400, "header 2big")
else:
continue
@@ -1768,9 +1833,8 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
nc = rnd + extra
nb = (6 + 6 * nc) // 8
zb = os.urandom(nb)
zb = base64.urlsafe_b64encode(zb)
fn = zb[:nc].decode("utf-8") + ext
zb = ub64enc(os.urandom(nb))
fn = zb[:nc].decode("ascii") + ext
ok = not os.path.exists(fsenc(os.path.join(fdir, fn)))
return fn
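As an aside, rand_name above sizes its urandom read from the requested number of base64 characters: each output character carries 6 bits, so nb = (6 + 6*nc) // 8 bytes always yields at least nc characters before any padding. A minimal standalone version of that trick:

import base64
import os

def rand_b64(nc: int) -> str:
    nb = (6 + 6 * nc) // 8  # enough random bytes for nc base64 chars
    return base64.urlsafe_b64encode(os.urandom(nb)).decode("ascii")[:nc]

assert len(rand_b64(8)) == 8
assert "=" not in rand_b64(30)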
@@ -1783,7 +1847,7 @@ def gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str
zs = "%s %s" % (salt, fspath)
zb = zs.encode("utf-8", "replace")
return base64.urlsafe_b64encode(hashlib.sha512(zb).digest()).decode("ascii")
return ub64enc(hashlib.sha512(zb).digest()).decode("ascii")
def gen_filekey_dbg(
@@ -1797,7 +1861,7 @@ def gen_filekey_dbg(
) -> str:
ret = gen_filekey(alg, salt, fspath, fsize, inode)
assert log_ptn
assert log_ptn # !rm
if log_ptn.search(fspath):
try:
import inspect
@@ -1821,10 +1885,21 @@ def gen_filekey_dbg(
return ret
WKDAYS = "Mon Tue Wed Thu Fri Sat Sun".split()
MONTHS = "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec".split()
RFC2822 = "%s, %02d %s %04d %02d:%02d:%02d GMT"
def formatdate(ts: Optional[float] = None) -> str:
# gmtime ~= datetime.fromtimestamp(ts, UTC).timetuple()
y, mo, d, h, mi, s, wd, _, _ = time.gmtime(ts)
return RFC2822 % (WKDAYS[wd], d, MONTHS[mo - 1], y, h, mi, s)
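As an aside, the hand-rolled formatdate above produces the same GMT timestamp format as email.utils.formatdate(usegmt=True), which is the import the diff drops; a standalone check of that equivalence:

import time
from email.utils import formatdate as std_formatdate

WKDAYS = "Mon Tue Wed Thu Fri Sat Sun".split()
MONTHS = "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec".split()
RFC2822 = "%s, %02d %s %04d %02d:%02d:%02d GMT"

def formatdate_sketch(ts: float) -> str:
    y, mo, d, h, mi, s, wd, _, _ = time.gmtime(ts)
    return RFC2822 % (WKDAYS[wd], d, MONTHS[mo - 1], y, h, mi, s)

now = time.time()
assert formatdate_sketch(now) == std_formatdate(now, usegmt=True)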
def gencookie(k: str, v: str, r: str, tls: bool, dur: int = 0, txt: str = "") -> str:
v = v.replace("%", "%25").replace(";", "%3B")
if dur:
exp = formatdate(time.time() + dur, usegmt=True)
exp = formatdate(time.time() + dur)
else:
exp = "Fri, 15 Aug 1997 01:00:00 GMT"
@@ -1839,12 +1914,10 @@ def humansize(sz: float, terse: bool = False) -> str:
sz /= 1024.0
ret = " ".join([str(sz)[:4].rstrip("."), unit])
if not terse:
return ret
return ret.replace("iB", "").replace(" ", "")
if terse:
return "%s%s" % (str(sz)[:4].rstrip("."), unit[:1])
else:
return "%s %s" % (str(sz)[:4].rstrip("."), unit)
def unhumanize(sz: str) -> int:
@@ -1896,7 +1969,7 @@ def uncyg(path: str) -> str:
def undot(path: str) -> str:
ret: list[str] = []
for node in path.split("/"):
if node in ["", "."]:
if node == "." or not node:
continue
if node == "..":
@@ -1909,13 +1982,10 @@ def undot(path: str) -> str:
return "/".join(ret)
def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
def sanitize_fn(fn: str, ok: str) -> str:
if "/" not in ok:
fn = fn.replace("\\", "/").split("/")[-1]
if fn.lower() in bad:
fn = "_" + fn
if ANYWIN:
remap = [
["<", ""],
@@ -1941,9 +2011,9 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
return fn.strip()
def sanitize_vpath(vp: str, ok: str, bad: list[str]) -> str:
def sanitize_vpath(vp: str, ok: str) -> str:
parts = vp.replace(os.sep, "/").split("/")
ret = [sanitize_fn(x, ok, bad) for x in parts]
ret = [sanitize_fn(x, ok) for x in parts]
return "/".join(ret)
@@ -2047,25 +2117,70 @@ def html_bescape(s: bytes, quot: bool = False, crlf: bool = False) -> bytes:
def _quotep2(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
quot = quote(btxt, safe=b"/")
return w8dec(quot.replace(b" ", b"+"))
return w8dec(quot.replace(b" ", b"+")) # type: ignore
def _quotep3(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
quot = quote(btxt, safe=b"/").encode("utf-8")
return w8dec(quot.replace(b" ", b"+"))
quotep = _quotep3 if not PY2 else _quotep2
if not PY2:
_uqsb = b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_.-~/"
_uqtl = {
n: ("%%%02X" % (n,) if n not in _uqsb else chr(n)).encode("utf-8")
for n in range(256)
}
_uqtl[b" "] = b"+"
def _quotep3b(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
if btxt.rstrip(_uqsb):
lut = _uqtl
btxt = b"".join([lut[ch] for ch in btxt])
return w8dec(btxt)
quotep = _quotep3b
_hexd = "0123456789ABCDEFabcdef"
_hex2b = {(a + b).encode(): bytes.fromhex(a + b) for a in _hexd for b in _hexd}
def unquote(btxt: bytes) -> bytes:
h2b = _hex2b
parts = iter(btxt.split(b"%"))
ret = [next(parts)]
for item in parts:
c = h2b.get(item[:2])
if c is None:
ret.append(b"%")
ret.append(item)
else:
ret.append(c)
ret.append(item[2:])
return b"".join(ret)
from urllib.parse import quote_from_bytes as quote
else:
from urllib import quote # type: ignore # pylint: disable=no-name-in-module
from urllib import unquote # type: ignore # pylint: disable=no-name-in-module
quotep = _quotep2
def unquotep(txt: str) -> str:
"""url unquoter which deals with bytes correctly"""
btxt = w8enc(txt)
# btxt = btxt.replace(b"+", b" ")
unq2 = unquote(btxt)
return w8dec(unq2)
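As an aside, the table-driven unquote above splits on % and resolves the next two characters through a precomputed hex-pair lookup table, while malformed escapes pass through untouched. A standalone restatement with a couple of checks:

_hexd = "0123456789ABCDEFabcdef"
_hex2b = {(a + b).encode(): bytes.fromhex(a + b) for a in _hexd for b in _hexd}

def unquote_sketch(btxt: bytes) -> bytes:
    parts = iter(btxt.split(b"%"))
    ret = [next(parts)]
    for item in parts:
        c = _hex2b.get(item[:2])
        if c is None:
            ret.append(b"%" + item)  # not a valid escape; keep verbatim
        else:
            ret.append(c + item[2:])
    return b"".join(ret)

assert unquote_sketch(b"a%20b%2Fc") == b"a b/c"
assert unquote_sketch(b"100%x") == b"100%x"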
@@ -2093,6 +2208,72 @@ def ujoin(rd: str, fn: str) -> str:
return rd or fn
def log_reloc(
log: "NamedLogger",
re: dict[str, str],
pm: tuple[str, str, str, tuple["VFS", str]],
ap: str,
vp: str,
fn: str,
vn: "VFS",
rem: str,
) -> None:
nap, nvp, nfn, (nvn, nrem) = pm
t = "reloc %s:\nold ap [%s]\nnew ap [%s\033[36m/%s\033[0m]\nold vp [%s]\nnew vp [%s\033[36m/%s\033[0m]\nold fn [%s]\nnew fn [%s]\nold vfs [%s]\nnew vfs [%s]\nold rem [%s]\nnew rem [%s]"
log(t % (re, ap, nap, nfn, vp, nvp, nfn, fn, nfn, vn.vpath, nvn.vpath, rem, nrem))
def pathmod(
vfs: "VFS", ap: str, vp: str, mod: dict[str, str]
) -> Optional[tuple[str, str, str, tuple["VFS", str]]]:
# vfs: authsrv.vfs
# ap: original abspath to a file
# vp: original urlpath to a file
# mod: modification (ap/vp/fn)
nvp = "\n" # new vpath
ap = os.path.dirname(ap)
vp, fn = vsplit(vp)
if mod.get("fn"):
fn = mod["fn"]
nvp = vp
for ref, k in ((ap, "ap"), (vp, "vp")):
if k not in mod:
continue
ms = mod[k].replace(os.sep, "/")
if ms.startswith("/"):
np = ms
elif k == "vp":
np = undot(vjoin(ref, ms))
else:
np = os.path.abspath(os.path.join(ref, ms))
if k == "vp":
nvp = np.lstrip("/")
continue
# try to map abspath to vpath
np = np.replace("/", os.sep)
for vn_ap, vn in vfs.all_aps:
if not np.startswith(vn_ap):
continue
zs = np[len(vn_ap) :].replace(os.sep, "/")
nvp = vjoin(vn.vpath, zs)
break
if nvp == "\n":
return None
vn, rem = vfs.get(nvp, "*", False, False)
if not vn.realpath:
raise Exception("unmapped vfs")
ap = vn.canonical(rem)
return ap, nvp, fn, (vn, rem)
def _w8dec2(txt: bytes) -> str:
"""decodes filesystem-bytes to wtf8"""
return surrogateescape.decodefilename(txt)
@@ -2145,12 +2326,12 @@ w8enc = _w8enc3 if not PY2 else _w8enc2
def w8b64dec(txt: str) -> str:
"""decodes base64(filesystem-bytes) to wtf8"""
return w8dec(base64.urlsafe_b64decode(txt.encode("ascii")))
return w8dec(ub64dec(txt.encode("ascii")))
def w8b64enc(txt: str) -> str:
"""encodes wtf8 to base64(filesystem-bytes)"""
return base64.urlsafe_b64encode(w8enc(txt)).decode("ascii")
return ub64enc(w8enc(txt)).decode("ascii")
if not PY2 and WINDOWS:
@@ -2308,7 +2489,7 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]:
try:
# some fuses misbehave
assert ctypes
assert ctypes # type: ignore # !rm
if ANYWIN:
bfree = ctypes.c_ulonglong(0)
ctypes.windll.kernel32.GetDiskFreeSpaceExW( # type: ignore
@@ -2526,8 +2707,7 @@ def hashcopy(
if slp:
time.sleep(slp)
digest = hashobj.digest()[:33]
digest_b64 = base64.urlsafe_b64encode(digest).decode("utf-8")
digest_b64 = ub64enc(hashobj.digest()[:33]).decode("ascii")
return tlen, hashobj.hexdigest(), digest_b64
@@ -2709,30 +2889,30 @@ def rmdirs_up(top: str, stop: str) -> tuple[list[str], list[str]]:
def unescape_cookie(orig: str) -> str:
# mw=idk; doot=qwe%2Crty%3Basd+fgh%2Bjkl%25zxc%26vbn # qwe,rty;asd fgh+jkl%zxc&vbn
ret = ""
ret = []
esc = ""
for ch in orig:
if ch == "%":
if len(esc) > 0:
ret += esc
if esc:
ret.append(esc)
esc = ch
elif len(esc) > 0:
elif esc:
esc += ch
if len(esc) == 3:
try:
ret += chr(int(esc[1:], 16))
ret.append(chr(int(esc[1:], 16)))
except:
ret += esc
ret.append(esc)
esc = ""
else:
ret += ch
ret.append(ch)
if len(esc) > 0:
ret += esc
if esc:
ret.append(esc)
return ret
return "".join(ret)
def guess_mime(url: str, fallback: str = "application/octet-stream") -> str:
@@ -2767,7 +2947,7 @@ def getalive(pids: list[int], pgid: int) -> list[int]:
alive.append(pid)
else:
# windows doesn't have pgroups; assume
assert psutil
assert psutil # type: ignore # !rm
psutil.Process(pid)
alive.append(pid)
except:
@@ -2785,7 +2965,7 @@ def killtree(root: int) -> None:
pgid = 0
if HAVE_PSUTIL:
assert psutil
assert psutil # type: ignore # !rm
pids = [root]
parent = psutil.Process(root)
for child in parent.children(recursive=True):
@@ -3106,6 +3286,7 @@ def runihook(
def _runhook(
log: Optional["NamedLogger"],
src: str,
cmd: str,
ap: str,
vp: str,
@@ -3117,14 +3298,16 @@ def _runhook(
ip: str,
at: float,
txt: str,
) -> bool:
) -> dict[str, Any]:
ret = {"rc": 0}
areq, chk, fork, jtxt, wait, sp_ka, acmd = _parsehook(log, cmd)
if areq:
for ch in areq:
if ch not in perms:
t = "user %s not allowed to run hook %s; need perms %s, have %s"
log(t % (uname, cmd, areq, perms))
return True # fallthrough to next hook
if log:
log(t % (uname, cmd, areq, perms))
return ret # fallthrough to next hook
if jtxt:
ja = {
"ap": ap,
@@ -3136,6 +3319,7 @@ def _runhook(
"host": host,
"user": uname,
"perms": perms,
"src": src,
"txt": txt,
}
arg = json.dumps(ja)
@@ -3154,18 +3338,34 @@ def _runhook(
else:
rc, v, err = runcmd(bcmd, **sp_ka) # type: ignore
if chk and rc:
ret["rc"] = rc
retchk(rc, bcmd, err, log, 5)
return False
else:
try:
ret = json.loads(v)
except:
ret = {}
try:
if "stdout" not in ret:
ret["stdout"] = v
if "rc" not in ret:
ret["rc"] = rc
except:
ret = {"rc": rc, "stdout": v}
wait -= time.time() - t0
if wait > 0:
time.sleep(wait)
return True
return ret
def runhook(
log: Optional["NamedLogger"],
broker: Optional["BrokerCli"],
up2k: Optional["Up2k"],
src: str,
cmds: list[str],
ap: str,
vp: str,
@@ -3177,19 +3377,42 @@ def runhook(
ip: str,
at: float,
txt: str,
) -> bool:
) -> dict[str, Any]:
assert broker or up2k # !rm
asrv = (broker or up2k).asrv
args = (broker or up2k).args
vp = vp.replace("\\", "/")
ret = {"rc": 0}
for cmd in cmds:
try:
if not _runhook(log, cmd, ap, vp, host, uname, perms, mt, sz, ip, at, txt):
return False
hr = _runhook(
log, src, cmd, ap, vp, host, uname, perms, mt, sz, ip, at, txt
)
if log and args.hook_v:
log("hook(%s) [%s] => \033[32m%s" % (src, cmd, hr), 6)
if not hr:
return {}
for k, v in hr.items():
if k in ("idx", "del") and v:
if broker:
broker.say("up2k.hook_fx", k, v, vp)
else:
up2k.fx_backlog.append((k, v, vp))
elif k == "reloc" and v:
# idk, just take the last one ig
ret["reloc"] = v
elif k in ret:
if k == "rc" and v:
ret[k] = v
else:
ret[k] = v
except Exception as ex:
(log or print)("hook: {}".format(ex))
if ",c," in "," + cmd:
return False
return {}
break
return True
return ret
def loadpy(ap: str, hot: bool) -> Any:
@@ -3220,9 +3443,15 @@ def loadpy(ap: str, hot: bool) -> Any:
def gzip_orig_sz(fn: str) -> int:
with open(fsenc(fn), "rb") as f:
f.seek(-4, 2)
rv = f.read(4)
return sunpack(b"I", rv)[0] # type: ignore
return gzip_file_orig_sz(f)
def gzip_file_orig_sz(f) -> int:
start = f.tell()
f.seek(-4, 2)
rv = f.read(4)
f.seek(start, 0)
return sunpack(b"I", rv)[0] # type: ignore
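As an aside, gzip_file_orig_sz above reads the ISIZE trailer: the last four bytes of a gzip stream hold the uncompressed length modulo 2**32 as a little-endian integer, and restoring the original file offset afterwards makes it safe to call on an already-open handle. A self-contained check of that trick:

import gzip
import io
import struct

payload = b"x" * 70000
gz = io.BytesIO()
with gzip.GzipFile(fileobj=gz, mode="wb") as f:
    f.write(payload)

buf = gz.getvalue()
isize = struct.unpack("<I", buf[-4:])[0]  # ISIZE: uncompressed size mod 2**32
assert isize == len(payload)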
def align_tab(lines: list[str]) -> list[str]:
@@ -3349,7 +3578,7 @@ def termsize() -> tuple[int, int]:
def hidedir(dp) -> None:
if ANYWIN:
try:
assert ctypes
assert ctypes # type: ignore # !rm
k32 = ctypes.WinDLL("kernel32")
attrs = k32.GetFileAttributesW(dp)
if attrs >= 0:
@@ -3358,6 +3587,104 @@ def hidedir(dp) -> None:
pass
try:
if sys.version_info < (3, 10):
# py3.8 doesn't have .files
# py3.9 has broken .is_file
raise ImportError()
import importlib.resources as impresources
except ImportError:
try:
import importlib_resources as impresources
except ImportError:
impresources = None
try:
if sys.version_info > (3, 10):
raise ImportError()
import pkg_resources
except ImportError:
pkg_resources = None
def _pkg_resource_exists(pkg: str, name: str) -> bool:
if not pkg_resources:
return False
try:
return pkg_resources.resource_exists(pkg, name)
except NotImplementedError:
return False
def stat_resource(E: EnvParams, name: str):
path = os.path.join(E.mod, name)
if os.path.exists(path):
return os.stat(fsenc(path))
return None
def _find_impresource(E: EnvParams, name: str):
assert impresources # !rm
try:
files = impresources.files(E.pkg)
except ImportError:
return None
return files.joinpath(name)
_rescache_has = {}
def _has_resource(E: EnvParams, name: str):
try:
return _rescache_has[name]
except:
pass
if len(_rescache_has) > 999:
_rescache_has.clear()
if impresources:
res = _find_impresource(E, name)
if res and res.is_file():
_rescache_has[name] = True
return True
if pkg_resources:
if _pkg_resource_exists(E.pkg.__name__, name):
_rescache_has[name] = True
return True
_rescache_has[name] = False
return False
def has_resource(E: EnvParams, name: str):
return _has_resource(E, name) or os.path.exists(os.path.join(E.mod, name))
def load_resource(E: EnvParams, name: str, mode="rb") -> IO[bytes]:
enc = None if "b" in mode else "utf-8"
if impresources:
res = _find_impresource(E, name)
if res and res.is_file():
if enc:
return res.open(mode, encoding=enc)
else:
# throws if encoding= is mentioned at all
return res.open(mode)
if pkg_resources:
if _pkg_resource_exists(E.pkg.__name__, name):
stream = pkg_resources.resource_stream(E.pkg.__name__, name)
if enc:
stream = codecs.getreader(enc)(stream)
return stream
return open(os.path.join(E.mod, name), mode, encoding=enc)
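As an aside, the loader above prefers importlib.resources (or the importlib_resources backport), then pkg_resources, then a plain file next to the module, so bundled web assets resolve whether the server runs from a source checkout, a wheel, or a zip. A hedged sketch of the same fallback chain for a generic package; mypkg and logo.png are placeholders, and the broad excepts stand in for the version checks in the diff:

import os
from typing import IO

def open_asset(pkg, mod_dir: str, name: str) -> IO[bytes]:
    try:
        from importlib import resources as impresources
        res = impresources.files(pkg).joinpath(name)
        if res.is_file():
            return res.open("rb")
    except Exception:
        pass  # py3.8/3.9 quirks or no such resource; fall through
    try:
        import pkg_resources
        if pkg_resources.resource_exists(pkg.__name__, name):
            return pkg_resources.resource_stream(pkg.__name__, name)
    except Exception:
        pass
    return open(os.path.join(mod_dir, name), "rb")  # plain filesystem fallback

# usage sketch: open_asset(mypkg, os.path.dirname(mypkg.__file__), "logo.png")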
class Pebkac(Exception):
def __init__(
self, code: int, msg: Optional[str] = None, log: Optional[str] = None
@@ -3385,3 +3712,16 @@ class WrongPostKey(Pebkac):
self.got = got
self.fname = fname
self.datagen = datagen
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
__all__ = [
"mp",
"BytesIO",
"quote",
"unquote",
"SQLITE_VER",
"JINJA_VER",
"PYFTPD_VER",
"PARTFTPY_VER",
]


@@ -29,6 +29,7 @@ window.baguetteBox = (function () {
isOverlayVisible = false,
touch = {}, // start-pos
touchFlag = false, // busy
scrollCSS = ['', ''],
scrollTimer = 0,
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
@@ -567,6 +568,12 @@ window.baguetteBox = (function () {
function showOverlay(chosenImageIndex) {
if (options.noScrollbars) {
var a = document.documentElement.style.overflowY,
b = document.body.style.overflowY;
if (a != 'hidden' || b != 'scroll')
scrollCSS = [a, b];
document.documentElement.style.overflowY = 'hidden';
document.body.style.overflowY = 'scroll';
}
@@ -615,8 +622,8 @@ window.baguetteBox = (function () {
playvid(false);
removeFromCache('#files');
if (options.noScrollbars) {
document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto';
document.documentElement.style.overflowY = scrollCSS[0];
document.body.style.overflowY = scrollCSS[1];
}
try {
@@ -1060,7 +1067,7 @@ window.baguetteBox = (function () {
return showNextImage();
show_buttons('t');
if (Date.now() - ctime <= 500 && !IPHONE)
tglfull();


@@ -10,7 +10,6 @@
--fg2-max: #fff;
--fg-weak: #bbb;
--bg-u7: #555;
--bg-u6: #4c4c4c;
--bg-u5: #444;
--bg-u4: #383838;
@@ -43,8 +42,14 @@
--btn-h-bg: #805;
--btn-1-fg: #400;
--btn-1-bg: var(--a);
--btn-h-bs: var(--btn-bs);
--btn-h-bb: var(--btn-bb);
--btn-1-bs: var(--btn-bs);
--btn-1-bb: var(--btn-bb);
--btn-1h-fg: var(--btn-1-fg);
--btn-1h-bg: #fe8;
--btn-1h-bs: var(--btn-1-bs);
--btn-1h-bb: var(--btn-1-bb);
--chk-fg: var(--tab-alt);
--txt-sh: var(--bg-d2);
--txt-bg: var(--btn-bg);
@@ -59,7 +64,7 @@
--u2-tab-bg: linear-gradient(to bottom, var(--bg), var(--bg-u1));
--u2-tab-b1: rgba(128,128,128,0.8);
--u2-tab-1-fg: #fd7;
--u2-tab-1-bg: linear-gradient(to bottom, var(#353), var(--bg) 80%);
--u2-tab-1-bg: linear-gradient(to bottom, #353, var(--bg) 80%);
--u2-tab-1-b1: #7c5;
--u2-tab-1-b2: #583;
--u2-tab-1-sh: #280;
@@ -212,22 +217,19 @@ html.y {
html.a {
--op-aa-sh: 0 0 .2em var(--bg-d3) inset;
--u2-o-bg: #603;
--u2-o-b1: #a16;
--u2-o-sh: #a00;
--u2-o-h-bg: var(--u2-o-bg);
--u2-o-h-b1: #fb0;
--u2-o-h-sh: #fb0;
--u2-o-1-bg: #6a1;
--u2-o-1-b1: #efa;
--u2-o-1-sh: #0c0;
--u2-o-1h-bg: var(--u2-o-1-bg);
--btn-bs: 0 0 .2em var(--bg-d3);
}
html.az {
--btn-1-bs: 0 0 .1em var(--fg) inset;
}
html.ay {
--op-aa-sh: 0 .1em .2em #ccc;
--op-aa-bg: var(--bg-max);
}
html.b {
--btn-bs: 0 .05em 0 var(--bg-d3) inset;
--btn-1-bs: 0 .05em 0 var(--btn-1h-bg) inset;
--tree-bg: var(--bg);
--g-bg: var(--bg);
@@ -244,17 +246,13 @@ html.b {
--u2-b1-bg: rgba(128,128,128,0.15);
--u2-b2-bg: var(--u2-b1-bg);
--u2-o-bg: var(--btn-bg);
--u2-o-h-bg: var(--btn-h-bg);
--u2-o-1-bg: var(--a);
--u2-o-1h-bg: var(--a-hil);
--f-sh1: 0.1;
--mp-b-bg: transparent;
}
html.bz {
--fg: #cce;
--fg-weak: #bbd;
--bg-u5: #3b3f58;
--bg-u4: #1e2130;
--bg-u3: #1e2130;
@@ -266,12 +264,14 @@ html.bz {
--row-alt: #181a27;
--a-b: #fb4;
--btn-bg: #202231;
--btn-h-bg: #2d2f45;
--btn-1-bg: #ba2959;
--btn-1-is: #f59;
--btn-1-fg: #fff;
--btn-1-bg: #eb6;
--btn-1-fg: #000;
--btn-1h-fg: #000;
--btn-1h-bg: #ff9;
--txt-sh: a;
--u2-tab-b1: var(--bg-u5);
@@ -306,6 +306,7 @@ html.by {
}
html.c {
font-weight: bold;
--fg: #fff;
--fg-weak: #cef;
--bg-u5: #409;
@@ -326,17 +327,25 @@ html.c {
--chk-fg: #d90;
--op-aa-bg: #f9dd22;
--u2-o-1-bg: #4cf;
--srv-1: #ea0;
--mp-b-bg: transparent;
}
html.cz {
--bgg: var(--bg-u2);
--sel-bg: var(--bg-u5);
--sel-fg: var(--fg);
--btn-bb: .2em solid #709;
--btn-bs: 0 .1em .6em rgba(255,0,185,0.5);
--btn-1-bb: .2em solid #e90;
--btn-1-bs: 0 .1em .8em rgba(255,205,0,0.9);
--srv-3: #fff;
--u2-tab-b1: var(--bg-d3);
--u2-tab-1-bg: a;
}
html.cy {
--fg: #fff;
@@ -363,16 +372,20 @@ html.cy {
--btn-h-fg: #fff;
--btn-1-bg: #ff0;
--btn-1-fg: #000;
--btn-bs: 0 .25em 0 #f00;
--chk-fg: #fd0;
--txt-bg: #000;
--srv-1: #f00;
--srv-3: #fff;
--op-aa-bg: #fff;
--u2-b1-bg: #f00;
--u2-b2-bg: #f00;
--u2-o-bg: #ff0;
--u2-o-1-bg: #f00;
--g-sel-fg: #fff;
--g-sel-bg: #aaa;
--g-fsel-bg: #aaa;
}
html.dz {
--fg: #4d4;
@@ -380,7 +393,6 @@ html.dz {
--fg2-max: #fff;
--fg-weak: #2a2;
--bg-u7: #020;
--bg-u6: #020;
--bg-u5: #050;
--bg-u4: #020;
@@ -413,6 +425,9 @@ html.dz {
--btn-1-bg: #4f4;
--btn-1h-fg: var(--btn-1-fg);
--btn-1h-bg: #3f3;
--btn-bs: 0 0 0 .1em #080 inset;
--btn-1-bs: a;
--chk-fg: var(--tab-alt);
--txt-sh: var(--bg-d2);
--txt-bg: var(--btn-bg);
@@ -427,19 +442,13 @@ html.dz {
--u2-tab-bg: linear-gradient(to bottom, var(--bg), var(--bg-u1));
--u2-tab-b1: var(--fg-weak);
--u2-tab-1-fg: #fff;
--u2-tab-1-bg: linear-gradient(to bottom, var(#353), var(--bg) 80%);
--u2-tab-1-bg: linear-gradient(to bottom, #151, var(--bg) 80%);
--u2-tab-1-b1: #7c5;
--u2-tab-1-b2: #583;
--u2-tab-1-sh: #280;
--u2-b-fg: #fff;
--u2-b1-bg: #3a3;
--u2-b2-bg: #3a3;
--u2-o-bg: var(--btn-bg);
--u2-o-b1: var(--bg-u5);
--u2-o-h-bg: var(--fg-weak);
--u2-o-1-bg: var(--fg-weak);
--u2-o-1-b1: var(--a);
--u2-o-1h-bg: var(--a);
--u2-inf-bg: #07a;
--u2-inf-b1: #0be;
--u2-ok-bg: #380;
@@ -551,10 +560,6 @@ html.dy {
--u2-tab-1-bg: a;
--u2-b1-bg: #000;
--u2-b2-bg: #000;
--u2-o-h-bg: #999;
--u2-o-1h-bg: #999;
--u2-o-bg: #eee;
--u2-o-1-bg: #000;
--ud-b1: a;
@@ -599,7 +604,7 @@ html.dy {
background: var(--sel-bg);
text-shadow: none;
}
html,body,tr,th,td,#files,a {
html,body,tr,th,td,#files,a,#blogout {
color: inherit;
background: none;
font-weight: inherit;
@@ -627,6 +632,7 @@ pre, code, tt, #doc, #doc>code {
overflow: hidden;
width: 0;
height: 0;
color: var(--bg);
}
html .ayjump:focus {
z-index: 80386;
@@ -681,11 +687,15 @@ html.y #path {
#files tbody div a {
color: var(--tab-alt);
}
a, #files tbody div a:last-child {
a, #blogout, #files tbody div a:last-child {
color: var(--a);
padding: .2em;
text-decoration: none;
}
#blogout {
margin: -.2em;
}
#blogout:hover,
a:hover {
color: var(--a-hil);
background: var(--a-h-bg);
@@ -929,6 +939,9 @@ html.y #path a:hover {
color: var(--srv-3);
border-bottom: 1px solid var(--srv-3b);
}
#flogout {
display: inline;
}
#goh+span {
color: var(--bg-u5);
padding-left: .5em;
@@ -963,6 +976,8 @@ html.y #path a:hover {
#files tbody tr.play a:hover {
color: var(--btn-1h-fg);
background: var(--btn-1h-bg);
box-shadow: var(--btn-1h-bs);
border-bottom: var(--btn-1h-bb);
}
#ggrid {
margin: -.2em -.5em;
@@ -971,6 +986,7 @@ html.y #path a:hover {
overflow: hidden;
display: block;
display: -webkit-box;
line-clamp: var(--grid-ln);
-webkit-line-clamp: var(--grid-ln);
-webkit-box-orient: vertical;
padding-top: .3em;
@@ -1017,9 +1033,6 @@ html.y #path a:hover {
color: var(--g-dfg);
}
#ggrid>a.au:before {
content: '💾';
}
html.np_open #ggrid>a.au:before {
content: '▶';
}
#ggrid>a:before {
@@ -1148,6 +1161,7 @@ html.y #widget.open {
width: 100%;
height: 100%;
}
#fshr,
#wtgrid,
#wtico {
position: relative;
@@ -1334,6 +1348,7 @@ html.y #widget.open {
#widget.cmp #wtoggle {
font-size: 1.2em;
}
#widget.cmp #fshr,
#widget.cmp #wtgrid {
display: none;
}
@@ -1347,6 +1362,7 @@ html.y #widget.open {
}
#widget.cmp #barpos,
#widget.cmp #barbuf {
height: 1.6em;
width: calc(100% - 11em);
border-radius: 0;
left: 5em;
@@ -1434,7 +1450,11 @@ input[type="checkbox"]+label {
input[type="radio"]:checked+label,
input[type="checkbox"]:checked+label {
color: #0e0;
color: var(--a);
color: var(--btn-1-bg);
}
input[type="checkbox"]:checked+label {
box-shadow: var(--btn-1-bs);
border-bottom: var(--btn-1-bb);
}
html.dz input {
font-family: 'scp', monospace, monospace;
@@ -1612,6 +1632,8 @@ html {
color: var(--btn-fg);
background: #eee;
background: var(--btn-bg);
box-shadow: var(--btn-bs);
border-bottom: var(--btn-bb);
border-radius: .3em;
padding: .2em .4em;
font-size: 1.2em;
@@ -1625,20 +1647,14 @@ html.c .btn,
html.a .btn {
border-radius: .2em;
}
html.cz .btn {
box-shadow: 0 .1em .6em rgba(255,0,185,0.5);
border-bottom: .2em solid #709;
}
html.dz .btn {
font-size: 1em;
box-shadow: 0 0 0 .1em #080 inset;
}
html.dz .tgl.btn.on {
box-shadow: 0 0 0 .1em var(--btn-1-bg) inset;
}
.btn:hover {
color: var(--btn-h-fg);
background: var(--btn-h-bg);
box-shadow: var(--btn-h-bs);
border-bottom: var(--btn-h-bb);
}
.tgl.btn.on {
background: #000;
@@ -1646,14 +1662,14 @@ html.dz .tgl.btn.on {
color: #fff;
color: var(--btn-1-fg);
text-shadow: none;
}
html.cz .tgl.btn.on {
box-shadow: 0 .1em .8em rgba(255,205,0,0.9);
border-bottom: .2em solid #e90;
box-shadow: var(--btn-1-bs);
border-bottom: var(--btn-1-bb);
}
.tgl.btn.on:hover {
background: var(--btn-1h-bg);
color: var(--btn-1h-fg);
background: var(--btn-1h-bg);
box-shadow: var(--btn-1h-bs);
border-bottom: var(--btn-1h-bb);
}
#detree {
padding: .3em .5em;
@@ -1858,6 +1874,7 @@ html.y #tree.nowrap .ntree a+a:hover {
#unpost td:nth-child(4) {
text-align: right;
}
#shui,
#rui {
background: #fff;
background: var(--bg);
@@ -1873,13 +1890,25 @@ html.y #tree.nowrap .ntree a+a:hover {
padding: 1em;
z-index: 765;
}
#shui div+div,
#rui div+div {
margin-top: 1em;
}
#shui table,
#rui table {
width: 100%;
border-collapse: collapse;
}
#shui button {
margin: 0 1em 0 0;
}
#shui .btn {
font-size: 1em;
}
#shui td {
padding: .8em 0;
}
#shui td+td,
#rui td+td {
padding: .2em 0 .2em .5em;
}
@@ -1887,10 +1916,15 @@ html.y #tree.nowrap .ntree a+a:hover {
font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
}
#shui td+td,
#rui td+td,
#shui td input[type="text"],
#rui td input[type="text"] {
width: 100%;
}
#shui td.exs input[type="text"] {
width: 3em;
}
#rn_f.m td:first-child {
white-space: nowrap;
}
@@ -2685,23 +2719,25 @@ html.b #u2conf a.b:hover {
#u2conf input[type="checkbox"]:checked+label {
position: relative;
cursor: pointer;
background: var(--u2-o-bg);
border-bottom: .2em solid var(--u2-o-b1);
box-shadow: 0 .1em .3em var(--u2-o-sh) inset;
background: var(--btn-bg);
box-shadow: var(--btn-bs);
border-bottom: var(--btn-bb);
text-shadow: 1px 1px 1px #000, 1px -1px 1px #000, -1px -1px 1px #000, -1px 1px 1px #000;
}
#u2conf input[type="checkbox"]:checked+label {
background: var(--u2-o-1-bg);
border-bottom: .2em solid var(--u2-o-1-b1);
box-shadow: 0 .1em .5em var(--u2-o-1-sh);
background: var(--btn-1-bg);
box-shadow: var(--btn-1-bs);
border-bottom: var(--btn-1-bb);
}
#u2conf input[type="checkbox"]+label:hover {
box-shadow: 0 .1em .3em var(--u2-o-h-sh);
border-color: var(--u2-o-h-b1);
background: var(--u2-o-h-bg);
background: var(--btn-h-bg);
box-shadow: var(--btn-h-bs);
border-bottom: var(--btn-h-bb);
}
#u2conf input[type="checkbox"]:checked+label:hover {
background: var(--u2-o-1h-bg);
background: var(--btn-1h-bg);
box-shadow: var(--btn-1h-bs);
border-bottom: var(--btn-1h-bb);
}
#op_up2k.srch #u2conf td:nth-child(2)>*,
#op_up2k.srch #u2conf td:nth-child(3)>* {
@@ -3061,14 +3097,6 @@ html.b #ggrid>a {
html.b .btn {
top: -.1em;
}
html.b .btn,
html.b #u2conf a.b,
html.b #u2conf input[type="checkbox"]:not(:checked)+label {
box-shadow: 0 .05em 0 var(--bg-d3) inset;
}
html.b .tgl.btn.on {
box-shadow: 0 .05em 0 var(--btn-1-is) inset;
}
html.b #op_up2k.srch sup {
color: #fc0;
}
@@ -3098,18 +3126,30 @@ html.by #u2cards a.act {
html.cy #wrap {
color: #000;
}
html.cy .mdo a {
background: #f00;
}
html.cy #wrap,
html.cy #acc_info a,
html.cy #op_up2k,
html.cy #files,
html.cy #files a,
html.cy #files tbody div a:last-child {
color: #000;
}
html.cy #u2tab a,
html.cy #u2cards a {
color: #f00;
}
html.cy #unpost a {
color: #ff0;
}
html.cy #barbuf {
filter: hue-rotate(267deg) brightness(0.8) contrast(4);
}
html.cy #pvol {
filter: hue-rotate(4deg) contrast(2.2);
}


@@ -67,14 +67,14 @@
<div id="op_up2k" class="opview"></div>
<div id="op_cfg" class="opview opbox opwide"></div>
<h1 id="path">
<a href="#" id="entree">🌲</a>
{%- for n in vpnodes %}
<a href="{{ r }}/{{ n[0] }}">{{ n[1] }}</a>
{%- endfor %}
</h1>
<div id="tree"></div>
<div id="wrap">
@@ -108,21 +108,18 @@
{%- for f in files %}
<tr><td>{{ f.lead }}</td><td><a href="{{ f.href }}">{{ f.name|e }}</a></td><td>{{ f.sz }}</td>
{%- if f.tags is defined %}
{%- for k in taglist %}
<td>{{ f.tags[k] }}</td>
{%- endfor %}
{%- endif %}
<td>{{ f.ext }}</td><td>{{ f.dt }}</td></tr>
{%- if f.tags is defined %}
{%- for k in taglist %}<td>{{ f.tags[k] }}</td>{%- endfor %}
{%- endif %}<td>{{ f.ext }}</td><td>{{ f.dt }}</td></tr>
{%- endfor %}
</tbody>
</table>
<div id="epi" class="logue">{{ "" if sb_lg else logues[1] }}</div>
<h2 id="wfp"><a href="{{ r }}/?h" id="goh">control-panel</a></h2>
<a href="#" id="repl">π</a>
</div>

File diff suppressed because it is too large


@@ -11,7 +11,6 @@
td{border:1px solid #999;border-width:1px 1px 0 0;padding:0 5px}
a{display:block}
</style>
{{ html_head }}
</head>
<body>
@@ -52,11 +51,11 @@
</tbody>
</table>
{%- if logues[1] %}
<div>{{ logues[1] }}</div><br />
{%- endif %}
<h2><a href="{{ r }}/{{ url_suf }}{{ url_suf and '&amp;' or '?' }}h">control-panel</a></h2>
</body>


@@ -49,7 +49,7 @@
<div id="mp" class="mdo"></div>
</div>
<a href="#" id="repl">π</a>
{%- if edit %}
<div id="helpbox">
<textarea autocomplete="off">
@@ -125,7 +125,7 @@ write markdown (most html is 🙆 too)
</textarea>
</div>
{%- endif %}
<script>
var SR = {{ r|tojson }},
@@ -159,5 +159,8 @@ try { l.light = drk? 0:1; } catch (ex) { }
{%- if edit %}
<script src="{{ r }}/.cpr/md2.js?_={{ ts }}"></script>
{%- endif %}
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body></html>


@@ -53,5 +53,8 @@ try { l.light = drk? 0:1; } catch (ex) { }
<script src="{{ r }}/.cpr/deps/marked.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/deps/easymde.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/mde.js?_={{ ts }}"></script>
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body></html>


@@ -46,6 +46,9 @@
}, 1000);
</script>
{%- endif %}
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body>
</html>

copyparty/web/shares.css (new file, 83 lines)

@@ -0,0 +1,83 @@
html {
color: #333;
background: #f7f7f7;
font-family: sans-serif;
font-family: var(--font-main), sans-serif;
touch-action: manipulation;
}
#wrap {
margin: 2em auto;
padding: 0 1em 3em 1em;
line-height: 2.3em;
}
#wrap>span {
margin: 0 0 0 1em;
border-bottom: 1px solid #999;
}
li {
margin: 1em 0;
}
a {
color: #047;
background: #fff;
text-decoration: none;
white-space: nowrap;
border-bottom: 1px solid #8ab;
border-radius: .2em;
padding: .2em .6em;
margin: 0 .3em;
}
#wrap td a {
margin: 0;
}
#w {
color: #fff;
background: #940;
border-color: #b70;
}
#repl {
border: none;
background: none;
color: inherit;
padding: 0;
position: fixed;
bottom: .25em;
left: .2em;
}
table {
border-collapse: collapse;
position: relative;
}
th {
top: -1px;
position: sticky;
background: #f7f7f7;
}
#wrap td,
#wrap th {
padding: .3em .6em;
text-align: left;
white-space: nowrap;
}
#wrap td+td+td+td+td+td+td+td {
font-family: var(--font-mono), monospace, monospace;
}
html.z {
background: #222;
color: #ccc;
}
html.z a {
color: #fff;
background: #057;
border-color: #37a;
}
html.z th {
background: #222;
}
html.bz {
color: #bbd;
background: #11121d;
}

copyparty/web/shares.html (new file, 79 lines)

@@ -0,0 +1,79 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/shares.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
{{ html_head }}
</head>
<body>
<div id="wrap">
<a id="a" href="{{ r }}/?shares" class="af">refresh</a>
<a id="a" href="{{ r }}/?h" class="af">control-panel</a>
<span>axs = perms (read,write,move,delet)</span>
<span>nf = numFiles (0=dir)</span>
<span>min/hrs = time left</span>
<table id="tab"><thead><tr>
<th>sharekey</th>
<th>delete</th>
<th>pw</th>
<th>source</th>
<th>axs</th>
<th>nf</th>
<th>user</th>
<th>created</th>
<th>expires</th>
<th>min</th>
<th>hrs</th>
<th>add time</th>
</tr></thead><tbody>
{% for k, pw, vp, pr, st, un, t0, t1 in rows %}
<tr>
<td>
<a href="{{ r }}{{ shr }}{{ k }}?qr">qr</a>
<a href="{{ r }}{{ shr }}{{ k }}">{{ k }}</a>
</td>
<td><a href="#" k="{{ k }}">delete</a></td>
<td>{{ "yes" if pw else "--" }}</td>
<td><a href="{{ r }}/{{ vp|e }}">/{{ vp|e }}</a></td>
<td>{{ pr }}</td>
<td>{{ st }}</td>
<td>{{ un|e }}</td>
<td>{{ t0 }}</td>
<td>{{ t1 }}</td>
<td>{{ "inf" if not t1 else "dead" if t1 < now else ((t1 - now) / 60) | round(1) }}</td>
<td>{{ "inf" if not t1 else "dead" if t1 < now else ((t1 - now) / 3600) | round(1) }}</td>
<td></td>
</tr>
{% endfor %}
</tbody></table>
{% if not rows %}
(you don't have any active shares btw)
{% endif %}
<script>
var SR = {{ r|tojson }},
shr="{{ shr }}",
lang="{{ lang }}",
dfavico="{{ favico }}";
var STG = window.localStorage;
document.documentElement.className = (STG && STG.cpp_thm) || "{{ this.args.theme }}";
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/shares.js?_={{ ts }}"></script>
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body>
</html>

copyparty/web/shares.js (new file, 78 lines)

@@ -0,0 +1,78 @@
var t = QSA('a[k]');
for (var a = 0; a < t.length; a++)
t[a].onclick = rm;
function rm() {
var u = SR + shr + uricom_enc(this.getAttribute('k')) + '?eshare=rm',
xhr = new XHR();
xhr.open('POST', u, true);
xhr.onload = xhr.onerror = cb;
xhr.send();
}
function bump() {
var k = this.closest('tr').getElementsByTagName('a')[2].getAttribute('k'),
u = SR + shr + uricom_enc(k) + '?eshare=' + this.value,
xhr = new XHR();
xhr.open('POST', u, true);
xhr.onload = xhr.onerror = cb;
xhr.send();
}
function cb() {
if (this.status !== 200)
return modal.alert('<h6>server error</h6>' + esc(unpre(this.responseText)));
document.location = '?shares';
}
function qr(e) {
ev(e);
var href = this.href,
pw = this.closest('tr').cells[2].textContent;
if (pw.indexOf('yes') < 0)
return showqr(href);
modal.prompt("if you want to bypass the password protection by\nembedding the password into the qr-code, then\ntype the password now, otherwise leave this empty", "", function (v) {
if (v)
href += "&pw=" + v;
showqr(href);
});
}
function showqr(href) {
var vhref = href.replace('?qr&', '?').replace('?qr', '');
modal.alert(esc(vhref) + '<img class="b64" src="' + href + '" />');
}
(function() {
var tab = ebi('tab').tBodies[0],
tr = Array.prototype.slice.call(tab.rows, 0);
var buf = [];
for (var a = 0; a < tr.length; a++) {
tr[a].cells[0].getElementsByTagName('a')[0].onclick = qr;
for (var b = 7; b < 9; b++)
buf.push(parseInt(tr[a].cells[b].innerHTML));
}
var ibuf = 0;
for (var a = 0; a < tr.length; a++)
for (var b = 7; b < 9; b++) {
var v = buf[ibuf++];
tr[a].cells[b].innerHTML =
v ? unix2iso(v).replace(' ', ',&nbsp;') : 'never';
}
for (var a = 0; a < tr.length; a++)
tr[a].cells[11].innerHTML =
'<button value="1">1min</button> ' +
'<button value="60">1h</button>';
var btns = QSA('td button'), aa = btns.length;
for (var a = 0; a < aa; a++)
btns[a].onclick = bump;
})();


@@ -53,7 +53,7 @@ a.r {
border-color: #c7a;
}
a.g {
color: #2b0;
color: #0a0;
border-color: #3a0;
box-shadow: 0 .3em 1em #4c0;
}
@@ -152,11 +152,13 @@ pre b,
code b {
color: #000;
font-weight: normal;
text-shadow: 0 0 .2em #0f0;
text-shadow: 0 0 .2em #3f3;
border-bottom: 1px solid #090;
}
html.z pre b,
html.z code b {
color: #fff;
border-bottom: 1px solid #9f9;
}
@@ -182,13 +184,18 @@ html.z a.g {
border-color: #af4;
box-shadow: 0 .3em 1em #7d0;
}
form {
line-height: 2.5em;
}
#x,
input {
color: #a50;
background: #fff;
border: 1px solid #a50;
border-radius: .5em;
padding: .5em .7em;
margin: 0 .5em 0 0;
border-radius: .3em;
padding: .25em .6em;
margin: 0 .3em 0 0;
font-size: 1em;
}
input::placeholder {
font-size: 1.2em;
@@ -197,6 +204,7 @@ input::placeholder {
opacity: 0.64;
color: #930;
}
#x,
html.z input {
color: #fff;
background: #626;


@@ -14,6 +14,7 @@
<body>
<div id="wrap">
{%- if not in_shr %}
<a id="a" href="{{ r }}/?h" class="af">refresh</a>
<a id="v" href="{{ r }}/?hc" class="af">connect</a>
@@ -21,7 +22,8 @@
<p id="b">howdy stranger &nbsp; <small>(you're not logged in)</small></p>
{%- else %}
<a id="c" href="{{ r }}/?pw=x" class="logout">logout</a>
<p><span id="m">welcome back,</span> <strong>{{ this.uname }}</strong></p>
<p><span id="m">welcome back,</span> <strong>{{ this.uname|e }}</strong></p>
{%- endif %}
{%- endif %}
{%- if msg %}
@@ -30,6 +32,18 @@
</div>
{%- endif %}
{%- if ups %}
<h1 id="aa">incoming files:</h1>
<table class="vols">
<thead><tr><th>%</th><th>speed</th><th>eta</th><th>idle</th><th>dir</th><th>file</th></tr></thead>
<tbody>
{% for u in ups %}
<tr><td>{{ u[0] }}</td><td>{{ u[1] }}</td><td>{{ u[2] }}</td><td>{{ u[3] }}</td><td><a href="{{ u[4] }}">{{ u[5]|e }}</a></td><td>{{ u[6]|e }}</td></tr>
{% endfor %}
</tbody>
</table>
{%- endif %}
{%- if avol %}
<h1>admin panel:</h1>
<table><tr><td> <!-- hehehe -->
@@ -76,8 +90,43 @@
</ul>
{%- endif %}
<h1 id="cc">client config:</h1>
{%- if in_shr %}
<h1 id="z">unlock this share:</h1>
<div>
<form id="lf" method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" id="la" name="act" value="login" />
<input type="password" id="lp" name="cppwd" placeholder=" password" />
<input type="hidden" name="uhash" id="uhash" value="x" />
<input type="submit" id="ls" value="Unlock" />
{% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a>
{% endif %}
</form>
</div>
{%- else %}
<h1 id="l">login for more:</h1>
<div>
<form id="lf" method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" id="la" name="act" value="login" />
<input type="password" id="lp" name="cppwd" placeholder=" password" />
<input type="hidden" name="uhash" id="uhash" value="x" />
<input type="submit" id="ls" value="Login" />
{% if chpw %}
<a id="x" href="#">change password</a>
{% endif %}
{% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a>
{% endif %}
</form>
</div>
{%- endif %}
<h1 id="cc">other stuff:</h1>
<ul>
{%- if this.uname != '*' and this.args.shr %}
<li><a id="y" href="{{ r }}/?shares">edit shares</a></li>
{% endif %}
{% if k304 or k304vis %}
{% if k304 %}
<li><a id="h" href="{{ r }}/?k304=n">disable k304</a> (currently enabled)
@@ -90,18 +139,6 @@
<li><a id="k" href="{{ r }}/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li>
</ul>
<h1 id="l">login for more:</h1>
<div>
<form method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" name="act" value="login" />
<input type="password" name="cppwd" placeholder=" password" />
<input type="hidden" name="uhash" id="uhash" value="x" />
<input type="submit" value="Login" />
{% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a>
{% endif %}
</form>
</div>
</div>
<a href="#" id="repl">π</a>
{%- if not this.args.nb %}
@@ -119,6 +156,9 @@ document.documentElement.className = (STG && STG.cpp_thm) || "{{ this.args.theme
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/splash.js?_={{ ts }}"></script>
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body>
</html>


@@ -9,7 +9,7 @@ var Ls = {
"e2": "leser inn konfigurasjonsfiler på nytt$N(kontoer, volumer, volumbrytere)$Nog kartlegger alle e2ds-volumer$N$Nmerk: endringer i globale parametere$Nkrever en full restart for å ta gjenge",
"f1": "du kan betrakte:",
"g1": "du kan laste opp til:",
"cc1": "klient-konfigurasjon",
"cc1": "brytere og sånt:",
"h1": "skru av k304",
"i1": "skru på k304",
"j1": "k304 bryter tilkoplingen for hver HTTP 304. Dette hjelper mot visse mellomtjenere som kan sette seg fast / plutselig slutter å laste sider, men det reduserer også ytelsen betydelig",
@@ -17,31 +17,77 @@ var Ls = {
"l1": "logg inn:",
"m1": "velkommen tilbake,",
"n1": "404: filen finnes ikke &nbsp;┐( ´ -`)┌",
"o1": 'eller kanskje du ikke har tilgang? prøv å logge inn eller <a href="' + SR + '/?h">gå hjem</a>',
"o1": 'eller kanskje du ikke har tilgang? prøv et passord eller <a href="' + SR + '/?h">gå hjem</a>',
"p1": "403: tilgang nektet &nbsp;~┻━┻",
"q1": 'du må logge inn eller <a href="' + SR + '/?h">gå hjem</a>',
"q1": 'prøv et passord eller <a href="' + SR + '/?h">gå hjem</a>',
"r1": "gå hjem",
".s1": "kartlegg",
"t1": "handling",
"u2": "tid siden noen sist skrev til serveren$N( opplastning / navneendring / ... )$N$N17d = 17 dager$N1h23 = 1 time 23 minutter$N4m56 = 4 minuter 56 sekunder",
"v1": "koble til",
"v2": "bruk denne serveren som en lokal harddisk$N$NADVARSEL: kommer til å vise passordet ditt!",
"v2": "bruk denne serveren som en lokal harddisk",
"w1": "bytt til https",
"x1": "bytt passord",
"y1": "dine delinger",
"z1": "lås opp område:",
"ta1": "du må skrive et nytt passord først",
"ta2": "gjenta for å bekrefte nytt passord:",
"ta3": "fant en skrivefeil; vennligst prøv igjen",
"aa1": "innkommende:",
},
"eng": {
"d2": "shows the state of all active threads",
"e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes$N$Nnote: any changes to global settings$Nrequire a full restart to take effect",
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
"v2": "use this server as a local HDD",
"ta1": "fill in your new password first",
"ta2": "repeat to confirm new password:",
"ta3": "found a typo; please try again",
},
"chi": {
"a1": "更新",
"b1": "你好 &nbsp; <small>(你尚未登录)</small>",
"c1": "登出",
"d1": "状态",
"d2": "显示所有活动线程的状态",
"e1": "重新加载配置",
"e2": "重新加载配置文件(账户/卷/卷标),$N并重新扫描所有 e2ds 卷$N$N注意任何全局设置的更改$N都需要完全重启才能生效",
"f1": "你可以查看:",
"g1": "你可以上传到:",
"cc1": "开关等",
"h1": "关闭 k304",
"i1": "开启 k304",
"j1": "k304 会在每个 HTTP 304 时断开连接。这有助于避免某些代理服务器卡住或突然停止加载页面,但也会显著降低性能。",
"k1": "重置设置",
"l1": "登录:",
"m1": "欢迎回来,",
"n1": "404: 文件不存在 &nbsp;┐( ´ -`)┌",
"o1": '或者你可能没有权限?尝试输入密码或 <a href="' + SR + '/?h">回家</a>',
"p1": "403: 访问被拒绝 &nbsp;~┻━┻",
"q1": '尝试输入密码或 <a href="' + SR + '/?h">回家</a>',
"r1": "回家",
".s1": "映射",
"t1": "操作",
"u2": "自上次服务器写入的时间$N( 上传 / 重命名 / ... )$N$N17d = 17 天$N1h23 = 1 小时 23 分钟$N4m56 = 4 分钟 56 秒",
"v1": "连接",
"v2": "将此服务器用作本地硬盘",
"w1": "切换到 https",
"x1": "更改密码",
"y1": "你的分享",
"z1": "解锁区域",
"ta1": "请先输入新密码",
"ta2": "重复以确认新密码:",
"ta3": "发现拼写错误;请重试",
"aa1": "正在接收的文件:", //m
}
};
var LANGS = ["eng", "nor"];
if (window.langmod)
langmod();
var d = Ls[sread("cpp_lang", LANGS) || lang] || Ls.eng || Ls.nor;
var d = Ls[sread("cpp_lang", Object.keys(Ls)) || lang] ||
Ls.eng || Ls.nor || Ls.chi;
for (var k in (d || {})) {
var f = k.slice(-1),
@@ -74,3 +120,42 @@ if (o && /[0-9]+$/.exec(o.innerHTML))
o.innerHTML = shumantime(o.innerHTML);
ebi('uhash').value = '' + location.hash;
(function() {
if (!ebi('x'))
return;
var pwi = ebi('lp');
function redo(msg) {
modal.alert(msg, function() {
pwi.value = '';
pwi.focus();
});
}
function mok(v) {
if (v !== pwi.value)
return redo(d.ta3);
pwi.setAttribute('name', 'pw');
ebi('la').value = 'chpw';
ebi('lf').submit();
}
function stars() {
var m = ebi('modali');
function enstars(n) {
setTimeout(function() { m.value = ''; }, n);
}
m.setAttribute('type', 'password');
enstars(17);
enstars(32);
enstars(69);
}
ebi('x').onclick = function (e) {
ev(e);
if (!pwi.value)
return redo(d.ta1);
modal.prompt(d.ta2, "y", mok, null, stars);
};
})();

View File

@@ -56,7 +56,7 @@
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
</ul>
<p>if you want to use the native WebDAV client in windows instead (slow and buggy), first run <a href="{{ r }}/.cpr/a/webdav-cfg.bat">webdav-cfg.bat</a> to remove the 47 MiB filesize limit (also fixes latency and password login), then connect:</p>
<pre>
net use <b>w:</b> http{{ s }}://{{ ep }}/{{ rvp }}{% if accs %} k /user:<b>{{ pw }}</b>{% endif %}
@@ -64,16 +64,7 @@
</div>
<div class="os lin">
<pre>
yum install davfs2
{% if accs %}printf '%s\n' <b>{{ pw }}</b> k | {% endif %}mount -t davfs -ouid=1000 http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b>
</pre>
<p>make it automount on boot:</p>
<pre>
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
</pre>
<p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
<p>rclone (v1.63 or later) is recommended:</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
@@ -85,6 +76,16 @@
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
</ul>
<p>alternatively use davfs2 (requires root, is slower, forgets lastmodified-timestamp on upload):</p>
<pre>
yum install davfs2
{% if accs %}printf '%s\n' <b>{{ pw }}</b> k | {% endif %}mount -t davfs -ouid=1000 http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b>
</pre>
<p>make davfs2 automount on boot:</p>
<pre>
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
</pre>
<p>or the emergency alternative (gnome/gui-only):</p>
<!-- gnome-bug: ignores vp -->
<pre>
@@ -104,7 +105,7 @@
<pre>
http{{ s }}://k:<b>{{ pw }}</b>@{{ ep }}/{{ rvp }}
</pre>
{% if s %}
<p><em>replace <code>https</code> with <code>http</code> if it doesn't work</em></p>
{% endif %}
@@ -191,6 +192,7 @@
<h1>partyfuse</h1>
<p>
<a href="{{ r }}/.cpr/a/partyfuse.py">partyfuse.py</a> -- fast, read-only,
needs <a href="{{ r }}/.cpr/deps/fuse.py">fuse.py</a> in the same folder,
<span class="os win">needs <a href="https://winfsp.dev/rel/">winfsp</a></span>
<span class="os lin">doesn't need root</span>
</p>
@@ -207,7 +209,6 @@
{% if args.smb %}
<h1>SMB / CIFS</h1>
<em><a href="https://github.com/SecureAuthCorp/impacket/issues/1433">bug:</a> max ~300 files in each folder</em>
<div class="os win">
<pre>
@@ -244,6 +245,9 @@ document.documentElement.className = (STG && STG.cpp_thm) || "{{ args.theme }}";
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/svcs.js?_={{ ts }}"></script>
{%- if js %}
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body>
</html>

View File

@@ -69,6 +69,18 @@ html {
top: 2em;
bottom: unset;
}
#toastt {
position: absolute;
height: 1px;
top: 1px;
right: 1%;
width: 99%;
animation: toastt var(--tmtime) steps(var(--tmstep)) forwards;
transform-origin: right;
}
@keyframes toastt {
to {transform: scaleX(0)}
}
#toast a {
color: inherit;
text-shadow: inherit;
@@ -130,6 +142,9 @@ html {
#toast.inf #toastc {
background: #0be;
}
#toast.inf #toastt {
background: #8ef;
}
#toast.ok {
background: #380;
border-color: #8e4;
@@ -137,6 +152,9 @@ html {
#toast.ok #toastc {
background: #8e4;
}
#toast.ok #toastt {
background: #cf9;
}
#toast.warn {
background: #960;
border-color: #fc0;
@@ -144,6 +162,9 @@ html {
#toast.warn #toastc {
background: #fc0;
}
#toast.warn #toastt {
background: #fe9;
}
#toast.err {
background: #900;
border-color: #d06;
@@ -151,6 +172,9 @@ html {
#toast.err #toastc {
background: #d06;
}
#toast.err #toastt {
background: #f9c;
}
#toast code {
padding: 0 .2em;
background: rgba(0,0,0,0.2);
@@ -265,7 +289,11 @@ html.y #tth {
box-shadow: 0 .3em 3em rgba(0,0,0,0.5);
max-width: 50em;
max-height: 30em;
overflow: auto;
overflow-x: auto;
overflow-y: scroll;
}
#modalc.yk {
overflow-y: auto;
}
#modalc td {
text-align: unset;
@@ -289,6 +317,12 @@ html.y #tth {
#modalc a {
color: #07b;
}
#modalc .b64 {
display: block;
margin: .1em auto;
width: 60%;
height: 60%;
}
#modalb {
position: sticky;
text-align: right;
@@ -381,6 +415,7 @@ html.y textarea:focus {
}
.mdo pre,
.mdo code,
.mdo code[class*="language-"],
.mdo tt {
font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;

View File

@@ -152,12 +152,13 @@ function U2pvis(act, btns, uc, st) {
r.mod0 = null;
var markup = {
'404': '<span class="err">404</span>',
'ERROR': '<span class="err">ERROR</span>',
'OS-error': '<span class="err">OS-error</span>',
'found': '<span class="inf">found</span>',
'YOLO': '<span class="inf">YOLO</span>',
'done': '<span class="ok">done</span>',
'404': '<span class="err">' + L.utl_404 + '</span>',
'ERROR': '<span class="err">' + L.utl_err + '</span>',
'OS-error': '<span class="err">' + L.utl_oserr + '</span>',
'found': '<span class="inf">' + L.utl_found + '</span>',
'defer': '<span class="inf">' + L.utl_defer + '</span>',
'YOLO': '<span class="inf">' + L.utl_yolo + '</span>',
'done': '<span class="ok">' + L.utl_done + '</span>',
};
r.addfile = function (entry, sz, draw) {
@@ -445,9 +446,7 @@ function U2pvis(act, btns, uc, st) {
return;
r.npotato = 0;
var html = [
"<p>files: &nbsp; <b>{0}</b> finished, &nbsp; <b>{1}</b> failed, &nbsp; <b>{2}</b> busy, &nbsp; <b>{3}</b> queued</p>".format(
r.ctr.ok, r.ctr.ng, r.ctr.bz, r.ctr.q)];
var html = [L.u_pott.format(r.ctr.ok, r.ctr.ng, r.ctr.bz, r.ctr.q)];
while (r.head < r.tab.length && has(["ok", "ng"], r.tab[r.head].in))
r.head++;
@@ -602,7 +601,7 @@ function U2pvis(act, btns, uc, st) {
if (nf < 9000)
return go();
modal.confirm('about to show ' + nf + ' files\n\nthis may crash your browser, are you sure?', go, null);
modal.confirm(L.u_bigtab.format(nf), go, null);
};
}
@@ -658,7 +657,9 @@ function Donut(uc, st) {
}
function pos() {
return uc.fsearch ? Math.max(st.bytes.hashed, st.bytes.finished) : st.bytes.finished;
return uc.fsearch ?
Math.max(st.bytes.hashed, st.bytes.finished) :
st.bytes.inflight + st.bytes.finished;
}
r.on = function (ya) {
@@ -853,6 +854,7 @@ function up2k_init(subtle) {
setmsg(suggest_up2k, 'msg');
var parallel_uploads = ebi('nthread').value = icfg_get('nthread', u2j),
stitch_tgt = ebi('u2szg').value = icfg_get('u2sz', u2sz.split(',')[1]),
uc = {},
fdom_ctr = 0,
biggest_file = 0;
@@ -1034,7 +1036,7 @@ function up2k_init(subtle) {
}
catch (ex) {
document.body.ondragenter = document.body.ondragleave = document.body.ondragover = null;
return modal.alert('your browser does not support drag-and-drop uploading');
return modal.alert(L.u_nodrop);
}
if (btn)
return;
@@ -1101,7 +1103,7 @@ function up2k_init(subtle) {
}
if (!good_files.length && bad_files.length)
return toast.err(30, "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead");
return toast.err(30, L.u_notdir);
return read_dirs(null, [], [], good_files, nil_files, bad_files);
}
@@ -1119,7 +1121,7 @@ function up2k_init(subtle) {
if (err)
return modal.alert('sorry, ' + err);
toast.inf(0, 'Scanning files...');
toast.inf(0, L.u_scan);
if ((dz == 'up_dz' && uc.fsearch) || (dz == 'srch_dz' && !uc.fsearch))
tgl_fsearch();
@@ -1207,7 +1209,7 @@ function up2k_init(subtle) {
match = false;
if (match) {
var msg = ['directory iterator got stuck on the following {0} items; good chance your browser is about to spinlock:<ul>'.format(missing.length)];
var msg = [L.u_dirstuck.format(missing.length) + '<ul>'];
for (var a = 0; a < Math.min(20, missing.length); a++)
msg.push('<li>' + esc(missing[a]) + '</li>');
@@ -1278,7 +1280,7 @@ function up2k_init(subtle) {
}
function gotallfiles(good_files, nil_files, bad_files) {
if (toast.txt == 'Scanning files...')
if (toast.txt == L.u_scan)
toast.hide();
if (uc.fsearch && !uc.turbo)
@@ -1434,7 +1436,7 @@ function up2k_init(subtle) {
if (!actx || actx.state != 'suspended' || toast.visible)
return;
toast.warn(30, "<div onclick=\"start_actx();toast.inf(3,'thanks!')\">please click this text to<br />unlock full upload speed</div>");
toast.warn(30, "<div onclick=\"start_actx();toast.inf(3,'thanks!')\">" + L.u_actx + "</div>");
}, 500);
}
@@ -1476,7 +1478,7 @@ function up2k_init(subtle) {
ev(e);
var txt = linklist();
cliptxt(txt + '\n', function () {
toast.inf(5, txt.split('\n').length + ' links copied to clipboard');
toast.inf(5, un_clip.format(txt.split('\n').length));
});
};
@@ -1736,16 +1738,13 @@ function up2k_init(subtle) {
}
}
var mou_ikkai = false;
if (st.busy.handshake.length &&
st.busy.handshake[0].t_busied < now - 30 * 1000
) {
console.log("retrying stuck handshake");
var t = st.busy.handshake.shift();
st.todo.handshake.unshift(t);
if (st.bytes.inflight && (st.bytes.inflight < 0 || !st.busy.upload.length)) {
console.log('insane inflight ' + st.bytes.inflight);
st.bytes.inflight = 0;
}
var mou_ikkai = false;
var nprev = -1;
for (var a = 0; a < st.todo.upload.length; a++) {
var nf = st.todo.upload[a].nfile;
@@ -2178,7 +2177,7 @@ function up2k_init(subtle) {
st.busy.head.push(t);
var xhr = new XMLHttpRequest();
xhr.onerror = function () {
xhr.onerror = xhr.ontimeout = function () {
console.log('head onerror, retrying', t.name, t);
if (!toast.visible)
toast.warn(9.98, L.u_enethd + "\n\nfile: " + t.name, t);
@@ -2222,6 +2221,7 @@ function up2k_init(subtle) {
try { orz(e); } catch (ex) { vis_exh(ex + '', 'up2k.js', '', '', ex); }
};
xhr.timeout = 34000;
xhr.open('HEAD', t.purl + uricom_enc(t.name), true);
xhr.send();
}
@@ -2246,8 +2246,11 @@ function up2k_init(subtle) {
if (keepalive)
console.log("sending keepalive handshake", t.name, t);
if (!t.srch && !t.t_handshake)
pvis.seth(t.n, 2, L.u_hs);
var xhr = new XMLHttpRequest();
xhr.onerror = function () {
xhr.onerror = xhr.ontimeout = function () {
if (t.t_busied != me) // t.done ok
return console.log('zombie handshake onerror', t.name, t);
@@ -2272,7 +2275,8 @@ function up2k_init(subtle) {
apop(st.busy.handshake, t);
st.todo.handshake.unshift(t);
t.cooldown = Date.now() + 5000 + Math.floor(Math.random() * 3000);
return toast.err(0, 'Handshake error; will retry...\n\n' + L.badreply + ':\n\n' + unpre(xhr.responseText));
var txt = t.t_uploading ? L.u_ehsfin : t.srch ? L.u_ehssrch : L.u_ehsinit;
return toast.err(0, txt + '\n\n' + L.badreply + ':\n\n' + unpre(xhr.responseText));
}
t.t_handshake = Date.now();
@@ -2374,11 +2378,39 @@ function up2k_init(subtle) {
var arr = st.todo.upload,
sort = arr.length && arr[arr.length - 1].nfile > t.n;
for (var a = 0; a < t.postlist.length; a++)
if (!t.stitch_sz) {
// keep all connections busy
var bpc = (st.bytes.total - st.bytes.finished) / (parallel_uploads || 1),
ocs = 1024 * 1024,
stp = 1024 * 512,
ccs = ocs;
while (ccs < bpc) {
ocs = ccs;
ccs += stp; if (ccs < bpc) ocs = ccs;
ccs += stp; stp *= 2;
}
ocs = Math.floor(ocs / 1024 / 1024);
t.stitch_sz = Math.min(ocs, stitch_tgt);
}
for (var a = 0; a < t.postlist.length; a++) {
var nparts = [], tbytes = 0, stitch = t.stitch_sz;
if (t.nojoin && t.nojoin - t.postlist.length < 6)
stitch = 1;
--a;
for (var b = 0; b < stitch; b++) {
nparts.push(t.postlist[++a]);
tbytes += chunksize;
if (tbytes + chunksize > stitch * 1024 * 1024 || t.postlist[a + 1] - t.postlist[a] !== 1)
break;
}
arr.push({
'nfile': t.n,
'npart': t.postlist[a]
'nparts': nparts
});
}
t.nojoin = 0;
msg = null;
done = false;
@@ -2387,7 +2419,7 @@ function up2k_init(subtle) {
arr.sort(function (a, b) {
return a.nfile < b.nfile ? -1 :
/* */ a.nfile > b.nfile ? 1 :
a.npart < b.npart ? -1 : 1;
/* */ a.nparts[0] < b.nparts[0] ? -1 : 1;
});
}
@@ -2423,6 +2455,7 @@ function up2k_init(subtle) {
pvis.seth(t.n, 2, L.u_ehstmp, t);
var err = "",
cls = "ERROR",
rsp = unpre(xhr.responseText),
ofs = rsp.lastIndexOf('\nURL: ');
@@ -2452,6 +2485,8 @@ function up2k_init(subtle) {
if (!t.rechecks && (err_pend || err_srcb)) {
t.rechecks = 0;
t.want_recheck = true;
err = L.u_dupdefer;
cls = 'defer';
}
}
if (rsp.indexOf('server HDD is full') + 1)
@@ -2461,7 +2496,7 @@ function up2k_init(subtle) {
if (!t.t_uploading)
st.bytes.finished += t.size;
pvis.seth(t.n, 1, "ERROR");
pvis.seth(t.n, 1, cls);
pvis.seth(t.n, 2, err);
pvis.move(t.n, 'ng');
@@ -2493,6 +2528,8 @@ function up2k_init(subtle) {
xhr.open('POST', t.purl, true);
xhr.responseType = 'text';
xhr.timeout = 42000 + (t.srch || t.t_uploaded ? 0 :
(t.size / (1048 * 20))); // safededup 20M/s hdd
xhr.send(JSON.stringify(req));
}
@@ -2534,7 +2571,10 @@ function up2k_init(subtle) {
function exec_upload() {
var upt = st.todo.upload.shift(),
t = st.files[upt.nfile],
npart = upt.npart,
nparts = upt.nparts,
pcar = nparts[0],
pcdr = nparts[nparts.length - 1],
snpart = pcar == pcdr ? pcar : ('' + pcar + '~' + pcdr),
tries = 0;
if (t.done)
@@ -2549,8 +2589,8 @@ function up2k_init(subtle) {
pvis.seth(t.n, 1, "🚀 send");
var chunksize = get_chunksize(t.size),
car = npart * chunksize,
cdr = car + chunksize;
car = pcar * chunksize,
cdr = (pcdr + 1) * chunksize;
if (cdr >= t.size)
cdr = t.size;
@@ -2560,14 +2600,19 @@ function up2k_init(subtle) {
var txt = unpre((xhr.response && xhr.response.err) || xhr.responseText);
if (txt.indexOf('upload blocked by x') + 1) {
apop(st.busy.upload, upt);
apop(t.postlist, npart);
for (var a = pcar; a <= pcdr; a++)
apop(t.postlist, a);
pvis.seth(t.n, 1, "ERROR");
pvis.seth(t.n, 2, txt.split(/\n/)[0]);
pvis.move(t.n, 'ng');
return;
}
if (xhr.status == 200) {
pvis.prog(t, npart, cdr - car);
var bdone = cdr - car;
for (var a = pcar; a <= pcdr; a++) {
pvis.prog(t, a, Math.min(bdone, chunksize));
bdone -= chunksize;
}
st.bytes.finished += cdr - car;
st.bytes.uploaded += cdr - car;
t.bytes_uploaded += cdr - car;
@@ -2576,18 +2621,21 @@ function up2k_init(subtle) {
}
else if (txt.indexOf('already got that') + 1 ||
txt.indexOf('already being written') + 1) {
console.log("ignoring dupe-segment error", t.name, t);
t.nojoin = t.nojoin || t.postlist.length;
console.log("ignoring dupe-segment with backoff", t.nojoin, t.name, t);
if (!toast.visible && st.todo.upload.length < 4)
toast.inf(10, L.u_cbusy);
}
else {
xhrchk(xhr, L.u_cuerr2.format(npart, Math.ceil(t.size / chunksize), t.name), "404, target folder not found (???)", "warn", t);
xhrchk(xhr, L.u_cuerr2.format(snpart, Math.ceil(t.size / chunksize), t.name), "404, target folder not found (???)", "warn", t);
chill(t);
}
orz2(xhr);
}
var orz2 = function (xhr) {
apop(st.busy.upload, upt);
apop(t.postlist, npart);
for (var a = pcar; a <= pcdr; a++)
apop(t.postlist, a);
if (!t.postlist.length) {
t.t_uploaded = Date.now();
pvis.seth(t.n, 1, 'verifying');
@@ -2601,28 +2649,48 @@ function up2k_init(subtle) {
btot = Math.floor(st.bytes.total / 1024 / 1024);
xhr.upload.onprogress = function (xev) {
var nb = xev.loaded;
st.bytes.inflight += nb - xhr.bsent;
var nb = xev.loaded,
db = nb - xhr.bsent;
if (!db)
return;
st.bytes.inflight += db;
xhr.bsent = nb;
pvis.prog(t, npart, nb);
xhr.timeout = 64000 + Date.now() - xhr.t0;
pvis.prog(t, pcar, nb);
};
xhr.onload = function (xev) {
try { orz(xhr); } catch (ex) { vis_exh(ex + '', 'up2k.js', '', '', ex); }
};
xhr.onerror = function (xev) {
xhr.onerror = xhr.ontimeout = function (xev) {
if (crashed)
return;
st.bytes.inflight -= (xhr.bsent || 0);
xhr.bsent = 0;
if (!toast.visible)
toast.warn(9.98, L.u_cuerr.format(npart, Math.ceil(t.size / chunksize), t.name), t);
toast.warn(9.98, L.u_cuerr.format(snpart, Math.ceil(t.size / chunksize), t.name), t);
t.nojoin = t.nojoin || t.postlist.length; // maybe rproxy postsize limit
console.log('chunkpit onerror,', ++tries, t.name, t);
orz2(xhr);
};
var chashes = [],
ctxt = t.hash[pcar],
plen = Math.floor(192 / nparts.length);
plen = plen > 9 ? 9 : plen < 2 ? 2 : plen;
for (var a = pcar + 1; a <= pcdr; a++)
chashes.push(t.hash[a].slice(0, plen));
if (chashes.length)
ctxt += ',' + plen + ',' + chashes.join('');
xhr.open('POST', t.purl, true);
xhr.setRequestHeader("X-Up2k-Hash", t.hash[npart]);
xhr.setRequestHeader("X-Up2k-Hash", ctxt);
xhr.setRequestHeader("X-Up2k-Wark", t.wark);
xhr.setRequestHeader("X-Up2k-Stat", "{0}/{1}/{2}/{3} {4}/{5} {6}".format(
pvis.ctr.ok, pvis.ctr.ng, pvis.ctr.bz, pvis.ctr.q, btot, btot - bfin,
@@ -2632,6 +2700,8 @@ function up2k_init(subtle) {
xhr.overrideMimeType('Content-Type', 'application/octet-stream');
xhr.bsent = 0;
xhr.t0 = Date.now();
xhr.timeout = 42000;
xhr.responseType = 'text';
xhr.send(t.fobj.slice(car, cdr));
}
@@ -2732,13 +2802,34 @@ function up2k_init(subtle) {
if (parallel_uploads > 16)
parallel_uploads = 16;
if (parallel_uploads > 7)
if (parallel_uploads > 6)
toast.warn(10, L.u_maxconn);
else if (toast.txt == L.u_maxconn)
toast.hide();
obj.value = parallel_uploads;
bumpthread({ "target": 1 });
}
var read_u2sz = function () {
var el = ebi('u2szg'), n = parseInt(el.value), dv = u2sz.split(',');
stitch_tgt = n = (
isNaN(n) ? dv[1] :
n < dv[0] ? dv[0] :
n > dv[2] ? dv[2] : n
);
if (n == dv[1]) sdrop('u2sz'); else swrite('u2sz', n);
if (el.value != n) el.value = n;
};
ebi('u2szg').addEventListener('blur', read_u2sz);
ebi('u2szg').onkeydown = function (e) {
if (anymod(e)) return;
var n = e.code == 'ArrowUp' ? 1 : e.code == 'ArrowDown' ? -1 : 0;
if (!n) return;
this.value = parseInt(this.value) + n;
read_u2sz();
}
function tgl_fsearch() {
set_fsearch(!uc.fsearch);
}

View File

@@ -127,13 +127,13 @@ if ((document.location + '').indexOf(',rej,') + 1)
try {
console.hist = [];
var CMAXHIST = 1000;
var CMAXHIST = MOBILE ? 9000 : 44000;
var hook = function (t) {
var orig = console[t].bind(console),
cfun = function () {
console.hist.push(Date.now() + ' ' + t + ': ' + Array.from(arguments).join(', '));
if (console.hist.length > CMAXHIST)
console.hist = console.hist.slice(CMAXHIST / 2);
console.hist = console.hist.slice(CMAXHIST / 4);
orig.apply(console, arguments);
};
@@ -473,6 +473,24 @@ function crc32(str) {
}
function randstr(len) {
var ret = '';
try {
var ar = new Uint32Array(Math.floor((len + 3) / 4));
crypto.getRandomValues(ar);
for (var a = 0; a < ar.length; a++)
ret += ('000' + ar[a].toString(36)).slice(-4);
return ret.slice(0, len);
}
catch (ex) {
console.log('using unsafe randstr because ' + ex);
while (ret.length < len)
ret += ('000' + Math.floor(Math.random() * 1679616).toString(36)).slice(-4);
return ret.slice(0, len);
}
}
function clmod(el, cls, add) {
if (!el)
return false;
@@ -517,6 +535,14 @@ function clgot(el, cls) {
}
function setcvar(k, v) {
try {
document.documentElement.style.setProperty(k, v);
}
catch (e) { }
}
var ANIM = true;
try {
var mq = window.matchMedia('(prefers-reduced-motion: reduce)');
@@ -904,15 +930,18 @@ function shumantime(v, long) {
function lhumantime(v) {
var t = shumantime(v, 1),
tp = t.replace(/([a-z])/g, " $1 ").split(/ /g).slice(0, -1);
var t = shumantime(v, 1);
if (/[0-9]$/.exec(t))
t += 's';
var tp = t.replace(/([a-z])/g, " $1 ").split(/ /g).slice(0, -1);
if (!L || tp.length < 2 || tp[1].indexOf('$') + 1)
return t;
var ret = '';
for (var a = 0; a < tp.length; a += 2)
ret += tp[a] + ' ' + L['ht_' + tp[a + 1]].replace(tp[a] == 1 ? /!.*/ : /!/, '') + L.ht_and;
ret += tp[a] + ' ' + L['ht_' + tp[a + 1] + (tp[a]==1?1:2)] + L.ht_and;
return ret.slice(0, -L.ht_and.length);
}
@@ -1396,10 +1425,10 @@ var tt = (function () {
o = ctr.querySelectorAll('*[tt]');
for (var a = o.length - 1; a >= 0; a--) {
o[a].onfocus = _cshow;
o[a].onblur = _hide;
o[a].onmouseenter = _dshow;
o[a].onmouseleave = _hide;
o[a].addEventListener('focus', _cshow);
o[a].addEventListener('blur', _hide);
o[a].addEventListener('mouseenter', _dshow);
o[a].addEventListener('mouseleave', _hide);
}
r.hide();
}
@@ -1498,13 +1527,21 @@ var toast = (function () {
if (sec)
te = setTimeout(r.hide, sec * 1000);
if (same && delta < 1000)
var tb = ebi('toastt');
if (same && delta < 1000 && tb) {
tb.style.animation = 'none';
tb.offsetHeight;
tb.style.animation = null;
return;
}
if (txt.indexOf('<body>') + 1)
txt = txt.slice(0, txt.indexOf('<')) + ' [...]';
obj.innerHTML = '<a href="#" id="toastc">x</a><div id="toastb">' + lf2br(txt) + '</div>';
setcvar('--tmtime', sec + 's');
setcvar('--tmstep', sec * 15);
obj.innerHTML = '<div id="toastt"></div><a href="#" id="toastc">x</a><div id="toastb">' + lf2br(txt) + '</div>';
obj.className = cl;
sec += obj.offsetWidth;
obj.className += ' vis';
@@ -1536,6 +1573,7 @@ var modal = (function () {
var r = {},
q = [],
o = null,
scrolling = null,
cb_up = null,
cb_ok = null,
cb_ng = null,
@@ -1556,6 +1594,7 @@ var modal = (function () {
r.nofocus = 0;
r.show = function (html) {
tt.hide();
o = mknod('div', 'modal');
o.innerHTML = '<table><tr><td><div id="modalc">' + html + '</div></td></tr></table>';
document.body.appendChild(o);
@@ -1579,6 +1618,7 @@ var modal = (function () {
document.addEventListener('focus', onfocus);
document.addEventListener('selectionchange', onselch);
timer.add(scrollchk, 1);
timer.add(onfocus);
if (cb_up)
setTimeout(cb_up, 1);
@@ -1586,6 +1626,8 @@ var modal = (function () {
r.hide = function () {
timer.rm(onfocus);
timer.rm(scrollchk);
scrolling = null;
try {
ebi('modal-ok').removeEventListener('blur', onblur);
}
@@ -1604,13 +1646,28 @@ var modal = (function () {
r.hide();
if (cb_ok)
cb_ok(v);
}
};
var ng = function (e) {
ev(e);
r.hide();
if (cb_ng)
cb_ng(null);
}
};
var scrollchk = function () {
if (scrolling === true)
return;
var o = ebi('modalc'),
vis = o.offsetHeight,
all = o.scrollHeight,
nsc = 8 + vis < all;
if (scrolling !== nsc)
clmod(o, 'yk', !nsc);
scrolling = nsc;
};
var onselch = function () {
try {

View File

@@ -1,3 +1,414 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0916-0107 `v1.15.3` incoming eta
## 🧪 new features
![cpanel-upload-eta2-or8](https://github.com/user-attachments/assets/eb003bd4-da3c-4995-bf6e-3a8c1c1b26dd)
* incoming uploads (and their ETA) are shown in the controlpanel 609c5921 844194ee
* list total directory sizes 427597b6
* show the total size and number of files of each directory in listings
* makes browsing a bit slower (up to 30%) so can be disabled with `--no-dirsz`
* sizes are calculated during startup, so it requires `-e2dsa`
* file-uploads will recalculate the sizes immediately, but a full rescan is necessary to see changes caused by moves/deletes
* optimizations;
* reduce broker overhead when multiprocessing is disabled 4e75534e
* should reduce cpu usage by uploads, thumbnails, prometheus metrics
* reduce cpu usage from downloading thumbnails 7d64879b
## 🩹 bugfixes
* fix sqlite indexes d67e9cc5
* upload handshakes would get exponentially slow if a volume has more than 200'000 files
* reindex on startup can be 150x faster in some rare cases (same filename in MANY folders)
* the database is now around 10% larger (likely worst-case)
* misc ux: 58835b2b
* shares: show media tags
* html hydrator assumed a folder named `foo.txt` was a doc
* due to sessions, use `pwd` as password placeholder on services
## 🔧 other changes
* add [example](https://github.com/9001/copyparty/tree/hovudstraum/contrib#flameshotsh) for uploading screenshots from linux with flameshot 1c2acdc9
* [nginx example](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf): use unix-sockets for higher performance a5ce1032
* #97 chinese translation was improved, thx again @ultwcz 7a573caf
## 🗿 known issues
* prometheus metrics are busted
* **workaround:** disable monitoring of volume status with `--nos-vst`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0909-2343 `v1.15.1` session
<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>
blessed by ⑨, this release is [certified strong](https://github.com/user-attachments/assets/05459032-736c-4b9a-9ade-a0044461194a) ([artist](https://x.com/hcnone))
## new features
* login sessions b5405174
* a random session cookie is generated for each known user, replacing the previous plaintext login cookie
* the logout button will nuke the session on all clients where that user is logged in
* the sessions are stored in the database at `--ses-db`, default `~/.config/copyparty/sessions.db` (docker uses `/cfg/sessions.db` similar to the other runtime configs)
* if you run multiple copyparty instances, much like [shares](https://github.com/9001/copyparty#shares) and [user-changeable passwords](https://github.com/9001/copyparty#user-changeable-passwords) you'll want to keep a separate db for each instance
* can be mostly disabled with `--no-ses` when it turns out to be buggy
## bugfixes
* v1.13.8 broke the u2c `--ow` option to replace/overwrite files on the server during upload 6eee6015
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0908-1925 `v1.15.0` fill the drives
## recent important news
* [v1.15.0 (2024-09-08)](https://github.com/9001/copyparty/releases/tag/v1.15.0) changed upload deduplication to be default-disabled
* [v1.14.3 (2024-08-30)](https://github.com/9001/copyparty/releases/tag/v1.14.3) fixed a bug that was introduced in v1.13.8 (2024-08-13); this bug could lead to **data loss** -- see the v1.14.3 release-notes for details
# upload deduplication now disabled by default
because many people found the behavior surprising. This also makes it easier to use copyparty together with other software, since there is no risk of damage to symlinks if there are no symlinks to damage
to enable deduplication, use either `--dedup` (old-default, symlink-based), or `--hardlink` (will use hardlinks when possible), or `--hardlink-only` (disallow symlinks). To choose the approach that fits your usecase, see [file deduplication](https://github.com/9001/copyparty#file-deduplication) in the readme
verification of local file consistency was also added; this happens when someone uploads a dupe, to ensure that no other software has modified the local file since last reindex. This unfortunately makes uploading of duplicate files much slower, and can be disabled with `--safe-dedup 1` if you know that only copyparty will be modifying the filesystem
## new features
* dedup improvements:
* verify consistency of local files before using them as dedup source 6e671c52
* if a local file has been altered by other software since the last reindexing, then this will now be detected
* u2c (commandline uploader): add mode to print hashes of local files 08848be7
* if you've lost a file but you know its `wark` (file identifier), you can now use u2c.exe to scan your whole filesystem for it: `u2c - .`
* #96 use local timezone in log messages b599fbae
## bugfixes
* dedup fixes:
* symlinks could break if moved/renamed inside a volume where deduplication was disabled after some files within had already been deduplicated 4401de04
* when moving/renaming, only consider symlinks between volumes if `xlink` volflag is set b5ad9369
* database consistency verifier (`-e2vp`):
* support filenames with newlines, and warn about missing files b0de84cb
* opengraph/`--og`: fix viewing textfiles e5a836cb
* up2k.js: fix confusing message when uploading many copies of the same file f1130db1
## other changes
* disable upload deduplication by default a2e0f986
* up2k.js: increase handshake timeout to several minutes because of the dedup changes c5988a04
* copyparty.exe: update to python 3.12.6
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0902-0108 `v1.14.4` another
## recent important news
* [v1.14.3 (2024-08-30)](https://github.com/9001/copyparty/releases/tag/v1.14.3) fixed a bug that was introduced in v1.13.8 (2024-08-13); this bug could lead to **data loss** -- see the v1.14.3 release-notes for details
## bugfixes
* a network glitch could cause the uploader UI to panic d9e95262
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0830-2311 `v1.14.3` important dedup fix
<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2023-07-23)
# important bugfix ☢️
this version fixes a file deduplication bug which was introduced in [v1.13.8](https://github.com/9001/copyparty/releases/tag/v1.13.8), released 2024-08-13
its worst-case outcome is **loss of data** in the following scenario:
* someone uploads a file into a folder where that filename is already taken, but the file contents are different, and the server already has a copy of that new file elsewhere under a different name
specific example:
* the server has two existing files, `logo.png` and `logo-v2.png`, in the same volume but not necessarily in the same folder, and those files contain different data
* you have a local copy of `logo-v2.png` on your laptop, but your local filename is `logo.png`
* you upload your local `logo.png` onto the server, into the same folder as the server's `logo.png`
* because the files contain different data, the server accidentally replaces the contents of `logo.png` with your version
if you have been using the database feature (globally with `-e2dsa` or volflag `e2ds`), and you suspect you may have hit this bug, then it is a good idea to make a backup of the up2k databases for all your volumes (the files with names starting with `up2k.db`) before restarting copyparty and before you do anything else, especially if you do not have serverlogs from far back in time -- if you have either the databases and/or the serverlogs, then it is possible to identify replaced files with some manual work
you can check if you hit the bug using one of the following two approaches:
* if your OS has the [gnu find](https://linux.die.net/man/1/find) command, do a search for empty files with `find -type f -size 0`
* using copyparty (any OS), do the following steps:
* make sure that reindex-on-startup is enabled; either globally with `-e2dsa` or volflag `e2ds`
* then install this new copyparty version
* click the search tab `[🔎]` and type the number `0` into the `maximum MiB` textbox
if you find any empty files with a filename that indicates it was autogenerated to avoid a name collision, for example `logo.png-1725040569.239207-kbt0xteO.png`, and the value of the number after `logo.png` is larger than `1723507200` (unixtime for 2024-08-13), then this indicates that `logo.png` may have been replaced by another upload
if you have the serverlogs from when the original upload of `logo.png` was made, then this can be used to identify the original contents of the file that was replaced, and to look for other copies. Please get in touch on the discord for assistance if necessary
----
## new features
* shares: add revival and expiration extension ad2371f8
* share-owners can revive expired shares for `--shr-rt` minutes (default 1 day)
* ...and extend expiration time by adding 1 minute or 1 hour to the timer
* [sfx customizer](https://github.com/9001/copyparty/blob/hovudstraum/scripts/make-sfx.sh) improvements 03b13e8a
* improved translations stripper
* add more examples
## bugfixes
* the dedup bug 3da62ec2
* tftp: support unmapped root 01233991
## other changes
* copyparty.exe: update to pyinstaller 6.10.0
* textviewer wordwrapping c4e2b0f9
* add logo 7037e736 ee359742
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0823-2307 `v1.14.2` bing chilling
## new features
* #94 @ultwcz translated the UI to Chinese (thx!) 92edea1d
* #84 improvements to [shares](https://github.com/9001/copyparty#shares): 8122dded
* if one or more files are selected for sharing, they are placed into a virtual folder
* more appropriate password UI for accessing protected shares
* human-readable timestamps in shares listing
* u2c (commandline uploader): support multiple exclusion patterns f356faa2
## bugfixes
* remove confusing logmessage when downloading a zerobyte file 9f034d9c
* shares: 7ff46966
* fix crash if the root volume is unmapped
* log-spam on config reload
* password coalescing
* add chrome support
## other changes
* #93 add html IDs to the tabstrip 461f3158
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0819-0014 `v1.14.1` one step forward
[if i turn back now, then this will always follow... one step forward, forward](https://youtu.be/xe3Wkzc0O3k?t=27)
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2023-07-23)
## new features
* #92 users can change their own passwords 83fb569d 00da7440
* this feature is default-disabled; see [readme](https://github.com/9001/copyparty#user-changeable-passwords)
* #84 share files/folders by creating a temporary url 7c2beba5
* inspired by other file servers; click the share-button to create a link like `example.com/share/enkz8g374o8g`
* primary usecase is to sneak past authentication services (see issue description)
* the create-share UI has options to accept uploads into the share, and/or set expiration time
* this feature is default-disabled; see [readme](https://github.com/9001/copyparty#shares)
## bugfixes
* #93 fixes for vproxy / location-based / not-vhost-based reverse-proxying 0b46b1a6
* using `--rp-loc` to reverse-proxy from a subfolder made some UI stuff break
* listening on unix-sockets: 687df2fa
* fix `x-forwarded-for` support, and avoid a possible container-specific collision
* new syntax which allows setting unix-permissions and unix-group
* `-i unix:770:www:/tmp/party.sock` (see `--help-bind` for more examples)
* using relocation hooks (introduced in previous ver) could cause dedup issues c8f4aeae b0af4b37
* custom fonts using `@import` css statements 5a62cb48
* invert volume scrollwheel 7d8d9438
## other changes
* changed the button colors in theme 2 (pm-monokai) from red to yellow 5153db6b
* the red buttons look better, but are too confusing because usually red means off
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0813-0008 `v1.13.8` hook into place
## new features
* #86 intentional side-effects from hooks 6c94a63f
* use hooks (plugins) to conditionally move uploads into another folder depending on filename, extension, uploader ip/name, file contents, ...
* hooks can create additional files and tell copyparty to index them immediately, or delete an existing file based on some condition
* only one example so far though, [reloc-by-ext](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#before-upload) which was a feature-request to dodge [sharex#3992](https://github.com/ShareX/ShareX/issues/3992)
* listen on unix-sockets ee9aad82
* `-i unix:/tmp/party.sock` stops listening on TCP ports entirely, and only listens on that unix-socket
* can be combined with regular sockets, `-i 127.0.0.1,unix:/tmp/a.sock`
* kinda buggy for now (need to `--xff-src=any` and doesn't let you set socket-perms yet), will be fixed in next ver
* makes it 10% faster, but more importantly offers tighter access control behind reverse-proxies
* inspired by https://www.oligo.security/blog/0-0-0-0-day-exploiting-localhost-apis-from-the-browser
* up2k stitching:
* more optimal stitch sizes for max throughput across connections c862ec1b
* improve fat32 compatibility 373194c3
* new option `--js-other` to load custom javascript dbd42bc6
* `--js-browser` affects the filebrowser page, `--js-other` does all the others
* endless possibilities, such as [adding a login-banner](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/banner.js) which [looks like this](https://github.com/user-attachments/assets/8ae8e087-b209-449c-b08d-74e040f0284b)
* list detected optional dependencies on startup 3db117d8
* hopefully reduces the guesswork / jank factor by a tiny bit
## bugfixes
* up2k stitching:
* put the request headers on a diet so they fit through more reverse-proxies 0da719f4
* fix deadlock on s390x (IBM mainframes) 250c8c56
## other changes
* add flags to disengage [features](https://github.com/9001/copyparty/tree/hovudstraum#feature-chickenbits) and [dependencies](https://github.com/9001/copyparty/tree/hovudstraum#dependency-chickenbits) in case they cause trouble 72361c99
* optimizations
* 6% faster on average d5c9c8eb
* docker: reduce ram usage 98ffaadf
* python2: reduce ram usage ebb19818
* docker: add [portainer howto](https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/docker/portainer.md) e136231c
* update deps ca001c85
* pyftpdlib 1.5.10
* copyparty.exe: python 3.12.5
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0729-2028 `v1.13.6` not that big
## new features
* up2k.js: set clientside timeouts on http connections during upload 85e54980
* some reverse-proxy setups could cause uploads to hang indefinitely by eating requests; should recover nicely now
* audio-player shows statustext while loading 662541c6
* [bsod theme](https://github.com/9001/copyparty/tree/hovudstraum/contrib/themes) [(live demo)](https://cd.ocv.me/c/) 15ddcf53
## bugfixes
* fix bugs in the [long-distance upload optimizations](https://github.com/9001/copyparty/releases/tag/v1.13.5) in the previous version:
* up2k.js didn't necessarily use the expected chunksize when stitching 225bd80e
* u2c (commandline uploader): 8916bce3
* use the correct chunksize instead of overshooting like crazy
* could crash on exit if `-z` was enabled (so basically harmless)
* the "time spent uploading" statustext that was printed on exit could multiply by `-j` and exceed walltime
* misc ux 9bb6e0dc
* don't accept hotkeys until it's safe to do so
* improve messages regarding the [firefox crash](https://bugzilla.mozilla.org/show_bug.cgi?id=1790500)
* keep more console logs in memory (easier to debug)
* fix wordwrap in messageboxes on firefox a19a0fa9
## other changes
* changed the `xm` / "on message" [hook examples](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#on-message) to reject users without write-access 99edba4f
* docker images were rebuilt on 2024-08-02, 23:30 UTC with new optimizations: 98ffaadf
* 😃 RAM usage decreased by `5-6 MiB` for most flavors; `10 MiB` for dj/iv
* 😕 image size grew by `4 MiB` (min), `6 MiB` (ac/im/iv), `9 MiB` (dj)
* 😃 startup time reduced to about half
* and avoids a deadlock on IBM mainframes
* updated comparison to other software 6b54972e
* `hfs2` is dead, `hfs3` and `filebrowser` improved
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0722-2323 `v1.13.5` american sized
## new features
* long-distance uploads are now **twice as fast** on average 132a8350
* boost tcp windowsize scaling by stitching together smaller chunks into bigger chonks so they fly better across the atlantic
* i'm not kidding, on the two routes we've tested this on we gained 1.6x / 160% (from US-West to Finland) and **2.6x / 260%** (Norway to US-East)
* files that are between 4 MiB and 256 MiB see the biggest improvement; 70% faster <= 768 MiB, 40% <= 1.5 GiB, 10% <= 6G
* if this turns out to be buggy, disable it serverside with `--u2sz 1,1,1` or clientside in the browser-ui: `[⚙️]` -> `up2k switches` -> change `64` to `1`
* u2c.py (CLI uploader): support stitching (☝️) + print a summary with hashing and upload speeds 987bce21
* video files can play as audio 53f1e3c9
* audio is extracted serverside to avoid wasting bandwidth
* extraction is lossy (converted to opus or mp3 depending on browser)
* togglebutton `🎧` in the gridview toolbar to enable/disable
* new hook: [into-the-cache-it-goes.py](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#after-upload) d26a944d
* avoids a cloudflare bug (race condition?) where it will send truncated files to visitors on the very first load if several people simultaneously access a file that hasn't been viewed before
## bugfixes
* inline markdown/logues rendered black-on-black in firefox 54 and some other browsers from 2017 and older eeef8091
* unintuitive folder thumbnail selection if folder contains both `Cover.jpg` and `cover.jpg` f955d2bd
* the gridview toolbar got undocked after viewing a pic/vid dc449bf8
## other changes
* #90 recommend rclone in favor of davfs2 ef0ecf87
* improved some error messages e565ad5f
* added helptext exporters to generate the online [html](https://ocv.me/copyparty/helptext.html) and [txt](https://ocv.me/copyparty/helptext.txt) editions 59533990
* mention that cloudflare is incompatible with uploading files larger than 383.9 GiB
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0716-0457 `v1.13.4` descript.ion
## new features
* "medialinks"; instead of the usual hotlink, the basic-uploader (as used by sharex and such) can return a link that opens the file in the media viewer c9281f89
* enable for all uploads with volflag `medialinks`, or just for one upload by adding `?media` to the post url
* thumbnails are now fully compatible with dirkeys/filekeys 52e06226
* `--th-covers` will respect filename order, selecting the first matching filename as the folder thumbnail 1cdb1702
* new hook: [bittorrent downloader](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#on-message) bd3b3863 803e1565
* hooks: d749683d
* can be restricted to only run when user has specific permissions
* user permissions are also included in the json message to the hook
* new syntax to prepend args to the hook's command
* (all this will be better documented after some additional upcoming hook-related features, see `--help-hooks` for now)
* support `descript.ion` usenet metadata; will parse and render into directory listings when possible 927c3bce
* directory listings are now 2% slower, eh who's keeping count anyways
* tftp-server: 45259251
* improved support for buggy clients
* improved ipv6 support, especially on macos
* improved robustness on unreliable networks
* #85 new option `--gsel` to default-enable the client setting to select files by ctrl-clicking them in the grid 9a87ee2f
* music player: set audio volume by scrollwheel 36d6d29a
## bugfixes
* race-the-beam (downloading an unfinished upload) could get interrupted near the end, requiring a manual resume in the browser's download manager to finish f37187a0
* ftp-server: when accessing the root folder of servers without a root folder, it could mention inaccessible folders 84e8e1dd
* ftp-server: uploads will automatically replace existing files if user has delete perms 0a9f4c60
* windows 2000 expects this behavior, otherwise it'll freak out and delete stuff and then not actually upload it, nice
* new option `--ftp-no-ow` restores old default behavior of rejecting upload if target filename exists
* music player:
* stop trying to recover from a corrupted file if the user already fixed it manually 55a011b9
* support downloading the currently playing song regardless of current folder c06aa683
* music player preloader: db6059e1
* stop searching after 5 folders of nothing
* don't crash playback by walking into error-pages
* `--og` (rich discord embeds) was incompatible with viewing markdown docs d75a2c77
* `--cgen` (configfile generator) much less jank d5de3f2f
## other changes
* mention that HTTP/2 is still usually slower than HTTP/1.1 dfe7f1d9
* give up much sooner if a client is supposed to send a request body but isn't c549f367
* support running copyparty as a server on windows 2000 and winXP 8c73e0cb 2fd12a83
* updated deps 6e58514b
* copyparty.exe: python 3.12, pillow 10.4, pyinstaller 6.9
* dompurify 3.1.6
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0601-2324 `v1.13.3` 700+
@@ -626,7 +1037,7 @@ dnf install https://ocv.me/copyparty/fedora/39/python3-copyparty.fc39.noarch.rpm
## other changes
* improved [systemd example](https://github.com/9001/copyparty/tree/hovudstraum/contrib/systemd) with hardening and a better example config
* logfiles are flushed for every line written; can be disabled with `--no-logflush` for ~3% more performance best-case
* iphones probably won't broadcast cover-art to car stereos over bluetooth anymore since the thingamajig in iOS that's in charge of that doesn't have cookie-access, and strapping in the auth is too funky so let's stop doing that b7723ac245b8b3e38d6410891ef1aa92d4772114
* iphones probably won't broadcast cover-art to car stereos over bluetooth anymore since the thingamajig in iOS that's in charge of that doesn't have cookie-access, and strapping in the auth is too funky so let's stop doing that b7723ac2
* can be remedied by enabling filekeys and granting unauthenticated people access that way, but that's too much effort for anyone to bother with I'm sure
@@ -895,12 +1306,12 @@ okay, i swear this is the last version for weeks! probably
* [r0c is much better](https://github.com/9001/r0c) than this joke
## bugfixes
* 163e3fce46122d64bf824762b6733ff2c3551ba5 the `x-forwarded-for` header was ignored if the nearest reverse-proxy is not asking from 127.0.0.1, which broke client IPs in containerized deployments
* 163e3fce the `x-forwarded-for` header was ignored if the nearest reverse-proxy is not asking from 127.0.0.1, which broke client IPs in containerized deployments
* the serverlog will now explain how to trust the reverse-proxy to provide client IPs, but basically,
* `--xff-hdr` specifies which header to read the client's real ip from
* `--xff-src` is an allowlist of IP-addresses to trust that header from
* a62f744a187bc9f821b540e8bb4e0b9a67bd01c8 if copyparty was started while an external HDD was not connected, and that volume's index was stored elsewhere, then the index would get wiped (since all the files are gone)
* 3b8f66c0d5c27a68841814ec06f1758f146a5ff5 javascript could crash while uploading from a very unreliable internet connection
* a62f744a if copyparty was started while an external HDD was not connected, and that volume's index was stored elsewhere, then the index would get wiped (since all the files are gone)
* 3b8f66c0 javascript could crash while uploading from a very unreliable internet connection
## other changes
* copyparty.exe: updated pillow to 10.0.1 which fixes the webp cve
@@ -946,7 +1357,7 @@ hello! it's been a while, an entire day even...
* default compression levels are gz:3, bz2:2, xz:1; override with `?tar=gz:9`
# bugfixes
* c1efd227b7377144a5760bc6cff64f4e87b626d9 symlink-deduplicated files got indexed with the wrong last-modified timestamp
* c1efd227 symlink-deduplicated files got indexed with the wrong last-modified timestamp
* mostly inconsequential; would cause the dupe's uploader-ip to be forgotten on the next server restart since it would reindex to "fix" the timestamp
* when linking [a search query](https://a.ocv.me/pub/#q=tags%20like%20soundsho*) it loads the results faster
@@ -961,12 +1372,12 @@ hello! it's been a while, an entire day even...
## new features
* iPhones and iPads are now able to...
* 9986136dfb2364edb35aa9fbb87410641c6d6af3 play entire albums while the screen is off without the music randomly stopping
* 9986136d play entire albums while the screen is off without the music randomly stopping
* apple keeps breaking AudioContext in new and interesting ways; time to give up (no more equalizer)
* 1c0d978979a703edeb792e552b18d3b7695b2d90 perform search queries and execute js code
* 1c0d9789 perform search queries and execute js code
* by translating [smart-quotes](https://stackoverflow.com/questions/48678359/ios-11-safari-html-disable-smart-punctuation) into regular `'` and `"` characters
* python 3.12 support
* technically a bugfix since it was added [a year ago](https://github.com/9001/copyparty/commit/32e22dfe84d5e0b13914b4d0e15c1b8c9725a76d) way before the first py3.12 alpha was released but turns out i botched it, oh well
* technically a bugfix since it was added [a year ago](https://github.com/9001/copyparty/commit/32e22dfe) way before the first py3.12 alpha was released but turns out i botched it, oh well
* filter error messages so they never include the filesystem path where copyparty's python files reside
* print more context in server logs if someone hits an unexpected permission-denied
@@ -1187,8 +1598,8 @@ Thanks for flying copyparty! And especially if you decide to continue doing so :
```bash
(gzip -dc access.log.*.gz; cat access.log) | sed -r 's/" [0-9]+ .*//' | grep -E 'cpr/.*%2[^0]' | grep -vF data:image/svg
```
* 77f1e5144455eb946db7368792ea11c934f0f6da fixes an extremely unlikely race-condition (see the commit for details)
* 8f59afb1593a75b8ce8c91ceee304097a07aea6e fixes another race-condition which is a bit worse:
* 77f1e514 fixes an extremely unlikely race-condition (see the commit for details)
* 8f59afb1 fixes another race-condition which is a bit worse:
* the unpost feature could collide with other database activity, with the worst-case outcome being aborted batch operations, for example a directory move or a batch-rename which stops halfways
----
@@ -1362,7 +1773,7 @@ don't get excited! nothing new and revolutionary, but `xvol` and `xdev` changed
# 2023-0426-2300 `v1.6.15` unexpected boost
## new features
* 30% faster folder listings due to [the very last thing](https://github.com/9001/copyparty/commit/55c74ad164633a0a64dceb51f7f534da0422cbb5) i'd ever expect to be a bottleneck, [thx perf](https://docs.python.org/3.12/howto/perf_profiling.html)
* 30% faster folder listings due to [the very last thing](https://github.com/9001/copyparty/commit/55c74ad1) i'd ever expect to be a bottleneck, [thx perf](https://docs.python.org/3.12/howto/perf_profiling.html)
* option to see the lastmod timestamps of symlinks instead of the target files
* makes the turbo mode of [u2cli, the commandline uploader and folder-sync tool](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) more turbo since copyparty dedupes uploads by symlinking to an existing copy and the symlink is stamped with the deduped file's lastmod
* **webdav:** enabled by default (because rclone will want this), can be disabled with arg `--dav-rt` or volflag `davrt`
@@ -1566,7 +1977,7 @@ don't get excited! nothing new and revolutionary, but `xvol` and `xdev` changed
the commandline up2k upload / filesearch client, now as a standalone windows exe
* based on python 3.7 so it runs on 32bit windows7 or anything newer
* *no https support* (saves space + the python3.7 openssl is getting old)
* built from b39ff92f34e3fca389c78109d20d5454af761f8e so it can do long filepaths and mojibake
* built from b39ff92f so it can do long filepaths and mojibake
----
@@ -1773,7 +2184,7 @@ but nothing is affected (that i know of):
* tar/zip-download of hidden folders
* unpost filtering was buggy for non-ascii characters
* moving a deduplicated file on a volume where deduplication was since disabled
* improved the [linux 6.0.16](https://utcc.utoronto.ca/~cks/space/blog/linux/KernelBindBugIn6016) kernel bug [workaround](https://github.com/9001/copyparty/commit/9065226c3d634a9fc15b14a768116158bc1761ad) because there is similar funk in 5.x
* improved the [linux 6.0.16](https://utcc.utoronto.ca/~cks/space/blog/linux/KernelBindBugIn6016) kernel bug [workaround](https://github.com/9001/copyparty/commit/9065226c) because there is similar funk in 5.x
* add custom text selection colors because chrome is currently broken on fedora
* blockdevs (`/dev/nvme0n1`) couldn't be downloaded as files
* misc fixes for location-based reverse-proxying
@@ -1802,7 +2213,7 @@ hello from warsaw airport (goodbye japan ;_;)
* browser ui didn't allow specifying number of threads for file search
* dont panic if a digit key is pressed while viewing an image
* workaround [linux kernel bug](https://utcc.utoronto.ca/~cks/space/blog/linux/KernelBindBugIn6016) causing log spam on dualstack
* ~~related issue (also mostly harmless) will be fixed next release 010770684db95bece206943768621f2c7c27bace~~
* ~~related issue (also mostly harmless) will be fixed next release 01077068~~
* they fixed it in linux 6.1 so these workarounds will be gone too
@@ -2768,7 +3179,7 @@ fixed another dumdum, sorry for the spam
* the ftp server is not compatible with python 3.12 (releasing october 2023)
* will be fixed in a [future version of pyftpdlib](https://github.com/giampaolo/pyftpdlib/issues/560)
the sfx was built from https://github.com/9001/copyparty/commit/39e7a7a2311ab8da43b2a9a18ae39d06202105e3
the sfx was built from https://github.com/9001/copyparty/commit/39e7a7a2
@@ -3549,7 +3960,7 @@ we did it reddit 👉😎👉
* latest gzip edition of the sfx: [v0.11.18](https://github.com/9001/copyparty/releases/tag/v0.11.18)
* if upgrading from v0.11.x or before, see [v0.12.4](https://github.com/9001/copyparty/releases/tag/v0.12.4)
note: `copyparty-sfx.py` is https://github.com/9001/copyparty/commit/5955940b82adddb7149125a60463aba22f1c8c31 which fixes upload eta
note: `copyparty-sfx.py` is https://github.com/9001/copyparty/commit/5955940b which fixes upload eta
## new features
* provide password using basic-authentication
@@ -5019,7 +5430,7 @@ nothing really important happened since [v0.11.6](https://github.com/9001/copypa
* this release fixes a missing permission check which could allow users to download write-only folders
* this bug was introduced 19 days ago, in `v0.10.17`
* the requirement to be affected is write-only folders mounted within readable folders
* and the worst part is there was a unit-test exactly for this, https://github.com/9001/copyparty/commit/273ca0c8da0d94f9d06ca16bd86c0301d9d06455 way overdue
* and the worst part is there was a unit-test exactly for this, https://github.com/9001/copyparty/commit/273ca0c8 way overdue
* also fixes minor bugs introduced in `v0.11.1`
* this version is the same as `v0.11.5` on pypi
@@ -5203,8 +5614,8 @@ in other news, minor ui tweaks:
* a few lightmode adjustments
* less cpu usage? should be
`copyparty-sfx.py` (latest) made from c5db7c1a0c8f6ab23138ad7ea7642a6260e7da9b (v0.10.15-15) fixes `-j` (multiprocessing/high-performance)
`copyparty-sfx-5a579db.py` (old) made from 5a579dba52e46c202b79c3d80c3b1c996c7b2e4a (v0.10.15-5) reduced the size
`copyparty-sfx.py` (latest) made from c5db7c1a (v0.10.15-15) fixes `-j` (multiprocessing/high-performance)
`copyparty-sfx-5a579db.py` (old) made from 5a579dba (v0.10.15-5) reduced the size
@@ -5517,7 +5928,7 @@ and i just realized i never added runtime tag scanning so copyparty will have to
use `-e2dsa` and `-e2ts` to enable the media tag features globally, or enable/disable them per-volume (see readme)
**NOTE:** older fuse clients (from before 5e3775c1afc9438f9930080a9b8542a063ba1765 / older than v0.8.0) must be upgraded for this copyparty release, however the new client still supports connecting to old servers
**NOTE:** older fuse clients (from before 5e3775c1 / older than v0.8.0) must be upgraded for this copyparty release, however the new client still supports connecting to old servers
other changes include
* support chunked PUT requests from curl
@@ -5737,7 +6148,7 @@ valvrave-stop.jpg
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2020-0818-1822 `v0.5.2` da setter vi punktum
full disclaimer: `copyparty-sfx.py` was built using `sfx.py` from ~~82e568d4c9f25bfdfd1bf5166f0ebedf058723ee~~ f550a8171d298992f4ef569d2fc99a6037a44ea8
full disclaimer: `copyparty-sfx.py` was built using `sfx.py` from ~~82e568d4~~ f550a817

View File

@@ -1,7 +1,7 @@
## devnotes toc
* top
* [future plans](#future-plans) - some improvement ideas
* [future ideas](#future-ideas) - list of dreams which will probably never happen
* [design](#design)
* [up2k](#up2k) - quick outline of the up2k protocol
* [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
@@ -12,6 +12,8 @@
* [write](#write)
* [admin](#admin)
* [general](#general)
* [event hooks](#event-hooks) - on writing your own [hooks](../README.md#event-hooks)
* [hook effects](#hook-effects) - hooks can cause intentional side-effects
* [assumptions](#assumptions)
* [mdns](#mdns)
* [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
@@ -25,9 +27,9 @@
* [discarded ideas](#discarded-ideas)
# future plans
# future ideas
some improvement ideas
list of dreams which will probably never happen
* the JS is a mess -- a ~~preact~~ rewrite would be nice
* preferably without build dependencies like webpack/babel/node.js, maybe a python thing to assemble js files into main.js
@@ -55,8 +57,8 @@ quick outline of the up2k protocol, see [uploading](https://github.com/9001/cop
* server creates the `wark`, an identifier for this upload
* `sha512( salt + filesize + chunk_hashes )`
* and a sparse file is created for the chunks to drop into
* client uploads each chunk
* header entries for the chunk-hash and wark
* client sends a series of POSTs, with one or more consecutive chunks in each
* header entries for the chunk-hashes (comma-separated) and wark
* server writes chunks into place based on the hash
* client does another handshake with the hashlist; server replies with OK or a list of chunks to reupload
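a quick sketch of the wark formula above, in python -- illustrative only; the exact digest encoding and truncation that copyparty uses may differ:

```python
import hashlib

def wark(salt: str, filesize: int, chunk_hashes: list[str]) -> str:
    # sha512 over salt + filesize + all chunk-hashes, as outlined above;
    # the real encoding/truncation of the digest is an implementation detail
    h = hashlib.sha512()
    h.update(salt.encode("utf-8"))
    h.update(str(filesize).encode("utf-8"))
    for ch in chunk_hashes:
        h.update(ch.encode("utf-8"))
    return h.hexdigest()
```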
@@ -137,6 +139,7 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?tar&w` | pregenerate webp thumbnails |
| GET | `?tar&j` | pregenerate jpg thumbnails |
| GET | `?tar&p` | pregenerate audio waveforms |
| GET | `?shares` | list your shared files/folders |
| GET | `?ups` | show recent uploads from your IP |
| GET | `?ups&filter=f` | ...where URL contains `f` |
| GET | `?mime=foo` | specify return mimetype `foo` |
@@ -173,6 +176,9 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| mPOST | `?media` | `f=FILE` | ...and return medialink (not hotlink) |
| mPOST | | `act=mkdir`, `name=foo` | create directory `foo` at URL |
| POST | `?delete` | | delete URL recursively |
| POST | `?eshare=rm` | | stop sharing a file/folder |
| POST | `?eshare=3` | | set expiration to 3 minutes |
| jPOST | `?share` | (complicated) | create temp URL for file/folder |
| jPOST | `?delete` | `["/foo","/bar"]` | delete `/foo` and `/bar` recursively |
| uPOST | | `msg=foo` | send message `foo` into server log |
| mPOST | | `act=tput`, `body=TEXT` | overwrite markdown document at URL |
@@ -204,6 +210,32 @@ upload modifiers:
| GET | `?pw=x` | logout |
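as a rough example, a couple of these endpoints called from python; the hostname, password and response handling here are assumptions, see the tables above for the full list of parameters:

```python
import requests

BASE = "http://127.0.0.1:3923"  # hypothetical server
PW = "hunter2"                  # hypothetical password, passed as the &pw= url-param

# list recent uploads from your IP (GET ?ups)
print(requests.get(f"{BASE}/?ups&pw={PW}").text)

# delete /foo and /bar recursively (jPOST ?delete with a json body)
r = requests.post(f"{BASE}/?delete&pw={PW}", json=["/foo", "/bar"])
print(r.status_code, r.text)
```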
# event hooks
on writing your own [hooks](../README.md#event-hooks)
## hook effects
hooks can cause intentional side-effects, such as redirecting an upload into another location, or creating+indexing additional files, or deleting existing files, by returning json on stdout
* `reloc` can redirect uploads before/after uploading has finished, based on filename, extension, file contents, uploader ip/name etc.
* `idx` informs copyparty about a new file to index as a consequence of this upload
* `del` tells copyparty to delete an unrelated file by vpath
for these to take effect, the hook must be defined with the `c1` flag; see example [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py)
a subset of effect types are available for a subset of hook types,
* most hook types (xbu/xau/xbr/xar/xbd/xad/xm) support `idx` and `del` for all http protocols (up2k / basic-uploader / webdav), but not ftp/tftp/smb
* most hook types will abort/reject the action if the hook returns nonzero, assuming flag `c` is given, see examples [reject-extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) and [reject-mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py)
* `xbu` supports `reloc` for all http protocols (up2k / basic-uploader / webdav), but not ftp/tftp/smb
* `xau` supports `reloc` for basic-uploader / webdav only, not up2k or ftp/tftp/smb
* so clients like sharex are supported, but not dragdrop into browser
to trigger indexing of files `/foo/1.txt` and `/foo/bar/2.txt`, a hook can `print(json.dumps({"idx":{"vp":["/foo/1.txt","/foo/bar/2.txt"]}}))` (and replace "idx" with "del" to delete instead)
* note: paths starting with `/` are absolute URLs, but you can also do `../3.txt` relative to the destination folder of each uploaded file (a minimal hook doing exactly this is sketched below)
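a minimal sketch of such a hook, based directly on the `idx` example above; the argv handling is an assumption, see [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py) and the rest of `bin/hooks` for the real conventions:

```python
#!/usr/bin/env python3
# sketch of an upload hook (registered with the c1 flag) which asks copyparty
# to index two additional files; the paths are purely illustrative
import json
import sys

def main():
    uploaded_path = sys.argv[1] if sys.argv[1:] else ""  # path of the upload that triggered us (assumed)
    print(json.dumps({"idx": {"vp": ["/foo/1.txt", "/foo/bar/2.txt"]}}))

if __name__ == "__main__":
    main()
```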
# assumptions
## mdns
@@ -327,10 +359,6 @@ can be reproduced with `--no-sendfile --s-wr-sz 8192 --s-wr-slp 0.3 --rsp-slp 6`
* remove brokers / multiprocessing stuff; https://github.com/9001/copyparty/tree/no-broker
* reduce the nesting / indirections in `HttpCli` / `httpcli.py`
* nearly zero benefit from stuff like replacing all the `self.conn.hsrv` with a local `hsrv` variable
* reduce up2k roundtrips
* start from a chunk index and just go
* terminate client on bad data
* not worth the effort, just throw enough connections at it
* single sha512 across all up2k chunks?
* crypto.subtle cannot into streaming, would have to use hashwasm, expensive
* separate sqlite table per tag

View File

@@ -0,0 +1,45 @@
the following setup appears to work (copyparty starts, accepts uploads, is able to persist config)
tested on debian 12 using [portainer-ce](https://docs.portainer.io/start/install-ce/server/docker/linux) with [docker-ce](https://docs.docker.com/engine/install/debian/) as root (not rootless)
before making the container, first `mkdir /etc/copyparty /srv/pub` which will be bind-mounted into the container
> both `/etc/copyparty` and `/srv/pub` are examples; you can change them if you'd like
put your copyparty config files directly into `/etc/copyparty` and the files to share inside `/srv/pub`
on first startup, copyparty will create a subfolder inside `/etc/copyparty` called `copyparty` where it puts some runtime state; for example replacing `/etc/copyparty/copyparty/cert.pem` with another TLS certificate is a quick and dirty way to get valid HTTPS (if you really want copyparty to handle that and not a reverse-proxy)
## in portainer:
```
environments -> local -> containers -> add container:
name = copyparty-ac
registry = docker hub
image = copyparty/ac
always pull = no
manual network port publishing:
3923 to 3923 [TCP]
advanced -> command & logging:
console = interactive & tty
advanced -> volumes -> map additional volume:
container = /cfg [Bind]
host = /etc/copyparty [Writable]
advanced -> volumes -> map additional volume:
container = /w [Bind]
host = /srv/pub [Writable]
```
notes:
* `/cfg` is where copyparty expects to find its config files; `/etc/copyparty` is just an example mapping to that
* `/w` is where copyparty expects to find the folder to share; `/srv/pub` is just an example mapping to that
* the volumes must be bind-mounts to avoid permission issues (or so the theory goes)

View File

@@ -16,7 +16,7 @@ open up notepad and save the following as `c:\users\you\documents\party.conf` (f
```yaml
[global]
lo: ~/logs/cpp-%Y-%m%d.xz # log to c:\users\you\logs\
e2dsa, e2ts, no-dedup, z # sets 4 flags; see expl.
e2dsa, e2ts, z # sets 3 flags; see explanation
p: 80, 443 # listen on ports 80 and 443, not 3923
theme: 2 # default theme: protonmail-monokai
lang: nor # default language: viking
@@ -46,11 +46,10 @@ open up notepad and save the following as `c:\users\you\documents\party.conf` (f
### config explained: [global]
the `[global]` section accepts any config parameters [listed here](https://ocv.me/copyparty/helptext.html), also viewable by running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor`
the `[global]` section accepts any config parameters [listed here](https://ocv.me/copyparty/helptext.html), also viewable by running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts -z -p 80,443 --theme 2 --lang nor`
* `lo: ~/logs/cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2dsa` enables the file indexer, which enables searching and upload-undo
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
* `no-dedup` writes full dupes to disk instead of symlinking, since lots of windows software doesn't handle symlinks well
* but the improved upload speed from `e2dsa` is not affected
* `z` enables zeroconf, making the server available at `http://HOSTNAME.local/` from any other machine in the LAN
* `p: 80,443` listens on the ports `80` and `443` instead of the default `3923`

docs/logo.svg Normal file
View File

@@ -0,0 +1,210 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
width="300mm"
height="207mm"
viewBox="0 0 300 207"
version="1.1"
id="svg1"
inkscape:version="1.3.2 (091e20ef0f, 2023-11-25)"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<title
id="title1">copyparty_logo</title>
<defs
id="defs1">
<linearGradient
inkscape:collect="always"
id="linearGradient1">
<stop
style="stop-color:#ffcc55;stop-opacity:1"
offset="0"
id="stop1" />
<stop
style="stop-color:#ffcc00;stop-opacity:1"
offset="0.2"
id="stop2" />
<stop
style="stop-color:#ff8800;stop-opacity:1"
offset="1"
id="stop3" />
</linearGradient>
<linearGradient
inkscape:collect="always"
xlink:href="#linearGradient1"
id="linearGradient2"
x1="15"
y1="15"
x2="15"
y2="143"
gradientUnits="userSpaceOnUse" />
</defs>
<metadata
id="metadata5">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title>copyparty_logo</dc:title>
<dc:source>github.com/9001/copyparty</dc:source>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:groupmode="layer"
id="layer1"
inkscape:label="kassett">
<rect
style="fill:#333333"
id="rect1"
width="300"
height="205"
x="0"
y="0"
rx="12"
ry="12" />
<rect
style="fill:url(#linearGradient2)"
id="rect2"
width="270"
height="128"
x="15"
y="15"
rx="8"
ry="8" />
<rect
style="fill:#333333"
id="rect3"
width="172"
height="52"
x="64"
y="72"
rx="26"
ry="26" />
<circle
style="fill:#cccccc"
id="circle1"
cx="91"
cy="98"
r="18" />
<circle
style="fill:#cccccc"
id="circle2"
cx="209"
cy="98"
r="18" />
<path
style="fill:#737373;stroke-width:1px"
d="m 48,207 10,-39 c 1.79,-6.2 5.6,-7.8 12,-8 60,-1 100,-1 160,0 6.4,0.2 10,1.8 12,8 l 10,39 z"
id="path1"
sodipodi:nodetypes="ccccccc" />
</g>
<g
inkscape:groupmode="layer"
id="layer3"
inkscape:label="tekst"
style="display:none">
<text
xml:space="preserve"
style="font-size:38.8056px;line-height:1.25;font-family:Akbar;-inkscape-font-specification:Akbar;letter-spacing:3.70417px;word-spacing:0px;fill:#333333"
x="47.153069"
y="55.548954"
id="text1"><tspan
sodipodi:role="line"
id="tspan1"
x="47.153069"
y="55.548954"
style="-inkscape-font-specification:Akbar"
rotate="0 0">copyparty</tspan></text>
</g>
<g
inkscape:groupmode="layer"
id="layer4"
inkscape:label="stensatt">
<path
d="m 63.5,50.9 q -0.85,0.93 -4.73,2.3 -3.6,1.3 -4.4,1.3 -3.3,0 -5.1,-2.1 -1.75,-2 -1.75,-5.36 0,-4.6 3.76,-7.64 3.3,-2.7 7.3,-2.7 0.4,0 0.93,0.74 0.54,0.7 0.54,1.16 0,2.06 -2.2,2.7 -1.36,0.4 -4.04,1.16 -2.2,1.16 -2.2,4.4 0,3.2 2.9,3.2 0.85,0 0.85,0 0.54,0 1.44,-0.16 1.1,-0.23 2.9,-0.74 1.8,-0.54 2.13,-0.54 0.4,0 1.75,0.6 z"
style="fill:#333333"
id="path11" />
<path
d="m 87.6,45 q 0,4.2 -3.7,6.95 -3.2,2.3 -6.87,2.3 -3.4,0 -6,-2.6 -2.5,-2.6 -2.5,-6 0,-3.6 3.14,-6.64 3.2,-3 6.8,-3 3.5,0 6.3,2.76 2.83,2.76 2.83,6.25 z m -3.4,0.16 q 0,-2.25 -1.75,-3.7 -1.7,-1.5 -4,-1.5 -0.1,0 -1.6,1.6 -1.44,1.55 -2.44,1.55 -0.6,0 -0.8,-0.3 -1.16,2.3 -1.16,3 0,2.25 2.13,3.4 1.6,0.9 3.6,0.9 2,0 3.76,-1.1 2.25,-1.4 2.25,-3.84 z"
style="fill:#333333"
id="path12" />
<path
d="m 112.8,46.8 q 0,2.8 -1.9,4.4 -1.8,1.5 -4.7,1.5 -0.7,0 -2.7,-0.4 -1.9,-0.4 -2.6,-0.4 -2.1,0 -2.1,2.64 0,0.85 0.23,2.6 0.2,1.75 0.2,2.6 0,1.9 -0.77,2.83 -1.44,0 -3,-0.85 -1.46,-9.5 -1.46,-12 0,-3.65 1.75,-8.1 2.37,-6.05 6.45,-6.05 3.7,0 7.3,4.1 3.3,3.84 3.3,7.14 z m -3.8,0.2 q -0.6,-2.2 -2.6,-4.4 -2.3,-2.5 -4.3,-2.5 -1.3,0 -2.33,2.2 -0.9,1.8 -0.9,3.26 0,0.47 0.38,1.24 0.43,0.8 0.85,0.8 1.1,0 3.2,0.3 2.1,0.3 3.2,0.3 0.3,0 1.3,-0.4 1,-0.47 1.3,-0.74 z"
style="fill:#333333"
id="path13" />
<path
d="m 133,40 q -2.1,4.1 -3.2,7 -0.1,0.3 -1.6,4.5 -0.4,1.36 -1,4.2 -0.5,2.83 -1,4.2 -1,2.83 -2.3,2.64 -1.4,-0.2 -1.6,-1.6 0,-0.2 0,-0.5 0,-0.16 0.3,-1.5 1,-5.04 1,-6.44 0,-0.54 -0.1,-0.74 -1.4,-2.44 -4.1,-7.4 -2.7,-4.97 -2.4,-7.7 1.5,-1.36 2.1,-1.36 0.4,0 1.1,0.6 0.6,0.6 0.7,1.1 0.8,6.2 4.9,11.1 1,-1.8 1.8,-4.04 0.5,-1.4 1.6,-4.15 1.9,-4.46 3.4,-4.46 0.2,0 0.4,0.1 0.9,0.3 1.3,2.8 z"
style="fill:#333333"
id="path14" />
<path
d="m 157.5,48 q 0,2.8 -1.9,4.4 -1.8,1.5 -4.7,1.5 -0.7,0 -2.7,-0.4 -1.9,-0.4 -2.6,-0.4 -2,0 -2,2.64 0,0.85 0.2,2.6 0.2,1.75 0.2,2.6 0,1.9 -0.7,2.83 -1.5,0 -3,-0.85 -1.5,-9.5 -1.5,-11.95 0,-3.65 1.8,-8.1 2.3,-6.05 6.4,-6.05 3.7,0 7.2,4.1 3.3,3.84 3.3,7.14 z m -3.8,0.2 q -0.6,-2.2 -2.6,-4.4 -2.3,-2.5 -4.3,-2.5 -1.3,0 -2.3,2.2 -0.9,1.8 -0.9,3.26 0,0.47 0.4,1.24 0.4,0.8 0.8,0.8 1.1,0 3.2,0.3 2.1,0.3 3.2,0.3 0.3,0 1.3,-0.4 1,-0.47 1.3,-0.74 z"
style="fill:#333333"
id="path15" />
<path
d="m 182,53.3 q 0,0.9 -0.6,1.5 -0.6,0.6 -1.4,0.6 -1.6,0 -3,-0.9 -1.4,-0.93 -2.1,-2.3 -0.7,-0.1 -1.5,0.85 -0.9,1.16 -1.1,1.24 -1.2,0.54 -3.9,0.54 -2.2,0 -3.9,-2.44 -1.5,-2.13 -1.5,-4 0,-3.4 3.4,-6.4 3.2,-2.9 6.7,-2.9 0.9,0 1.7,0.6 0.8,0.6 0.8,1.44 0,0.54 -0.4,1.1 2.4,0.9 2.4,2.83 0,0.35 -0.1,1.05 -0.1,0.7 -0.1,1.05 0,0.4 0.1,0.6 0.5,1.3 2.5,3.4 1.9,1.9 1.9,2.2 z m -8.1,-10.1 q -0.4,0 -1.1,-0.1 -0.8,-0.16 -1.1,-0.16 -1.3,0 -3.2,1.94 -1.9,1.94 -1.9,3.3 0,0.8 0.7,1.8 0.9,1.3 2.2,1.3 2.6,0 3.5,-2.9 0.5,-2.6 1,-5.16 z"
style="fill:#333333"
id="path16" />
<path
d="m 203.8,42.4 q -0.4,0.4 -1.5,0.4 -0.9,0 -2.5,-0.3 -1.7,-0.3 -2.5,-0.3 -4.7,0 -5.5,6.9 -0.3,3.1 -0.4,3.3 -0.4,1 -1.7,2.3 h -1.1 q -0.7,-1.2 -1.3,-4.1 -0.6,-2.76 -0.6,-4.27 0,-1.16 0.1,-1.5 0.2,-0.54 1,-0.54 0.3,0 0.6,0.3 0.4,0.3 0.4,0.3 1.9,-3.53 3.1,-4.6 1.8,-1.7 5.1,-1.7 1.4,0 3.6,0.9 2.8,1.16 3.3,2.8 z"
style="fill:#333333"
id="path17" />
<path
d="m 229.5,37.16 q 0.3,0.8 0.3,1.44 0,1.86 -2.4,1.86 -1,0 -3.5,-0.5 -2.5,-0.54 -3.4,-0.54 -1.3,0 -1.5,0.1 -0.4,0.2 -0.4,1.2 0,2.2 0.6,6.9 0.7,5.86 1.6,6.13 -0.4,0.35 -0.4,1.1 -1.2,0.7 -2.6,0.7 -1.4,0 -2,-3.9 -0.2,-1.36 -0.5,-7.76 -0.2,-4.6 -0.8,-5.5 -0.3,-0.47 -4.3,-0.35 -1,0 -1.6,0.1 -0.5,0 -0.3,0 -0.8,0 -1.2,-0.7 -0.5,-1.3 -0.5,-1.4 0,-1.44 4.1,-2 1.6,-0.16 4.7,-0.5 0,-0.85 -0.1,-2.56 0,-1.75 0,-2.6 0,-4.35 2.1,-4.35 0.5,0 1.1,0.6 0.6,0.6 0.6,1.1 v 7.9 q 1.1,1.2 5,1.7 3.9,0.5 5.3,1.86 z"
style="fill:#333333"
id="path18" />
<path
d="m 251.2,40.2 q -2,4.1 -3.2,7 -0.1,0.3 -1.5,4.5 -0.5,1.36 -1,4.2 -0.5,2.83 -1,4.2 -1,2.83 -2.4,2.64 -1.4,-0.2 -1.5,-1.6 -0.1,-0.2 -0.1,-0.5 0,-0.16 0.3,-1.5 1.1,-5.04 1.1,-6.44 0,-0.54 -0.1,-0.74 -1.4,-2.44 -4.1,-7.4 -2.7,-4.97 -2.4,-7.7 1.4,-1.36 2.1,-1.36 0.4,0 1,0.6 0.6,0.6 0.7,1.1 0.9,6.2 4.9,11.1 1,-1.8 1.9,-4.04 0.5,-1.4 1.6,-4.15 1.8,-4.46 3.4,-4.46 0.2,0 0.4,0.1 0.8,0.3 1.2,2.8 z"
style="fill:#333333"
id="path19" />
</g>
<g
inkscape:groupmode="layer"
id="layer5"
inkscape:label="tagger">
<g
id="g1">
<path
id="path4"
style="fill:#333333"
d="m 111.4,83.335 -9.526,5.5 2.5,4.33 9.526,-5.5 z m -33.775,19.5 -9.526,5.5 2.5,4.33 9.526,-5.5 z"
sodipodi:nodetypes="cccccccccc" />
<path
id="path5"
style="fill:#333333"
d="M 88.5,73 V 84 h 5 V 73 Z m 0,39 v 11 h 5 V 112 Z"
sodipodi:nodetypes="cccccccccc" />
<path
id="path6"
style="fill:#333333"
d="m 68.1,87.665 9.526,5.5 2.5,-4.33 -9.526,-5.5 z m 33.775,19.5 9.527,5.5 2.5,-4.33 -9.527,-5.5 z"
sodipodi:nodetypes="cccccccccc" />
</g>
<g
id="g2"
transform="rotate(30,150,318.19)">
<path
id="path7"
style="fill:#333333"
d="m 111.4,83.335 -9.526,5.5 2.5,4.33 9.526,-5.5 z m -33.775,19.5 -9.526,5.5 2.5,4.33 9.526,-5.5 z"
sodipodi:nodetypes="cccccccccc" />
<path
id="path8"
style="fill:#333333"
d="M 88.5,73 V 84 h 5 V 73 Z m 0,39 v 11 h 5 V 112 Z"
sodipodi:nodetypes="cccccccccc" />
<path
id="path9"
style="fill:#333333"
d="m 68.1,87.665 9.526,5.5 2.5,-4.33 -9.526,-5.5 z m 33.775,19.5 9.527,5.5 2.5,-4.33 -9.527,-5.5 z"
sodipodi:nodetypes="cccccccccc" />
</g>
</g>
</svg>


View File

@@ -141,6 +141,9 @@ find -maxdepth 1 -printf '%s %p\n' | sort -n | awk '!/-([0-9a-zA-Z_-]{11})\.(mkv
# unique stacks in a stackdump
f=a; rm -rf stacks; mkdir stacks; grep -E '^#' $f | while IFS= read -r n; do awk -v n="$n" '!$0{o=0} o; $0==n{o=1}' <$f >stacks/f; h=$(sha1sum <stacks/f | cut -c-16); mv stacks/f stacks/$h-"$n"; done ; find stacks/ | sort | uniq -cw24
# find unused css variables
cat browser.css | sed -r 's/(var\()/\n\1/g' | awk '{sub(/:/," ")} $1~/^--/{d[$1]=1} /var\(/{sub(/.*var\(/,"");sub(/\).*/,"");u[$1]=1} END{for (x in u) delete d[x]; for (x in d) print x}' | tr '\n' '|'
##
## sqlite3 stuff
@@ -252,6 +255,9 @@ cat copyparty/httpcli.py | awk '/^[^a-zA-Z0-9]+def / {printf "%s\n%s\n\n", f, pl
# create a folder with symlinks to big files
for d in /usr /var; do find $d -type f -size +30M 2>/dev/null; done | while IFS= read -r x; do ln -s "$x" big/; done
# up2k worst-case testfiles: create 64 GiB (256 x 256 MiB) of sparse files; each file takes 1 MiB disk space; each 1 MiB chunk is globally unique
for f in {0..255}; do echo $f; truncate -s 256M $f; b1=$(printf '%02x' $f); for o in {0..255}; do b2=$(printf '%02x' $o); printf "\x$b1\x$b2" | dd of=$f bs=2 seek=$((o*1024*1024)) conv=notrunc 2>/dev/null; done; done
# py2 on osx
brew install python@2
pip install virtualenv

View File

@@ -63,9 +63,28 @@ add your own translations by using the english or norwegian one from `browser.js
the easy way is to open up and modify `browser.js` in your own installation; depending on how you installed copyparty it might be named `browser.js.gz` instead, in which case just decompress it, restart copyparty, and start editing it anyways
you will be delighted to see inline html in the translation strings; to help prevent syntax errors, there is [a very jank linux script](https://github.com/9001/copyparty/blob/hovudstraum/scripts/tlcheck.sh) which is slightly better than nothing -- just beware the false-positives, so even if it complains it's not necessarily wrong/bad
if you're running `copyparty-sfx.py` then you'll find it at `/tmp/pe-copyparty.1000/copyparty/web` (on linux) or `%TEMP%\pe-copyparty\copyparty\web` (on windows)
* make sure to keep backups of your work religiously! since that location is volatile af
if editing `browser.js` is inconvenient in your setup then you can instead do this:
* add your translation to a separate javascript file (`tl.js`) and make it load before `browser.js` with the help of `--html-head='<script src="/tl.js"></script>'`
* as the page loads, `browser.js` will look for a function named `langmod` so define that function and make it insert your translation into the `Ls` and `LANGS` variables so it'll take effect
## translations (docker-friendly)
if editing `browser.js` is inconvenient in your setup, for example if you're running in docker, then you can instead do this:
* if you have python, go to the `scripts` folder and run `./tl.py fra Français` to generate a `tl.js` which is perfect for translating to French, using the three-letter language code `fra`
* if you do not have python, you can also just grab `tl.js` from the scripts folder, but I'll probably forget to keep that up to date... and then you'll have to find/replace all `"eng"` and `Ls.eng` to your three-letter language code
* put your `tl.js` inside a folder that is being shared by your copyparty, preferably the webroot
* run copyparty with the argument `--html-head='<script src="/tl.js"></script>'`
* if you placed `tl.js` in the webroot then you're all good, but if you put it somewhere else then change `/tl.js` accordingly
* if you are running copyparty with config files, you can do this:
```yaml
[global]
html-head: <script src="/tl.js"></script>
```
you can now edit `tl.js` and press CTRL-SHIFT-R in the browser to see your changes take effect as you go
if you want to contribute your translation back to the project (please do!) then you'll want to...
* grab all of the text inside your `var tl_cpanel = {` and add it to the translations inside `copyparty/web/splash.js` in the repo
* and the text inside your `var tl_browser = {` and add that to the translations inside `copyparty/web/browser.js` in the repo

View File

@@ -20,6 +20,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* 💾 = what copyparty offers as an alternative
* 🔵 = similarities
* ⚠️ = disadvantages (something copyparty does "better")
* 🔥 = hazards
## toc
@@ -37,7 +38,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [another matrix](#another-matrix)
* [reviews](#reviews)
* [copyparty](#copyparty)
* [hfs2](#hfs2)
* [hfs2](#hfs2) 🔥
* [hfs3](#hfs3)
* [nextcloud](#nextcloud)
* [seafile](#seafile)
@@ -83,8 +84,8 @@ the table headers in the matrixes below are the different softwares, with a quic
the softwares,
* `a` = [copyparty](https://github.com/9001/copyparty)
* `b` = [hfs2](https://rejetto.com/hfs/)
* `c` = [hfs3](https://github.com/rejetto/hfs)
* `b` = [hfs2](https://github.com/rejetto/hfs2/) 🔥
* `c` = [hfs3](https://rejetto.com/hfs/)
* `d` = [nextcloud](https://github.com/nextcloud/server)
* `e` = [seafile](https://github.com/haiwen/seafile)
* `f` = [rclone](https://github.com/rclone/rclone), specifically `rclone serve webdav .`
@@ -152,19 +153,20 @@ symbol legend,
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | | | █ | | █ | █ | | █ | |
| download folder as tar | █ | | | | | | | | | | | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ | | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | | █ |
| download folder as tar | █ | | | | | | | | | | | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ | | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | | █ |
| upload acceleration | █ | | | | | | | | █ | | █ | | |
| upload verification | █ | | | █ | █ | | | | █ | | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | | |
| race the beam ("p2p") | █ | | | | | | | | | • | | | |
| CTRL-V from device | █ | | | █ | | | | | | | | | |
| race the beam ("p2p") | █ | | | | | | | | | | | | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ | |
| upload rules | | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ | █ |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | █ | | | | |
@@ -173,6 +175,7 @@ symbol legend,
| ┗ randomize filename | █ | | | | | | | █ | █ | | | | |
| ┗ mimetype reject-list | | | | | | | | | • | | | | • |
| ┗ extension reject-list | | | | | | | | █ | • | | | | • |
| ┗ upload routing | █ | | | | | | | | | | | | |
| checksums provided | | | | █ | █ | | | | █ | | | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ | █ | |
@@ -182,8 +185,13 @@ symbol legend,
* `upload verification` = uploads are checksummed or otherwise confirmed to have been transferred correctly
* `CTRL-V from device` = press CTRL-C in Windows Explorer (or whatever) and paste into the webbrowser to upload it
* `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead
* `upload routing` = depending on filetype / contents / uploader etc., the file can be redirected to another location or otherwise transformed; mitigates limitations such as [sharex#3992](https://github.com/ShareX/ShareX/issues/3992)
* copyparty example: [reloc-by-ext](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#before-upload)
* `checksums provided` = when downloading a file from the server, the file's checksum is provided for verification client-side
* `cloud storage backend` = able to serve files from (and write to) s3 or similar cloud services; `` means the software can do this with some help from `rclone mount` as a bridge
@@ -213,7 +221,7 @@ symbol legend,
| serve sftp (ssh) | | | | | | █ | | | | | | █ | █ |
| serve smb/cifs | | | | | | █ | | | | | | | |
| serve dlna | | | | | | █ | | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | |
| zeroconf | █ | | | | | | | | | | | | █ |
| supports netscape 4 | | | | | | █ | | | | | • | | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | | |
@@ -243,7 +251,7 @@ symbol legend,
| listen multiple ports | █ | | | | | | | | | | | █ | |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ | |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ | |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | | • | | • |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | | • | | • |
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
@@ -266,9 +274,9 @@ symbol legend,
| per-folder permissions | | | | █ | █ | | █ | | █ | █ | | █ | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | | █ |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | | █ |
| unmap subfolders | █ | | | | | | █ | | | █ | | • | |
| unmap subfolders | █ | | | | | | █ | | | █ | | • | |
| index.html blocks list | | | | | | | █ | | | • | | | |
| write-only folders | █ | | | | | | | | | | █ | █ | |
| write-only folders | █ | | | | | | | | | | █ | █ | |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ | |
@@ -298,6 +306,7 @@ symbol legend,
* `file action event hooks` = run script before/after upload, move, rename, ...
* `one-way folder sync` = like rsync, optionally deleting unexpected files at target
* `full sync` = stateful, dropbox-like sync
* `speed throttle` = rate limiting (per ip, per user, per connection, anything like that)
* `curl-friendly ls` = returns a [sortable plaintext folder listing](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png) when curled
* `curl-friendly upload` = uploading with curl is just `curl -T some.bin http://.../`
* `a`/copyparty remarks:
@@ -323,14 +332,14 @@ symbol legend,
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | | █ |
| themes | █ | █ | | █ | | | | | █ | | | | |
| themes | █ | █ | | █ | | | | | █ | | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | | |
| multi-column sorting | █ | | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | | █ |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | | █ |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ audio spectrograms | █ | | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | | █ |
| audio player | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ gapless playback | █ | | | | | | | | • | | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | | |
@@ -348,16 +357,16 @@ symbol legend,
| search by custom parser | █ | | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | | █ |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | | █ |
| markdown viewer | █ | | | | █ | | | | █ | | | | █ |
| markdown editor | █ | | | | █ | | | | █ | | | | █ |
| readme.md in listing | █ | | | █ | | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | | █ |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | █ |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | █ |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | | █ |
* `single-page app` = multitasking; possible to continue navigating while uploading
@@ -367,8 +376,12 @@ symbol legend,
* `undo recent uploads` = accounts without delete permissions have a time window where they can undo their own uploads
* `a`/copyparty has teeny-tiny skips playing gapless albums depending on audio codec (opus best)
* `b`/hfs2 has a very basic directory tree view, not showing sibling folders
* `c`/hfs3 remarks:
* audio playback does not continue into next song
* `f`/rclone can do some file management (mkdir, rename, delete) when hosting through webdav
* `j`/filebrowser has a plaintext viewer/editor
* `j`/filebrowser remarks:
* audio playback does not continue into next song
* plaintext viewer/editor
* `k`/filegator directory tree is a modal window
@@ -424,6 +437,7 @@ symbol legend,
* 💾 are what copyparty offers as an alternative
* 🔵 are similarities
* ⚠️ are disadvantages (something copyparty does "better")
* 🔥 are hazards
## [copyparty](https://github.com/9001/copyparty)
* resumable uploads which are verified server-side
@@ -431,8 +445,9 @@ symbol legend,
* both of the above are surprisingly uncommon features
* very cross-platform (python, no dependencies)
## [hfs2](https://rejetto.com/hfs/)
* the OG, the legend
## [hfs2](https://github.com/rejetto/hfs2/)
* the OG, the legend (now replaced by [hfs3](#hfs3))
* 🔥 hfs2 is dead and dangerous! unfixed RCE: [info](https://github.com/rejetto/hfs2/issues/44), [info](https://github.com/drapid/hfs/issues/3), [info](https://asec.ahnlab.com/en/67650/)
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ windows-only
@@ -440,10 +455,19 @@ symbol legend,
* vfs with gui config, per-volume permissions
* starting to show its age, hence the rewrite:
## [hfs3](https://github.com/rejetto/hfs)
## [hfs3](https://rejetto.com/hfs/)
* nodejs; cross-platform
* vfs with gui config, per-volume permissions
* still early development, let's revisit later
* 🔵 uploads are resumable
* ⚠️ uploads are not segmented; max upload size 100 MiB on cloudflare
* ⚠️ uploads are not accelerated (copyparty is 3x faster across the atlantic)
* ⚠️ uploads are not integrity-checked
* ⚠️ copies the file after upload; need twice filesize free disk space
* ⚠️ doesn't support crazy filenames
* ✅ config GUI
* ✅ download counter
* ✅ watch active connections
* ✅ plugins
## [nextcloud](https://github.com/nextcloud/server)
* php, mariadb
@@ -497,6 +521,7 @@ symbol legend,
* rust; cross-platform (windows, linux, macos)
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ across the atlantic, copyparty is 3x faster
* ⚠️ doesn't support crazy filenames
* ✅ per-url access control (copyparty is per-volume)
* 🔵 basic but really snappy ui
@@ -539,8 +564,10 @@ symbol legend,
## [filebrowser](https://github.com/filebrowser/filebrowser)
* go; cross-platform (windows, linux, mac)
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* 🔵 uploads are resumable and segmented
* 🔵 multiple files are uploaded in parallel, but...
* ⚠️ big files are not accelerated (copyparty is 5x faster across the atlantic)
* ⚠️ uploads are not integrity-checked
* ⚠️ http only; no webdav / ftp / zeroconf
* ⚠️ doesn't support crazy filenames
* ⚠️ no directory tree nav
@@ -550,12 +577,14 @@ symbol legend,
* ⚠️ but no directory tree for navigation
* ✅ user signup
* ✅ command runner / remote shell
* 🔵 supposed to have write-only folders but couldn't get it to work
* ✅ more efficient; can handle around twice as much simultaneous traffic
## [filegator](https://github.com/filegator/filegator)
* go; cross-platform (windows, linux, mac)
* php; cross-platform (windows, linux, mac)
* 🔵 *it has upload segmenting and acceleration*
* ⚠️ but uploads are still not integrity-checked
* ⚠️ on copyparty, uploads are 40x faster
* compared to the official filegator docker example which might be bad
* ⚠️ http only; no webdav / ftp / zeroconf
* ⚠️ does not support symlinks
* ⚠️ expensive download-as-zip feature
@@ -566,6 +595,7 @@ symbol legend,
* go; cross-platform (windows, linux, mac)
* ⚠️ http uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ across the atlantic, copyparty is 2.5x faster
* 🔵 sftp uploads are resumable
* ⚠️ web UI is very minimal + a bit slow
* ⚠️ no thumbnails / image viewer / audio player
@@ -573,6 +603,7 @@ symbol legend,
* ⚠️ no filesystem indexing / search
* ⚠️ doesn't run on phones, tablets
* ⚠️ no zeroconf (mdns/ssdp)
* ⚠️ impractical directory URLs
* ⚠️ AGPL licensed
* 🔵 ftp, ftps, webdav
* ✅ sftp server
@@ -589,11 +620,13 @@ symbol legend,
## [arozos](https://github.com/tobychui/arozos)
* big suite of applications similar to [kodbox](#kodbox), copyparty is better at downloading/uploading/music/indexing but arozos has other advantages
* go; primarily linux (limited support for windows)
* ⚠️ needs root
* ⚠️ uploads not resumable / integrity-checked
* ⚠️ uploading small files to copyparty is 2.7x faster
* ⚠️ uploading large files to copyparty is at least 10% faster
* arozos is websocket-based, 512 KiB chunks; writes each chunk to separate files and then merges
* copyparty splices directly into the final file; faster and better for the HDD and filesystem
* ⚠️ across the atlantic, uploading to copyparty is 6x faster
* ⚠️ no directory tree navpane; not as easy to navigate
* ⚠️ download-as-zip is not streaming; creates a temp.file on the server
* ⚠️ not self-contained (pulls from jsdelivr)

View File

@@ -3,7 +3,7 @@ WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.10.0 \
ver_marked=4.3.0 \
ver_dompf=3.1.6 \
ver_dompf=3.1.7 \
ver_mde=2.18.0 \
ver_codemirror=5.65.16 \
ver_fontawesome=5.13.0 \
@@ -37,6 +37,7 @@ RUN mkdir -p /z/dist/no-pk \
&& wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
&& wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
&& wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
&& wget https://files.pythonhosted.org/packages/04/0b/4506cb2e831cea4b0214d3625430e921faaa05a7fb520458c75a2dbd2152/fusepy-3.0.1.tar.gz -O fusepy.tgz \
&& (mkdir hash-wasm \
&& cd hash-wasm \
&& unzip ../hash-wasm.zip) \
@@ -56,6 +57,7 @@ RUN mkdir -p /z/dist/no-pk \
&& npm i gulp-cli -g ) \
&& tar -xf dompurify.tgz \
&& tar -xf prism.tgz \
&& tar -xf fusepy.tgz \
&& unzip fontawesome.zip \
&& tar -xf zopfli.tgz
@@ -158,6 +160,18 @@ RUN cd /z/dist \
&& rmdir no-pk
# build fusepy
COPY uncomment.py /z
RUN mv /z/fusepy-3.0.1/fuse.py /z/dist/f1 \
&& cd /z/dist \
&& python3 /z/uncomment.py f1 \
&& sed -ri '/self.__critical_exception = e/d' f1 \
&& awk '/^log =/{s=0} !s; /^from traceback im/{s=1;print"from functools import partial";print"basestring = str"}' <f1 >f2 \
&& awk '/LoggingMixIn:/{exit} --s<0;/self.use_ns = getattr/{s=7}' <f2 >f1 \
&& awk "/if _machine =/{s=0} /'(mips|ppc|ppc64)'/{s=1} !s" <f1 >f2 \
&& rm f1 && mv f2 fuse.py
# git diff -U2 --no-index marked-1.1.0-orig/ marked-1.1.0-edit/ -U2 | sed -r '/^index /d;s`^(diff --git a/)[^/]+/(.* b/)[^/]+/`\1\2`; s`^(---|\+\+\+) ([ab]/)[^/]+/`\1 \2`' > ../dev/copyparty/scripts/deps-docker/marked-ln.patch
# d=/home/ed/dev/copyparty/scripts/deps-docker/; tar -cf ../x . && ssh root@$bip "cd $d && tar -xv >&2 && make >&2 && tar -cC ../../copyparty/web deps" <../x | (cd ../../copyparty/web/; cat > the.tgz; tar -xvf the.tgz; rm the.tgz)
# gzip -dkf ../dev/copyparty/copyparty/web/deps/deps/marked.full.js.gz && diff -NarU2 ../dev/copyparty/copyparty/web/deps/{,deps/}marked.full.js

View File

@@ -4,8 +4,10 @@ vend := $(self)/../../copyparty/web/deps
# prefers podman-docker (optionally rootless) over actual docker/moby
all:
cp -pv ../uncomment.py .
docker build -t build-copyparty-deps .
rm -rf $(vend)
mkdir $(vend)

View File

@@ -5,19 +5,16 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-ac" \
org.opencontainers.image.description="copyparty with Pillow and FFmpeg (image/audio/video thumbnails, audio transcoding, media tags)"
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
XDG_CONFIG_HOME=/cfg
ENV XDG_CONFIG_HOME=/cfg
RUN apk --no-cache add !pyc \
wget \
py3-argon2-cffi py3-pillow \
ffmpeg \
&& rm -rf /tmp/pyc \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow \
ffmpeg
COPY i/dist/copyparty-sfx.py innvikler.sh ./
RUN ash innvikler.sh && rm innvikler.sh
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "-c", "/z/initcfg"]

View File

@@ -5,15 +5,14 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-dj" \
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection"
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
XDG_CONFIG_HOME=/cfg
ENV XDG_CONFIG_HOME=/cfg
COPY i/bin/mtag/install-deps.sh ./
COPY i/bin/mtag/audio-bpm.py /mtag/
COPY i/bin/mtag/audio-key.py /mtag/
RUN apk add -U !pyc \
wget \
py3-argon2-cffi py3-pillow py3-pip py3-cffi \
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
py3-numpy fftw libsndfile \
@@ -27,18 +26,12 @@ RUN apk add -U !pyc \
&& python3 -m pip install pyvips \
&& bash install-deps.sh \
&& apk del py3-pip .bd \
&& rm -rf /var/cache/apk/* /tmp/pyc \
&& chmod 777 /root \
&& ln -s /root/vamp /root/.local / \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
&& ln -s /root/vamp /root/.local /
COPY i/dist/copyparty-sfx.py innvikler.sh ./
RUN ash innvikler.sh && rm innvikler.sh
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
# size: 286 MB
# bpm/key: 529 sec
# idx-bench: 2352 MB/s
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "-c", "/z/initcfg"]

View File

@@ -5,18 +5,15 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-im" \
org.opencontainers.image.description="copyparty with Pillow and Mutagen (image thumbnails, media tags)"
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
XDG_CONFIG_HOME=/cfg
ENV XDG_CONFIG_HOME=/cfg
RUN apk --no-cache add !pyc \
wget \
py3-argon2-cffi py3-pillow py3-mutagen \
&& rm -rf /tmp/pyc \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow py3-mutagen
COPY i/dist/copyparty-sfx.py innvikler.sh ./
RUN ash innvikler.sh && rm innvikler.sh
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "-c", "/z/initcfg"]

View File

@@ -5,12 +5,11 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-iv" \
org.opencontainers.image.description="copyparty with Pillow, FFmpeg, libvips (image/audio/video thumbnails, audio transcoding, media tags)"
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
XDG_CONFIG_HOME=/cfg
ENV XDG_CONFIG_HOME=/cfg
RUN apk add -U !pyc \
wget \
py3-argon2-cffi py3-pillow py3-pip py3-cffi \
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
&& apk add -t .bd \
@@ -18,13 +17,11 @@ RUN apk add -U !pyc \
python3-dev py3-wheel \
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
&& python3 -m pip install pyvips \
&& apk del py3-pip .bd \
&& rm -rf /var/cache/apk/* /tmp/pyc \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
&& apk del py3-pip .bd
COPY i/dist/copyparty-sfx.py innvikler.sh ./
RUN ash innvikler.sh && rm innvikler.sh
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "-c", "/z/initcfg"]

View File

@@ -5,17 +5,14 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-min" \
org.opencontainers.image.description="just copyparty, no thumbnails / media tags / audio transcoding"
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
XDG_CONFIG_HOME=/cfg
ENV XDG_CONFIG_HOME=/cfg
RUN apk --no-cache add !pyc \
python3 \
&& rm -rf /tmp/pyc \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
py3-jinja2
COPY i/dist/copyparty-sfx.py innvikler.sh ./
RUN ash innvikler.sh && rm innvikler.sh
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "--no-thumb", "-c", "/z/initcfg"]
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "--no-thumb", "-c", "/z/initcfg"]

View File

@@ -22,6 +22,11 @@ this example is also available as a podman-compatible [docker-compose yaml](http
i'm not very familiar with containers, so let me know if this section could be better 🙏
## portainer
* there is a [portainer howto](https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/docker/portainer.md) which is mostly untested
## configuration
> this section basically explains how the [docker-compose yaml](https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/docker/basic-docker-compose) works, so you may look there instead

View File

@@ -17,3 +17,19 @@ but I don't really know what i'm doing here 💩
`podman login docker.io`
`podman login ghcr.io -u 9001`
[about ghcr](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry) (takes a classic token as password)
## building on alpine
```bash
apk add podman{,-docker}
rc-update add cgroups
service cgroups start
vim /etc/containers/storage.conf # driver = "btrfs"
modprobe tun
echo ed:100000:65536 >/etc/subuid
echo ed:100000:65536 >/etc/subgid
apk add qemu-openrc qemu-tools qemu-{arm,armeb,aarch64,s390x,ppc64le}
rc-update add qemu-binfmt
service qemu-binfmt start
```

View File

@@ -0,0 +1,46 @@
#!/bin/ash
set -ex
# cleanup for flavors with python build steps (dj/iv)
rm -rf /var/cache/apk/* /root/.cache
# initial config; common for all flavors
mkdir /cfg /w
chmod 777 /cfg /w
echo % /cfg > initcfg
# unpack sfx and dive in
python3 copyparty-sfx.py --version
cd /tmp/pe-copyparty.0
# steal the stuff we need
mv copyparty partftpy ftp/* /usr/lib/python3.*/site-packages/
# golf
cd /usr/lib/python3.*/
rm -rf \
/tmp/pe-* /z/copyparty-sfx.py \
ensurepip pydoc_data turtle.py turtledemo lib2to3
# drop bytecode
find / -xdev -name __pycache__ -print0 | xargs -0 rm -rf
# build the stuff we want
python3 -m compileall -qj4 site-packages sqlite3 xml
# drop the stuff we dont
find -name __pycache__ |
grep -E 'ty/web/|/pycpar' |
tr '\n' '\0' | xargs -0 rm -rf
# two-for-one:
# 1) smoketest copyparty even starts
# 2) build any bytecode we missed
# this tends to race other builders (all good things come in threes)
cd /z
python3 -m copyparty \
--ign-ebind -p$((1024+RANDOM)),$((1024+RANDOM)),$((1024+RANDOM)) \
--no-crt -qi127.1 --exit=idx -e2dsa -e2ts
# output from -e2d
rm -rf .hist

View File

@@ -6,6 +6,8 @@ set -e
exit 1
}
suf=-b1
suf=
sarchs="386 amd64 arm/v7 arm64/v8 ppc64le s390x"
archs="amd64 arm s390x 386 arm64 ppc64le"
imgs="dj iv min im ac"
@@ -98,16 +100,17 @@ filt=
aa="$(printf '%11s' $a-$i)"
# arm takes forever so make it top priority
[ ${a::3} == arm ] && nice= || nice=nice
[ ${a::3} == arm ] && nice= || nice=-n20
# --pull=never does nothing at all btw
(set -x
$nice podman build \
nice $nice podman build \
--squash \
--pull=never \
--from localhost/alpine-$a \
-t copyparty-$i-$a \
-t copyparty-$i-$a$suf \
-f Dockerfile.$i . ||
(echo $? $i-$a >> err)
(echo $? $i-$a >> err; printf '%096d\n' $(seq 1 42))
rm -f .blk
) 2> >(tee $a.err | sed "s/^/$aa:/" >&2) > >(tee $a.out | sed "s/^/$aa:/") &
done
@@ -134,9 +137,10 @@ filt=
variants=
for a in $archs; do
[[ " ${ngs[*]} " =~ " $i-$a " ]] && continue
variants="$variants containers-storage:localhost/copyparty-$i-$a"
variants="$variants containers-storage:localhost/copyparty-$i-$a$suf"
done
podman manifest create copyparty-$i $variants
podman manifest rm copyparty-$i$suf || echo "(that's fine btw)"
podman manifest create copyparty-$i$suf $variants
done
}

scripts/help2html.py Executable file
View File

@@ -0,0 +1,81 @@
#!/usr/bin/env python3
import re
import subprocess as sp
# to convert the copyparty --help to html, run this in xfce4-terminal @ 140x43:
_ = r""""
echo; for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -pwhash -zm; do
./copyparty-sfx.py --help$a 2>/dev/null; printf '\n\n\n%0139d\n\n\n'; done # xfce4-terminal @ 140x43
"""
# click [edit] => [select all]
# click [edit] => [copy as html]
# and then run this script
def readclip():
cmds = [
"xsel -ob",
"xclip -selection CLIPBOARD -o",
"pbpaste",
]
for cmd in cmds:
try:
return sp.check_output(cmd.split()).decode("utf-8")
except:
pass
def cnv(src):
yield '<html style="background:#222;color:#fff"><body>'
skip_sfx = False
in_sfx = 0
in_salt = 0
while True:
ln = next(src)
if "<font" in ln:
if not ln.startswith("<pre>"):
ln = "<pre>" + ln
yield ln
break
for ln in src:
ln = ln.rstrip()
if re.search(r"^<font[^>]+>copyparty v[0-9]", ln):
in_sfx = 3
if in_sfx:
in_sfx -= 1
if not skip_sfx:
yield ln
continue
if '">uuid:' in ln:
ln = re.sub(r">uuid:[0-9a-f-]{36}<", ">autogenerated<", ln)
if "-salt SALT" in ln:
in_salt = 3
if in_salt:
in_salt -= 1
t = ln
ln = re.sub(r">[0-9a-zA-Z/+]{24}<", ">24-character-autogenerated<", ln)
ln = re.sub(r">[0-9a-zA-Z/+]{40}<", ">40-character-autogenerated<", ln)
if t != ln:
in_salt = 0
ln = ln.replace(">/home/ed/", ">~/")
if ln.startswith("0" * 20):
skip_sfx = True
yield ln
yield "</pre>eof</body></html>"
def main():
src = readclip()
src = re.split("0{100,200}", src[::-1], 1)[1][::-1]
with open("helptext.html", "wb") as fo:
for ln in cnv(iter(src.split("\n")[:-3])):
fo.write(ln.encode("utf-8") + b"\r\n")
if __name__ == "__main__":
main()

scripts/help2txt.sh Executable file
View File

@@ -0,0 +1,26 @@
#!/bin/bash
set -e
( xsel -ob | sed -r '
s`/home/ed/`~/`;
s/uuid:[0-9a-f-]{36}/autogenerated/;
s/(-salt SALT.*default: )[0-9a-zA-Z/+]{24}\)/\124-character-autogenerated)/;
s/(-salt SALT.*default: )[0-9a-zA-Z/+]{40}\)/\140-character-autogenerated)/;
' | awk '
/^copyparty/{a=1} !a{next}
/^0{20}/{b=1} b&&/^copyparty v[0-9]+\./{s=3}
s{s-=1;next} 1' |
head -n-6; echo eof ) >helptext.txt
exit 0
# =====================================================================
# end of script; below is the explanation how to use this:
# first open an infinitely wide console (this is why you own an ultrawide) and copypaste this into it:
for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -pwhash -zm; do
./copyparty-sfx.py --help$a 2>/dev/null; printf '\n\n\n%0255d\n\n\n'; done
# then copypaste all of the output by pressing ctrl-shift-a, ctrl-shift-c
# and finally actually run this script which should produce helptext.txt

View File

@@ -42,12 +42,6 @@ ver="$(cat ../sfx/ver)"
mkdir -p ../dist
pyz_out=../dist/copyparty.pyz
echo creating z.tar
( cd copyparty
tar -cf z.tar "${targs[@]}" --numeric-owner web res
rm -rf web res
)
echo creating loader
sed -r 's/^(VER = ).*/\1"'"$ver"'"/; s/^(STAMP = ).*/\1'$(date +%s)/ \
<../scripts/ziploader.py \

View File

@@ -3,6 +3,7 @@ set -e
echo
berr() { p=$(head -c 72 </dev/zero | tr '\0' =); printf '\n%s\n\n' $p; cat; printf '\n%s\n\n' $p; }
aerr() { printf '%s\n' "$*" | berr; }
help() { exec cat <<'EOF'
@@ -26,11 +27,13 @@ help() { exec cat <<'EOF'
#
# `no-ftp` saves ~30k by removing the ftp server, disabling --ftp
#
# `no-pf` saves ~12k by removing the option to download partyfuse
#
# `no-tfp` saves ~10k by removing the tftp server, disabling --tftp
#
# `no-smb` saves ~3.5k by removing the smb / cifs server
# `no-zm` saves ~7k by removing the zeroconf mDNS server
#
# `no-zm` saves ~7k by removing the zeroconf mDNS server
# `no-smb` saves ~3.5k by removing the smb / cifs server
#
# _____________________________________________________________________
# web features:
@@ -52,10 +55,15 @@ help() { exec cat <<'EOF'
#
# `ign-wd` allows building an sfx without webdeps
#
# ---------------------------------------------------------------------
#
# _____________________________________________________________________
# if you are on windows, you can use msys2:
# PATH=/c/Users/$USER/AppData/Local/Programs/Python/Python310:"$PATH" ./make-sfx.sh fast
#
# _____________________________________________________________________
# some usage examples:
# ./scripts/make-sfx.sh lang eng no-cm no-hl no-dd no-fnt no-smb no-pf
# ./scripts/rls.sh sfx lang eng no-cm no-hl no-dd no-fnt no-smb no-pf
# (reduces v1.14.2 from 700k to 495k)
EOF
}
@@ -101,7 +109,7 @@ pybin=$(command -v python3 || command -v python) || {
langs=
use_gz=
zopf=2560
zopf=2000
while [ ! -z "$1" ]; do
case $1 in
clean) clean=1 ; ;;
@@ -112,6 +120,7 @@ while [ ! -z "$1" ]; do
no-tfp) no_tfp=1 ; ;;
no-smb) no_smb=1 ; ;;
no-zm) no_zm=1 ; ;;
no-pf) no_pf=1 ; ;;
no-fnt) no_fnt=1 ; ;;
no-hl) no_hl=1 ; ;;
no-dd) no_dd=1 ; ;;
@@ -119,7 +128,6 @@ while [ ! -z "$1" ]; do
dl-wd) dl_wd=1 ; ;;
ign-wd) ign_wd=1 ; ;;
fast) zopf= ; ;;
ultra) ultra=1 ; ;;
lang) shift;langs="$1"; ;;
*) help ; ;;
esac
@@ -138,6 +146,13 @@ ised() {
sed -r "$1" <"$2" >t
tmv "$2"
}
dlf() {
[ -s "$f" ] && return 0
wget -O "$f" "$1" && return 0
curl -L "$1" >"$f" && return 0
rm -f "$f"
exit 1
}
stamp=$(
for d in copyparty scripts; do
@@ -170,8 +185,7 @@ necho() {
necho collecting ipaddress
f="../build/ipaddress-1.0.23.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz
tar -zxf $f
mkdir py37
@@ -181,8 +195,7 @@ necho() {
necho collecting jinja2
f="../build/Jinja2-2.11.3.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/4f/e7/65300e6b32e69768ded990494809106f87da1d436418d5f1367ed3966fd7/Jinja2-2.11.3.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/4f/e7/65300e6b32e69768ded990494809106f87da1d436418d5f1367ed3966fd7/Jinja2-2.11.3.tar.gz
tar -zxf $f
mv Jinja2-*/src/jinja2 .
@@ -191,8 +204,7 @@ necho() {
necho collecting markupsafe
f="../build/MarkupSafe-1.1.1.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz
tar -zxf $f
mv MarkupSafe-*/src/markupsafe .
@@ -202,14 +214,13 @@ necho() {
mv {markupsafe,jinja2} j2/
necho collecting pyftpdlib
f="../build/pyftpdlib-1.5.9.tar.gz"
f="../build/pyftpdlib-1.5.10.tar.gz"
[ -e "$f" ] ||
(url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.9.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/cf/31/8d910cf40317dd0db74ba0b8558d0dee23c8b002468c14d3a5dec0e6e9fd/pyftpdlib-1.5.10.tar.gz
tar -zxf $f
mv pyftpdlib-release-*/pyftpdlib .
rm -rf pyftpdlib-release-* pyftpdlib/test
mv pyftpdlib-*/pyftpdlib .
rm -rf pyftpdlib-* pyftpdlib/test
for f in pyftpdlib/_async{hat,ore}.py; do
[ -e "$f" ] || continue;
iawk 'NR<4||NR>27||!/^#/;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' $f
@@ -221,8 +232,7 @@ necho() {
necho collecting partftpy
f="../build/partftpy-0.4.0.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/8c/96/642bb3ddcb07a2c6764eb29aa562d1cf56877ad6c330c3c8921a5f05606d/partftpy-0.4.0.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/8c/96/642bb3ddcb07a2c6764eb29aa562d1cf56877ad6c330c3c8921a5f05606d/partftpy-0.4.0.tar.gz
tar -zxf $f
mv partftpy-*/partftpy .
@@ -232,8 +242,7 @@ necho() {
v=0.4.27
f="../build/python-magic-$v.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/da/db/0b3e28ac047452d079d375ec6798bf76a036a08182dbb39ed38116a49130/python-magic-0.4.27.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/da/db/0b3e28ac047452d079d375ec6798bf76a036a08182dbb39ed38116a49130/python-magic-0.4.27.tar.gz
tar -zxf $f
mkdir magic
@@ -248,8 +257,7 @@ necho() {
necho collecting strip-hints
f=../build/strip-hints-0.1.10.tar.gz
[ -e $f ] ||
(url=https://files.pythonhosted.org/packages/9c/d4/312ddce71ee10f7e0ab762afc027e07a918f1c0e1be5b0069db5b0e7542d/strip-hints-0.1.10.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/9c/d4/312ddce71ee10f7e0ab762afc027e07a918f1c0e1be5b0069db5b0e7542d/strip-hints-0.1.10.tar.gz
tar -zxf $f
mv strip-hints-0.1.10/src/strip_hints .
@@ -307,7 +315,7 @@ necho() {
[ ! -e copyparty/web/deps/mini-fa.woff ] && [ $dl_wd ] && {
echo "could not find webdeps; downloading..."
url=https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
wget -Ox.py "$url" || curl -L "$url" >x.py
f=x.py; rm -f $f; dlf $url
echo "extracting webdeps..."
wdsrc="$("$pybin" x.py --version 2>&1 | tee /dev/stderr | awk '/sfxdir:/{sub(/.*: /,"");print;exit}')"
@@ -413,7 +421,7 @@ rm have
ised /fork_process/d ftp/pyftpdlib/servers.py
iawk '/^class _Base/{s=1}!s' ftp/pyftpdlib/authorizers.py
iawk '/^ {0,4}[^ ]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/servers.py
iawk '/^ {0,4}[a-zA-Z]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/servers.py
rm -f ftp/pyftpdlib/{__main__,prefork}.py
[ $no_ftp ] &&
@@ -428,6 +436,9 @@ rm -f ftp/pyftpdlib/{__main__,prefork}.py
[ $no_zm ] &&
rm -rf copyparty/mdns.py copyparty/stolen/dnslib
[ $no_pf ] &&
rm -rf copyparty/web/a/partyfuse.py copyparty/web/deps/fuse.py
[ $no_cm ] && {
rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
echo h > copyparty/web/mde.html
@@ -451,11 +462,16 @@ rm -f ftp/pyftpdlib/{__main__,prefork}.py
ised 's/(cursor: ?)url\([^)]+\), ?(pointer)/\1\2/; s/[0-9]+% \{cursor:[^}]+\}//; s/animation: ?cursor[^};]+//' $f
}
[ $langs ] &&
[ $langs ] && {
echo $langs | grep -q eng || {
langs="eng|$langs"
aerr "ERROR: removing english is not supported; will do this instead: $langs"
}
for f in copyparty/web/{browser.js,splash.js}; do
gzip -d "$f.gz" || true
iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f
iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} !l{next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f
done
}
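The new guard above keeps English from being pruned away; a small worked example of that fix-up, assuming the build is asked for Norwegian only:

# example input: a language list without English
langs="nor"
echo "$langs" | grep -q eng || langs="eng|$langs"
echo "$langs"   # prints "eng|nor", so the filter above always retains the English strings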
[ ! $repack ] && {
# uncomment
@@ -476,8 +492,8 @@ iawk '/^def /{s=0}/^def generate_lorem_ipsum/{s=1}!s' j2/jinja2/utils.py
iawk '/^(class|def) /{s=0}/^(class InternationalizationExtension|def _make_new_n?gettext)/{s=1}!s' j2/jinja2/ext.py
iawk '/^[^ ]/{s=0}/^def babel_extract/{s=1}!s' j2/jinja2/ext.py
ised '/InternationalizationExtension/d' j2/jinja2/ext.py
iawk '/^class/{s=0}/^class (Package|Dict|Function|Prefix|Choice|Module)Loader/{s=1}!s' j2/jinja2/loaders.py
sed -ri '/^from .bccache | (Package|Dict|Function|Prefix|Choice|Module)Loader$/d' j2/jinja2/__init__.py
iawk '/^class/{s=0}/^class (Package|Dict|Prefix|Choice|Module)Loader/{s=1}!s' j2/jinja2/loaders.py
sed -ri '/^from .bccache | (Package|Dict|Prefix|Choice|Module)Loader$/d' j2/jinja2/__init__.py
rm -f j2/jinja2/async* j2/jinja2/{bccache,sandbox}.py
cat > j2/jinja2/_identifier.py <<'EOF'
import re
@@ -593,7 +609,7 @@ pc="bzip2 -"; pe=bz2
[ $use_gzz ] && pc="pigz -11 -I$use_gzz" && pe=gz
echo compressing tar
for n in {2..9}; do cp tar t.$n; nice $pc$n t.$n & done; wait
for n in {2..9}; do cp tar t.$n; nice -n20 $pc$n t.$n & done; wait
minf=$(for f in t.*.$pe; do
s1=$(wc -c <$f)
s2=$(tr -d '\r\n\0' <$f | wc -c)
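The change above pins the parallel compression jobs to the lowest scheduling priority (nice -n20); the loop compresses the tar at levels 2 through 9 concurrently, and the continuation (cut off in this hunk) measures each candidate to pick one. A standalone sketch of the same compress-in-parallel, keep-the-smallest idea, assuming plain gzip and an input file named tar:

# sketch only: try several compression levels concurrently, keep the smallest
for n in 2 3 4 5 6 7 8 9; do
	cp tar t.$n
	nice -n20 gzip -$n t.$n &   # writes t.$n.gz in the background
done
wait
best=$(ls -S t.*.gz | tail -n1)   # ls -S sorts largest first, so the last entry is the smallest
echo "smallest candidate: $best"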

View File

@@ -77,11 +77,14 @@ excl=(
email._header_value_parser
email.header
email.parser
importlib.resources
importlib_resources
inspect
multiprocessing
packaging
pdb
pickle
pkg_resources
PIL.EpsImagePlugin
pyftpdlib.prefork
urllib.request
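The excl array above lists modules kept out of the frozen Windows build. As a hedged illustration (the real build script may consume this list differently, e.g. via a .spec file), each entry can be mapped to a PyInstaller --exclude-module flag:

# sketch: turn an exclude list into pyinstaller flags; the entry-point name is hypothetical
args=()
for m in importlib.resources importlib_resources multiprocessing pickle pkg_resources; do
	args+=(--exclude-module "$m")
done
pyinstaller --onefile "${args[@]}" loader.py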

View File

@@ -11,14 +11,10 @@ ckpypi() {
pyinstaller
pyinstaller-hooks-contrib
pywin32-ctypes
certifi
charset_normalizer
idna
Jinja2
MarkupSafe
mutagen
Pillow
requests
)
for dep in "${deps[@]}"; do
k=

View File

@@ -4,12 +4,6 @@ f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f8
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
085d39ef4426aa5f097fbc484595becc16e61ca23fc7da4d2a8bba540a3b82e789e390b176c7151bdc67d01735cce22b1562cdb2e31273225a2d3e275851a4ad setuptools-70.3.0-py3-none-any.whl
360a141928f4a7ec18a994602cbb28bbf8b5cc7c077a06ac76b54b12fa769ed95ca0333a5cf728923a8e0baeb5cc4d5e73e5b3de2666beb05eb477d8ae719093 upx-4.2.4-win32.zip
# u2c (win7)
7a3bd4849f95e1715fe2e99613df70a0fedd944a9bfde71a0fadb837fe62c3431c30da4f0b75c74de6f1a459f1fdf7cb62eaf404fdbe45e2d121e0b1021f1580 certifi-2024.2.2-py3-none-any.whl
9cc8acc5e269e6421bc32bb89261101da29d6ca337d39d60b9106de9ed7904e188716e4a48d78a2c4329026443fcab7acab013d2fe43778e30d6c4e4506a1b91 charset_normalizer-3.3.2-cp37-cp37m-win32.whl
0ec1ae5c928b4a0001a254c8598b746049406e1eed720bfafa94d4474078eff76bf6e032124e2d4df4619052836523af36162443c6d746487b387d2e3476e691 idna-3.6-py3-none-any.whl
cc08d0d87d184401872a2f82266d589253979b4cd02f23b51290fbb2a20082848fc72acbed8aacb74ac4af068d575ef96e66196c5068bc38fb0bcafdc7626869 requests-2.29.0-py3-none-any.whl
fe5fee6cb8a2c68800b32353a0015e5d2e1ad1cb6e0c9e6acf86e48e5cdb5606ad465dc4485ea5fbc8701d8716a8a7f7148c57724ef9da26b0c0a76f6dbbd698 urllib3-1.26.19-py2.py3-none-any.whl
# win7
3253e86471e6f9fa85bfdb7684cd2f964ed6e35c6a4db87f81cca157c049bef43e66dfcae1e037b2fb904567b1e028aaeefe8983ba3255105df787406d2aa71e en_windows_7_professional_with_sp1_x86_dvd_u_677056.iso
ab0db0283f61a5bbe44797d74546786bf41685175764a448d2e3bd629f292f1e7d829757b26be346b5044d78c9c1891736d93237cee4b1b6f5996a902c86d15f en_windows_7_professional_with_sp1_x64_dvd_u_676939.iso
@@ -34,6 +28,6 @@ d1420c8417fad7888766dd26b9706a87c63e8f33dceeb8e26d0056d5127b0b3ed9272e44b4b76113
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
0203ec2551c4836696cfab0b2c9fff603352f03fa36e7476e2e1ca7ec57a3a0c24bd791fcd92f342bf817f0887854d9f072e0271c643de4b313d8c9569ba8813 packaging-24.1-py3-none-any.whl
2be320b4191f208cdd6af183c77ba2cf460ea52164ee45ac3ff17d6dfa57acd9deff016636c2dd42a21f4f6af977d5f72df7dacf599bebcf41757272354d14c1 pillow-10.4.0-cp312-cp312-win_amd64.whl
776378f5414efd26ec8a1cb3228a7b5fdf6afca3fa335a0e9b071266d55d9d9e66ee157c25a468a05bfa70ccd33c48b101998523fc6ff6bcf5e82a1d81ed0af8 pyinstaller-6.9.0-py3-none-win_amd64.whl
c0af77d2a57cb063ab038dc986ed3582bc5acc8c8bd91d726101935d6388f50854ddbca26bc846ed5d1022cdee4d96242938c66f0ddc4565c36b60d691064db8 pyinstaller_hooks_contrib-2024.7-py2.py3-none-any.whl
2f9a11ffae6d9f1ed76bf816f28812fcba71f87080b0c92e52bfccb46243118c5803a7e25dd78003ca7d66501bfcdce8ff7c691c63c0038b0d409ca3842dcc89 python-3.12.4-amd64.exe
896ddddbd4b85e86e0600cb65eb4c07fbc7f3802d47e7f660411e20b5500831469b97ed4770f25820f4e75cbfac40308da624fd86d4f62e578149d5c276a9cde pyinstaller-6.10.0-py3-none-win_amd64.whl
873781decaeef07f6a79b0ed8b9f35f3fa534a1ea0d866991e40278a10818fa5b60c70b0d5828971b045364f1099694cd1e5d5d60d480acb93fcfbfbced4a09e pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
912b710007c7b29f29c0097aff8f825412166eed7777a7cef135b14316e8fff31b5df56d26d835d8ca090468cc0e914730f201a56caa3dd6dbef2f91088942b1 python-3.12.7-amd64.exe
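The lines above are sha512 digests pinning the exact wheels and installers fetched for the Windows builds. A hedged sketch of checking downloaded files against such a list, assuming it is saved as deps.sha512 in the download directory:

# sketch: verify each pinned dependency; deps.sha512 is a hypothetical filename
grep -E '^[0-9a-f]{128} ' deps.sha512 | while read -r sum name; do
	have=$(sha512sum "$name" | awk '{print $1}')
	[ "$have" = "$sum" ] && echo "ok  $name" || echo "BAD HASH: $name"
done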

View File

@@ -13,13 +13,6 @@ https://pypi.org/project/MarkupSafe/#files
https://pypi.org/project/mutagen/#files
https://pypi.org/project/Pillow/#files
# u2c (win7) additionals
https://pypi.org/project/certifi/#files
https://pypi.org/project/charset-normalizer/#files # cp37-cp37m-win32.whl
https://pypi.org/project/idna/#files
https://pypi.org/project/requests/#files
https://pypi.org/project/urllib3/#files
# win7 additionals
https://pypi.org/project/future/#files
https://pypi.org/project/importlib-metadata/#files

Some files were not shown because too many files have changed in this diff